I posted this reflection over on DigitalImpact.org. Regular readers of the Blueprint, send me your notes!
For those who don't want to click over (and you should) the piece discusses the technological work being done on digital identities - where you would control yours - and its implications for civil society and philanthropy. Go on, read it.
One of the many things made more public during this week's congressional hearings with Mark Zuckerberg is the way the platform curates content. Zuckerberg bemoaned the reality that it's his job to decide who sees what, when.
For those who study curation, platforms, and internet law, this is not new. I'm writing this while listening to Tarleton Gillespie discuss his forthcoming book (recommended), Custodians of the Internet. He's describing the rules, technologies, and people that make up the "moderation apparatus" - the systems that determine who sees what information, when, and from whom. Gillespie argues that this moderation is essential to what the platforms do - it is their value proposition. This runs counter to the longstanding mythos of the open web.
One element of this "moderation apparatus" that catches my eye is the role of civil society organizations and nonprofits. Big companies - Facebook, but probably not only Facebook - rely on civil society to do their dirty work.
In Myanmar, civil society groups that were working with Facebook to take down hateful and violent postings pushed back when Zuckerberg claimed that the company was doing all it could to address these issues. The civil society groups noted that the company was essentially relying on them to voluntarily moderate the site and wasn't providing them with the engineering resources that were needed to do this. They secured a verbal commitment from Zuckerberg to improve the process.
Here's what this means:
- Facebook was shifting its responsibilities to civil society.
- Civil society groups aren't equipped for, or paid for, this role.
- Civil society groups are - by design - fragmented and contentious. Choosing some of them to do moderation is a value-laden, editorial decision.
- Civil society is, from Facebook's perspective in this example, just a low-cost, outsourced labor source. It also, no doubt, shifts liability from Facebook to civil society (not least for the psychological toll on the humans moderating photos and posts about harm and violence).
Seems to me this starts to elicit some really interesting questions about the roles and relationships of nonprofits, companies, and government in digital space. Here's what I want to know:
- How widespread are these kinds of commercial/civil society moderation and curation relationships?
- How do they work? Who's contracted for what? Who's liable for what? What recourse exists when things go wrong?
- What do civil society groups think of this? When might it be a good solution, from civil society's perspective?
- Some civil society groups - such as Muslim Advocates and Color Of Change - are calling for a civil rights audit of Facebook. Senator Cory Booker took this idea into the hearings. This sort of advocacy, demanding accountability from the platforms, makes more sense to me as the role of civil society: not doing the work, but demanding that the work be done. Your thoughts?
This article from India Development Review captures some of my thoughts on civil society and digital data.
Remember when philanthropy, foundations, and nonprofits were unknown? Boy, has that changed - they now play regular roles in news and literature.
- Senator Patrick Leahy asked Mark Zuckerberg why Facebook had to hear from civil society groups before taking action against violent crimes in Myanmar.
- (editor: Why didn't Leahy also ask Zuckerberg about Facebook's labor exploitation of those groups' volunteers - essentially relying on them as his workforce?)
I guess not all press is good press.
- Special Counsel Robert Mueller and the FBI are investigating the President's attorney for foreign payments to Trump's foundation.
- Meg Wolitzer's new novel features a protagonist who works at a foundation. A review of the novel in Bookforum includes this wonderful line:
- "...it takes an earnest but compromised nonprofit endeavor as a vehicle for its ideas. With its magical relationship to money, the foundation helps insulate Greer and her beliefs from the world beyond, at least until she must confront the reality of what the suits are doing upstairs"
- Jonathan Franzen's 2010 novel, Freedom, featured a bird rescue nonprofit.
(originally posted on DigitalImpact.org)
Have you noticed an uptick of emails from companies like Slack, Google, or PayPal, announcing new privacy policies and terms and conditions? Why the sudden onslaught of updates? The answer is easy. The companies sending these notices are changing their policies to meet the requirements of the European Union’s General Data Protection Regulation (EU GDPR or just GDPR), which will put powerful new enforcement mechanisms into place, starting on May 25, 2018.
If you’re a U.S. resident, or working at a U.S. nonprofit or foundation, you may wonder what, if anything, the GDPR has to do with you. Good question. There’s no simple answer for everyone outside the EU. But just as those companies (all of which are based in the U.S.) revisit their policies and practices because of the new law, it’s a good idea for you to do so, too.
First, the GDPR probably applies to you, whether you know it or not. It’s possible - depending on where your clients and donors live, where your data is stored, or where you provide services - that your organization is subject to fines for not following the new law. In this case, compliance is more than just a good idea, it’s required.
Second, the GDPR is a prompt for a worldwide checkup on safe, ethical, and effective data practices. Many of the GDPR’s provisions align with the data governance principles and responsible data practices that we at Digital Impact advocate for in civil society. Think of the GDPR as providing a framework and set of user-centered guidelines about data that may just align with your mission.
Many resources and consultancies are popping up to help organizations comply with the GDPR.
Digital Impact is here to help you navigate through it. We’re on the lookout for credible, accessible, and affordable resources with particular resonance to nonprofits, foundations, and civil society. In the coming months with help from our community, we’ll be curating new content, holding conversations about data governance and GDPR, and fostering discussion at digitalimpact.org/gdpr.
Check out our starting list of GDPR resources, send us others that you’ve found, and join the community in conversation. Want to share your view on the GDPR with the world? Become a Digital Impact contributor. And if there are topics, tools, or templates you need but can’t find, let us know. Maybe the Digital Impact community can help.