
PHILANTHROPY 2173
The future of good

 





Responsible data in civil society

The World Food Programme recently announced a partnership with Palantir. There’s a lot to be concerned about here - in this specific instance and in the broadest possible sense for understanding civil society’s role regarding digital data use. Please read this open letter

https://responsibledata.io/2019/02/08/open-letter-to-wfp-re-palantir-agreement/


Does your organization know how the tech companies you partner with use the data you have on your constituents? It's not just about "ownership," but about the insights, algorithms, and predictive tools that may be built on that data. Are you "locked in" to a platform? If you wanted to switch vendors for your CRM, mailing lists, or social media, could your organization get its data back?

How is your organization managing data? With any more respect for the individual people from whom it comes than these big software/analytics/insight companies show? If not, why should anyone trust you with their data?

These are pivotal questions - and we need answers, processes, protocols, and regulation. Civil society is not meant to be a poor cousin of big business, an outsourced arm of government, or the "data washing" arm of either.

     



The year (and decade) ahead

The tenth annual Blueprint - Blueprint 2019 - went live in December. You can find it and the entire decade's archive here.

On January 23rd we'll be hosting our annual "prediction-palooza" (free, online) discussion about philanthropy predictions. Information on that is available here.
In the meantime, I've just come off a conversation with a research group preparing a report for a big foundation on the future of philanthropy. I get asked to do a lot of these. I only agree to these conversations if there is going to be a public version of the report. I'm told that's the case - this report should be available in April.

Some thoughts as I was asked to reflect on the last 5 years and then look ahead 10 years.

Looking back: 
All the new products, platforms, and ways to give (DAFs, impact investing, crowdfunding platforms, text donations, cause marketing, etc.) are not adding up to more giving by more people. As Indiana University notes, since 2000 we've lost 20 million givers - at least as recorded by data on charitable donations to tax-exempt nonprofits. This is over the same 19-year time frame that brought us online/mobile donations and online data resources about nonprofits.

-> Perhaps we can stop assuming product innovation equals growth?

Where have the "missing millions" gone? I doubt people have given up on giving, after all we've been doing it for thousands of years. I think we have an "instrumentation" problem. Which is to say, we're measuring the wrong things. Changes in the tax code are expected to result in changes in giving to nonprofits (and to changes in how useful itemized tax forms will be for measuring giving behavior).

--> Perhaps we can ask whether we're measuring the right things?
We need new instruments to measure how Americans give and to whom. These should include measurements of person-to-person giving (e.g., as happens on crowdfunding platforms), political donations, contributions of time, investments intended to produce social/environmental outcomes, and money raised via product purchases (BOGO or cause marketing). I've been calling for this since at least 2015 - see here, and had intimations about it back in 2008 (see here).

Looking Ahead:

Commercial platforms and nonprofit data:
Does anyone really think Facebook will be the same thing in 2029 that it is in 2019? Not even the folks at FB would say that. Every nonprofit and philanthropist managing their giving, their outreach, their donors, etc. on this single platform should beware. The rules are going to change, the company (or companies) will push back, and there will be all kinds of ups and downs between now and ten years from now - but in no imaginable future does doing right by nonprofits (in terms of their data and long-term relationships with donors) figure into that company's growth plans. If you can imagine either a different FB in ten years or no FB ten years from now, then it seems like a good idea not to put all your data eggs in a FB basket (or in any commercial platform driven by a business model unrelated to how well it serves the social sector).

Charity and Politics:
The legal line between these two domains is man-made. It has changed over time. It's thin and it's fragile. The current presidential administration is determined to destroy it (see the Johnson Amendment fight, as well as court cases on donor disclosure in politics). There's not enough manpower at the IRS to defend any boundary that might exist. Digital data and systems make defending the line even more difficult than it was in the analog age. Many advocates and advocacy organizations would like to see the line gone. Individual people may not care as much about separating political and charitable action as policymakers and nonprofit leaders want them to. Assuming the old boundaries between these domains function as intended is fooling oneself. We should put our attention into writing rules that protect our (often conflicting) rights (association, expression, privacy), shed light on political corruption and the excessive influence of wealth, and assume digital data collection and exposure, rather than nostalgically assuming that protecting a legal distinction established in the 1950s is the best (or only) way forward.

Shiny Objects
Anyone but me noticing that the marketing hype about blockchain is starting to quiet down, just as digital systems focused on security and encryption are growing in actual use? This is a good thing - stop gawking at the new packaging and focus on the values that particular activities require. In some cases, permanent records of transactions are a good thing (supply chain verification of objects, possibly). In other cases, distributed, immutable records may not be such a good idea (human ID, for example).

Artificial intelligence (AI), predictive algorithms, and machine learning are three more shiny objects. Most people ask me, "How will nonprofits and/or philanthropists change by using AI?" I think this question has the subject and object in the wrong order. The more relevant question for most donors and social sector organizations is, "How will the use of AI (etc.) change what organizations/donors are doing?" Government bodies and commercial companies are already using these tools - they shape what you see online, what benefits you qualify for, your chances of being audited by the tax authority, your chances of getting a speeding ticket, the keywords you need to enter in your job application, and so on. They are changing the nature of the problem space in which social sector organizations and philanthropists do their work. This is not the future; this is now. This is not the edge case of a few organizations with some good data and data scientists; this is the great mass of organizations and donors. I'd love to see some real discussion of how philanthropy and social sector organizations can and should change to be effective in a world already being shaped by AI (etc.). Then, for dessert, we can talk about the exceptions to this rule.

It's the nature of the game that we'll chatter about the latest shiny object. What's much more interesting is how we embody shared values in new ways.



     



Media Manipulation and Giving Tuesday

Like many of you I woke up this morning to an email inbox full of leftover Black Friday ads, a whole bunch of Cyber Monday ads, and the Xth day in a row of #GivingTuesday announcements.

Among those was the first clearly-designed-to-misinform #GivingTuesday astroturf email that I've received.

It came from the Center for Consumer Freedom (CCF) - a nonprofit front group run by a lobbyist for a variety of industries including restaurants, alcohol, and tobacco. The umbrella group for CCF - the Center for Organizational Research and Education (CORE) - is also home to HumaneWatch. According to the 2016 990 tax filing for CORE, HumaneWatch exists to "educate the public about the Humane Society of the United States (HSUS), its misleading fundraising practices, its dismal track record of supporting pet shelters and its support of a radical animal rights agenda."

(clip from 2016 990 for CORE)

The email I received from CCF linked to a YouTube "ad." But all of it - the Consumer Freedom website, the email I received, the work of these nonprofits - leads back to a commercial PR firm, Berman and Co., which has been accused of setting up these groups as part of its paid work for industry. None of this was revealed in the email - and if you look at the CCF website to find out who funds it, you find this statement:
"The Center for Consumer Freedom is supported by restaurants, food companies and thousands of individual consumers. From farm to fork, from urban to rural, our friends and supporters include businesses, their employees, and their customers. The Center is a nonprofit 501(c)(3) organization. We file regular statements with the Internal Revenue Service, which are open to public inspection. Many of the companies and individuals who support the Center financially have indicated that they want anonymity as contributors. They are reasonably apprehensive about privacy and safety in light of the violence and other forms of aggression some activists have adopted as a “game plan” to impose their views, so we respect their wishes."
If you check the CCF's 990 form (search under CORE), you'll find that on revenue of $4.5 million (sources undisclosed), the largest expense was $1.5 million paid to Berman and Co. for management fees. The next-largest expense is $1.4 million spent on advertising and promotion.

There's no virtue in this circle - just paid lobbyists setting up nonprofit groups to counter the messages of other nonprofit groups. On the one hand, the nonprofit sector must be doing something right when the tobacco and alcoholic beverage industries are trying to shut it up. On the other hand, good luck to you, average donor, trying to figure out what's real and what's not. Even the watchdog groups are sniping at each other.

I've written before about misinformation, the current ecosystem of distrust, and civil society. And here it is. Be careful out there.
     



Verify the data

Three tweets from yesterday:




Depending on a commercial company for our giving infrastructure is problematic in several ways. First, at any point in time the company can change its commitment, algorithms, and priorities (and this company has done so repeatedly), leaving everyone who was using it without recourse. Second, we have no way of knowing that the company's algorithms are offering all the choices to all the people. How would you even know if your nonprofit or fundraising campaign wasn't being shown to those you were trying to reach? Third, Facebook owns this data and can tell us whatever it wants about it. Maybe $1 billion was given, maybe it was more, maybe it was less - how would we know?

There's an existing infrastructure for measuring giving in the U.S. and a number of research centers that analyze and report on those trends every year. That infrastructure - from 990 tax forms to The Foundation Center, Guidestar, the Urban Institute, and independent research from Giving Institute or the Lilly School at Indiana U - was built for the purpose of public accountability, to protect the democratic values of free association and expression, and for industry-wide performance improvement. This infrastructure is not perfect. But the data they use and their analytic methods can be checked by others - they can be replicated and verified following the basic tenets of sound scientific practice and good evidence practices for policymaking.

We need new ways to understand what's happening on these proprietary platforms - especially if Facebook is moving $1 billion and GoFundMe $5 billion. Those are big numbers for our nonprofit sector. We need to be able to interpret these data, not just reflexively believe what the companies announce.
     


Flipping assumptions about algorithms

I've had countless conversations with well-intended people from a number of social sectors and academic disciplines who are working on digital innovations that they firmly believe can be used to address shared social challenges. Some of these approaches - such as ways to use aggregated public data - are big investments in unproven hypotheses, namely that making use of these data resources will improve public service delivery.

When I ask these folks for evidence to support their hypothesis, they look at me funny. I get it: the underlying hypothesis that better use of information will lead to better outcomes seems so straightforward that asking for evidence feels odd. In fact, this assumption is so widespread that we're not only not questioning it, we're ignoring countervailing evidence.

And the evidence is plentiful: algorithmically driven policies and enterprise innovations are exacerbating social harms such as discrimination and inequity - from the ways social media outlets are being used to the application of predictive technologies to policing and education. Policy innovators, software coders, and data collectors need to assume that any automated tool applied to an already unjust system will exacerbate the injustices, not magically overcome these systemic problems.

We need to flip our assumptions about applying data and digital analysis to social problems. There's no excuse for continuing to act as if inserting software into a broken system will fix the system; it's more likely to break it even further.

Rather than assume algorithms will produce better outcomes and hope they don't accelerate discrimination, we should assume they will be discriminatory and inequitable UNLESS designed specifically to redress these issues. This means different software code, different data sets, and simultaneous attention to structures for redress, remediation, and revision. Then, and only then, should we implement and evaluate whether the algorithmic approach can help improve whatever service area it's designed for (housing costs, educational outcomes, environmental justice, transportation access, etc.).

In other words, every innovation for public (all?) services should be designed for the real world - one in which power dynamics, prejudices, and inequities are part of the system into which the algorithms will be introduced. This assumption should inform how the software itself is written (with measures in place to check for biases and remediate their amplification) as well as the structural guardrails surrounding the data and software. By this I mean implementing new organizational processes to monitor the discriminatory and harmful ways the software is working, and implementing systems for revision, remediation, and redress. If these social and organizational structures can't be built, then the technological innovation shouldn't be used - if it exacerbates inequity, it's not a social improvement.

Better design of our software for social problems involves factoring in the existing systemic and structural biases and directly seeking to redress them, rather than assuming that an analytic toolset on its own will produce more just outcomes. There is no "clean room" for social innovation - it takes place in the inequitable, unfair, discriminatory world of real people. No algorithm, machine learning application, or policy innovation on its own will counter that system, and it's past time to stop pretending they will. It's time to stop being sorry for or surprised by the ways our digital, data-driven tools aren't improving social outcomes, and start designing them in such a way that they stand a chance.
     

