One of the theses of the Digital Civil Society Lab is that digital policies matter to civil society. We've been working since 2013 to map and understand the intersections of laws and regulations on telecommunications, intellectual property, consumer privacy, digital rights and liberties, and free speech with laws on nongovernmental organizations, nonprofits, and philanthropy (in the U.S. and 9 other countries around the world).
We want to understand these domains and their intersections to

- inform our theoretical understanding of digital civil society,
- identify partners and allies around the globe working on related issues, and
- connect "digital" and "civil society" advocates and researchers to each other.

All of our work is geared toward making space - literally, figuratively, legally, and technologically - for civil society when our digital spaces are owned by corporations and overseen by governments. We're trying to create and protect park benches on the internet where people can meet, talk, and organize.
I'm about halfway through Jennifer Granick's book, American Spies, and I find myself thinking that maybe all of the above has just become a small subset of surveillance activities. The growth of the surveillance state, its transnational capacity, and the ties between state and corporate actors are so extensive that perhaps we've already lost any digital space in which we can have private conversations. If this is true, then there is no room for association beyond the purview of the state. This is troubling. Civil society depends on this associational space being widely available (and not just to the elite few who can pay for or hack their way to privacy), and democracies depend on civil society.
I will march, protest, call my representatives, vote, mobilize, and resist. As of Jan 3, 2017 the U.S. government - House, Senate and soon to be White House - is taking a broad swath of actions that I do not support and will not allow to happen in my name. I will do everything I can to let elected officials know that, to resist their actions, and to work toward democratic representation at the federal level that mirrors the votes and political demands of the majority of U.S. voters.
But I am not going to do this digitally. I can't.*
Why, you ask? Aren't you, Lucy, getting emails and tweets and text messages galore about petitions to sign, groups to join, emails to send, and hashtags to use?
Yes, I am. More than I can count.
And the vast majority of them want me to sign up, send them my friends' email addresses and my cell number, or follow them on Facebook to learn more and participate. I won't do it.
First of all, I don't use Facebook. Second, while there's good reason to believe that many of these requests and calls to action are coming from legitimate groups, whose missions I support, and to whom I might give my (but not ever my friends') contact information, there's also good reason to assume otherwise. The otherwise takes at least two forms: 1) the legitimate nonprofit or political group is using third-party software to collect my name and cell number, and that software company is going to package up my personal info - sure, they'll sell it somewhere, but, more important, I know they'll hand it over when the government asks for it and there's nothing I can do about it; or 2) the whole thing is just an email/cell phone farming exercise wrapped in the guise of issues I care about.
It's not just that I don't want commercial companies holding all that information on me. I am working to resist the policies of my government. The U.S. government has access to all of that information once it's online. Yes, I will hit the streets to protest. But I don't plan to call the police or immigration services or Donald Trump and tell him my plans, where I will be when, and with whom. And I don't intend to do the digital version of that and hand the very forces I'm resisting the equivalent of that information in fine-grained digital form.
If you want to know how to deal with this reality regarding your own data and ability to take action, then I suggest reading Dragnet Nation and everything else Julia Angwin has ever written for ProPublica, using the materials from EFF's Surveillance Self-Defense, and checking out this blog post that points you to other wonderful tools for being smarter about your digital self. Take a training, ask an engineer, attend a cryptoparty, ask a librarian, find another way.
For the political groups, the coalitions and nonprofits, the march organizers and the rally folks - your job is just as important. Don't make me vulnerable to digital enclosure - give me options I can trust in order to work with you. Are you using Facebook for all your outreach? Then count me out.
The challenges are numerous and the questions are tough. Some answers exist - check out digitalIMPACT.io and help get more answers and more tools to more people and organizations, sooner. We need this. These data threats may well be the biggest risk civil society and independent nonprofits now face. What's your digital risk mitigation strategy?
Yes, we can use digital tools to help us protest and resist, to organize our communities, to make good philanthropic investments, and to reestablish a democratic government that represents the majority of voters. But first, we need to design and use digital strategies and data models that align with our democratic and philanthropic missions.
*Yes, I get the irony of blogging this on software owned by Google. Think about what info I've shared here and what I haven't.
The instructions "do not fold, spindle or mutilate" used to accompany IBM punch cards, a ubiquitous technology for capturing and storing data for computational purposes up until the late 1980s.
As colleges and universities began to computerize their student records, many people felt they'd become just a number, just some data in a big machine. Of course, many people - whole demographic groups - had long been familiar with this phenomenon. Some of our worst moments in history include government/business alliances that used "data on people" for a variety of harmful purposes. A single century provides examples ranging from the passbook requirements for Blacks in apartheid South Africa to the stars forced on Jews in Nazi Germany, the internment camps for Japanese Americans in World War Two, and the government files kept on American citizens during the McCarthy era and the civil rights movement.
Data on people can be used for good (improving health care, educational opportunities, tracking environmental refugees, enfranchisement, targeted advertising) or evil (discrimination, elimination, disenfranchisement, targeted advertising).
The Free Speech movement of the 1960s co-opted the instructions to "not fold, spindle or mutilate" to apply to the humans captured in the data, not just the punch cards.
Nowadays, we are (or should be) aware that both businesses and governments are collecting data on us in ways so pervasive and passive as to make punch cards seem quaint. We also know that we have been complicit in making our data available freely - often in exchange for search functionality, social media connections, retail discounts, or two day free shipping.
Given this knowledge, people who are preparing to work with data - in any capacity - need to think about the ethics of what they're doing. This last week saw the rise of the NeverTech manifesto, in which tech company employees from across the spectrum vowed not to help build Donald Trump's Muslim registry (#NeverAgain.Tech). Other tech executives are signing on to commitments to civil liberties. These statements are important, but, really, they are more of a floor than an aspirational ceiling. Refusing to participate in building tools to facilitate discrimination - tools that defy the very principles of religious liberty on which the U.S. was founded 200+ years ago - hardly lives up to technologists' self-image as disruptive, risk-taking creators of the future.
The generation of digital tools on which we now depend - social media, search, mobile - are neither innovative nor groundbreaking as long as they continue to destroy our ability to speak freely, to assemble peaceably, and to learn, think, and act privately. They are lazy first-generation solutions, avoiding the tough issues of personal agency, liberty, privacy, and civil rights.
We the people who are the digital data, who are excited about its possibilities, and who are dedicated to taking advantage of it must be the ones to dismantle liberty-destroying, pervasively surveilled networks and unaccountable third-party landgrabs over our digital selves. We must be the ones to fight for encryption as a fundamental bulwark of civil society, to take on the difficult engineering tasks that encode and protect personal privacy in pursuit of public benefit, and to invent digital systems that align with and extend humanity's highest aspirations for life and liberty.
We need bold action now to make the digital realm align with the principles of justice, freedom, individual action and collective good that centuries of humans have fought to codify in our most principled democracies. To give up on the former is to destroy the latter.
To defer to decades-old business models, special-interest-influenced governance protocols, or difficult engineering challenges is to default on the opportunities we face, to walk away from enticing computing challenges and disruptive possibility, and to choose business as usual. Focusing our best minds and our creative capital on digital tools that destroy civil liberties and threaten employment opportunities, while ignoring those that would conserve our natural resources and enhance human dignity, is to hasten our demise as free, peaceful people.
All of us - creators and users of digital tools - need to get out from behind our willful blindness and acknowledge that How We Use Digital Data is as important as what we do with it. Our digital lives depend on the ethical choices we bring to - and that we demand of - the digital spaces that are substructural to our daily actions. We must now take to the streets, to the classrooms, to our open plan workspaces, to our lawmakers, and to the board rooms to protect our digital rights and enhance our humanity.
We, and only we, can lead us into an era in which our human, civil, and democratic rights are protected in digital spaces by design and by default.
- People need to protect themselves and demand protections in the products they use and from the companies they buy from.
- We need to insist on government action that aligns with the founding principles of democracy and doesn't toss them aside in favor of cowardly falsehoods about national security or economic competition.
- Organizations and individuals need to use their market power to demand digital products they can use without compromising their social missions.
- Tech companies, hardware/software makers, telecommunication firms, and app designers need to lead - and be rewarded for - person-protecting consent, privacy, and security practices, transparency, and auditability.
- Business people need to stop resting on incumbent exploitative revenue models. Now is the chance for true innovators to demonstrate an ability to produce economic value in line with human and democratic values.
I'm bingeing again on the Raw Data podcast (Season 2!) and several of the episodes - plus the reflection on season one - make it clear that there are lots of ways to think about digital data.

What is digital data to you? This is a metaphysical question, but one that may help you think about using digital data safely, ethically, and effectively at your organization. I've been mulling over this question for a while, and it seems there are many ways to conceive of the value and role of digital data to you and your organization:
- As resources, like time or money
- As assets (and liabilities)
- As relationships
- As a context or place
- As a lifecycle
- As a multiplier or expansion strategy
- As ones and zeros, a binary language of representation
Different people think about digital data in different ways. Someone involved in fundraising may see the digital data held in the organization's CRM system as evidence of the relationships they manage. The IT staff may see digital data as a cycle of responsibilities and vulnerabilities. Communications experts may think of online as a place or a context. Program staff may wonder how data can be used for greater reach or deeper insights. (I'm not sure how these different roles line up with these different mental maps - it might be an interesting thing to ask your colleagues.)
How you think about digital data (and how your colleagues do) can inform who needs to do what when you're thinking about your foundation's or nonprofit's data management and governance responsibilities.
This year's Blueprint includes several worksheets you can adapt to your organizational needs - to think about what data you have, what skills you need, and how data can help, or hinder, your pursuit of mission. Check out the worksheets here.
And let me know - how do you think of digital data?