Step into the light, they said
When the rich and powerful keep privacy for themselves and build systems of surveillance for everyone else.
Good morning, everyone! Before jumping in, I’d like to share a recording of my interview on data infrastructure and “consumer” data with community journalist Sam Oser for her show the Unconventional Journalist on All Real Radio. Please give it a listen!
All Real Radio is a local radio station founded in 2014 and based here in Houston’s Third Ward. Sam’s show reports on the movements that fight back across the Gulf Coast region. You can find more of her work on Instagram and Twitter @samthemullet. Thanks & enjoy 💜
When I worked with city governments launching open data programs with the Sunlight Foundation, a question we’d get all the time was: How do we know what should be private and what should be transparent? What’s the right balance? Basically, they didn’t want to let the “wrong” stuff loose.
I had an answer down but it was only ever a half-answer: “If you release something wrong or harmful and someone lets you know, you can consider that a favor. Better to know your data’s flaws than to use it without knowing.”
I believed, as a matter of principle, that governments should be transparent. But privacy matters too, when it comes to protecting the information of the people represented in the data.
I’ve noticed that people generally want to pit privacy and transparency against each other, like opposites, when really they are two sides of the same coin, or two points on the same spectrum.
Where we draw the line between privacy and transparency when it comes to sharing our data is an incredibly important question. For example: You might believe that the line should be drawn around you as an individual — that everyone should have the right to total privacy, even if it means shutting out community, government, or any other outside prying eyes. On the other hand, you might believe in radical transparency — that sharing everything with everyone could bring about more communal understanding and togetherness.
Obviously reality is and will always be somewhere in the middle. But often we’re not the ones deciding for ourselves (if we were, that might constitute having data rights). People with institutional and corporate power are often the ones drawing the lines around who gets to protect themselves from surveillance and who has to share.
Powerful people have already decided who deserves privacy, and it’s themselves.
In many of the conversations, roundtables, webinars, and panels that I’ve been to about the potential risks of surveillance from emerging technologies, people regularly point out that poor people or people with marginalized identities are the most at risk because they are the most surveilled by public programs, systems of policing, or consumer technologies.
One conversation stands out: in 2018, I joined a group of government policy and privacy experts at NYU to discuss a kind of safety label called the Trustable Technology Mark, which would work like a Fair Trade or Organic label to let consumers know whether technologies were safe.
The conversation ended on a pessimistic note. After lots of back and forth about the potential uses, benefits, or risks of the mark, one point of consensus in the room was that the trustmark might not actually solve the surveillance problem.
Looking at the Fair Trade mark, people pointed out that often wealthy people could afford Fair Trade goods and poor people couldn’t. In the same vein, we thought rich neighborhoods or households would end up with “trustable technologies” and everyone else wouldn’t.
I remember thinking of this as a far-off reality, something the world could become only if things got much worse: privacy for the rich, surveillance for everyone else.
And then, last week, I was talking to my friend Sam about Taylor Swift’s private jet (ICYMI: there’s a Twitter account posting Swift’s PJ flights to raise awareness about celebrity and billionaire carbon emissions). She told me that, in response to the backlash, Congress quietly passed legislation in May allowing PJ owners to be listed anonymously in public datasets, making TayTay’s PJ almost impossible to track.
But that’s not all! There is a ton we don’t know about the corporate world, where billionaires and other power players use their wealth or influence to buy privacy. One of my favorite creators on TikTok, who goes by the handle CancelThisClothingCompany, systematically analyzes the ownership of the products on grocery store shelves.
Using a combination of public information and inference, he makes “ownership sheets” which he posts on his website. I like that he shows us how easily private equity and corporate power players can hide their outsized influence on our day-to-day lives. Ironically, the slogan on his t-shirts is almost the same one we had on my team at Sunlight: “Information is Power”.
What I’ve realized is that the thing we had all feared in that 2018 workshop is already happening, and was already happening then. It’s not just scary because there’s so much we don’t know about the corporate world. It’s scary because the people hiding behind their right to privacy are the same people building surveillance systems that demand transparency from everyone else.
We can’t talk about the future of data rights without addressing the surveillance status quo.
In a 2018 research paper called “The Surveillance Gap: The Harms of Extreme Privacy and Data Marginalization”, authors Michele Gilman and Rebecca Green outline the ways in which people who aren’t surveilled by public systems can miss out on essential life-sustaining public services and are essentially considered non-entities by institutions.
The term in the title, “the surveillance gap,” frames the problem backwards: it describes gaps in government surveillance systems, suggesting that a lack of complete surveillance is why people fall through the cracks.
The paper points out that this framing is flawed and suggests that at-risk groups like houseless people, undocumented people, or day laborers deserve dignified pathways into public systems that build in privacy rights. But as we’ve seen, privacy rights already exist; they just exist for the wrong people.
The idea that people have to “come into the light” to receive services is a coercive logic, aligned with the reasoning the rich and powerful use to ensure they have the right to see everyone and everything while staying hidden themselves.
In my opinion, “we can only serve you if we see you” creates a sad excuse for a public safety net. I believe good food, secure shelter, and clean water should be available to everyone without condition.
That mindset is part of the harmful neoliberal bargain that treats surveillance as justifiable, instead of embracing principles of data rights or ideas like the right to opacity, and working to build consentful technologies from there.
The right to opacity is a term that was introduced to me by Yeshi Milner at Data for Black Lives, whose work on data justice is helpfully put in context in this chapter from Elizabeth Rodrigues’s “Collecting Lives.” The term originates with Édouard Glissant, who wrote:
“If we examine the process of ‘understanding’ people and ideas from the perspective of Western thought, we discover that its basis is the requirement for transparency. In order to understand and thus accept you, I have to measure your solidity with the ideal scale providing me with the grounds to make comparisons and, perhaps, judgments. I have to reduce.” (Poetics of Relation, 189-194)
He is saying, essentially, that the right to opacity preserves our right to difference: to differentiate ourselves, to preserve diversity, and to let our differences intersect rather than be assimilated into one dominant paradigm. Data for Black Lives and other thinkers in data justice, like Ruha Benjamin, use similar concepts to advocate not just for a right to privacy but for a right to dignity and embodied selfhood outside of data systems shaped by systems of oppression.
These are not only rights that we should protect for people who live at the intersection of systems of oppression; they are also rights that the people on top have already claimed for themselves.
Transparency should flow in the direction of power.
I have to credit this quote to Stephen Larrick, who was my first teacher at Sunlight and who helped me get started in the opengov world. The idea has really stayed with me because it’s relevant across so many contexts. (Thanks, Stephen.) And because in our world today, the rule is flipped.
It’s actually a simple distillation of how transparency and privacy can work together. People and institutions with more power should be more transparent. People without power have a right to defend themselves from being reduced even further by forced transparency.
As power dynamics shift, transparency and privacy can also shift. That’s why we need data rights, to upend this imbalance.
The shift toward power-informed data rights and a shift away from the status quo may be uncomfortable for some people, especially those in positions of influence or power who feel entitled to the data of working people.
Under the new paradigm, communities may choose to share data with each other as a form of power-sharing and trust. But institutions that don’t share power, like governments or universities, may be left out.
This opens up new possibilities for people who want to share data with each other as part of their work to build just and liberated communities. My rule of thumb: Transparency for people who share power with me, opacity for everyone else.