We need to talk about 311 data
Data from citizen reporting platforms like 311 has biases that could create disparities in how local governments distribute public resources. So why are they still using it?
When local governments go digging for data they can analyze to improve public services, they often end up relying on datasets loaded with bias without ever taking a second look.
311 is a non-emergency phone line that lets residents report issues like broken streetlights, missed street cleaning, potholes, overgrown vacant lots, and other generally low-level nuisances. In any city with a 311 line, the resulting data catalogs these complaints.
People like to work with 311 data because it can be updated automatically on a daily basis, and each request is tied to a specific time, location, and service code, which makes it easy enough to analyze.
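As a rough illustration, a first pass at that kind of analysis might look like the sketch below. The file and column names (created_date, complaint_type, neighborhood) are assumptions; every city publishes a slightly different schema.

```python
import pandas as pd

# Minimal sketch of summarizing a 311 export by complaint type and neighborhood.
# File and column names are assumptions; adjust to your city's actual schema.
df = pd.read_csv("311_service_requests.csv", parse_dates=["created_date"])

# Daily volume by complaint type
daily_counts = (
    df.groupby([df["created_date"].dt.date, "complaint_type"])
      .size()
      .rename("requests")
      .reset_index()
)

# Which complaint types dominate in each neighborhood
by_neighborhood = (
    df.groupby(["neighborhood", "complaint_type"])
      .size()
      .rename("requests")
      .reset_index()
      .sort_values("requests", ascending=False)
)

print(daily_counts.head())
print(by_neighborhood.head())
```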
In theory, it should be the perfect way for local governments to learn what’s on residents’ minds and get a general indication of neighborhood well-being: analyze the 311 data, and you understand what your residents want and need.
But residents don’t just use 311 to report potholes and vacant lots.
People also use 311 to report houseless encampments, graffiti, noise, and loitering. As a result, 311 data is loaded with racial bias that often goes unexplored and overlooked when local governments use it to understand local needs.
It’s important that 311 programs exist: they reduce the burden on 911 responders by diverting non-emergencies, and they give residents a way to seek help with everyday issues.
But the data is too biased to shape how governments deliver services or design policies, and few local governments or experts are raising enough awareness to keep this bias from reinforcing the inequities in how public services are designed and distributed.
Too many local governments are using 311 data at face value.
Local governments tally up the number of potholes to be fixed, look into areas with high counts of noise complaints, and try to clean the streets where people report trash. In a very tangible way, local governments use this data to prioritize where they provide services across their towns and cities in real time.
Experts reinforce this practice by encouraging local governments to analyze 311 data to shape public services, including by using artificial intelligence to analyze the data and respond to resident requests, despite core issues with the data.
But multiple studies have shown that 311 data is a more accurate predictor of geographic breakdowns of class, race, and neighborhood change than it is a predictor of a community’s actual needs.
While, on the surface, it is a good thing that local governments use 311 to quickly respond to residents’ needs, research by Joscha Legewie and Merlin Schaeffer in 2016 showed that a higher density of 311 calls comes from transitional zones where neighborhoods are in the process of gentrifying, and White residents come into contact with their new neighbors who are people of color.
Research by Ariel White and Kris-Stella Trump in 2016 also showed that participation rates in 311 are negatively correlated with similar low-cost civic engagement activities like turning out to vote or returning the Census, and highly correlated with high-cost activities like contributing to political campaigns.
Taken together, these studies paint a picture of who is using 311 the most to express their concerns and, by all appearances, it’s mostly rich White people.
Since 311 intake doesn’t include collecting demographic data about the callers, there isn’t a way to prove exactly how many people of different racial and economic backgrounds are calling in. These trends could vary across cities.
But we do know that across the country, Black communities, historically and presently, face organized abandonment and divestment from basic infrastructure needs like drainage and street cleaning. In fact, we have mountains of data other than 311 that prove that there are significant non-emergency needs in most low-income communities of color.
As I mentioned in my previous writing on dashboards, local governments often already have data that shows them existing lines of inequality and the resulting disparities. (See also: my favorite series from Houston’s One Breath Partnership on recurring geographic patterns of inequality like the Houston Arrow.) Leaning on 311 data to understand residents’ needs instead reinforces societal disparities in how public services are distributed.
So why does 311 data continue to appear as a reliable indicator of community need?
Most local governments don’t have a way to vet data for bias.
Organizers of a New York City Data Jam in 2017 posed a question that few governments ask to evaluate bias in 311 data: “Is this where the streets are in the worst condition, or is this where people are complaining about the streets the most?” (Thanks to the good people at BetaNYC, the materials are open, so you can explore the 311 data challenge yourself.)
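One rough way to pressure-test that question is to put complaint counts next to an independent measure of street condition and see whether they actually move together. The sketch below assumes two hypothetical per-neighborhood files: a tally of 311 street complaints and a pavement condition index (PCI) from an engineering survey.

```python
import pandas as pd

# Hypothetical inputs: 311 street complaints and an independent pavement
# condition index (PCI, 0-100, higher = better), both per neighborhood.
complaints = pd.read_csv("street_complaints_by_neighborhood.csv")   # neighborhood, complaints
conditions = pd.read_csv("pavement_condition_by_neighborhood.csv")  # neighborhood, pci

merged = complaints.merge(conditions, on="neighborhood")

# If complaints tracked need, volume would rise as pavement condition falls.
print("Correlation between complaints and condition:",
      round(merged["complaints"].corr(merged["pci"]), 2))

# Neighborhoods with poor streets but few complaints: likely under-reported need.
under_reported = merged[
    (merged["pci"] < merged["pci"].quantile(0.25))
    & (merged["complaints"] < merged["complaints"].median())
]
print(under_reported[["neighborhood", "pci", "complaints"]])
```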
Tracking bias in local government data would first require better documentation, or metadata, on how data is collected, formatted, and analyzed. Then, data users could answer questions about the quality of the data for themselves.
For example, a city planning survey might show that a majority of residents answered “Yes” to a question indicating a desire for the city to build more parking spaces. But after looking at the metadata, you might see that the survey question actually read, “Would you like more parking spaces on your street to address the recent increase in parking violations in your neighborhood?”, a framing that biases responses toward a single proposed solution to a wider problem that respondents may simply want resolved.
Basically, your interpretation of data can always change based on the data’s context. All data carries some bias. So it’s important to understand that bias and document it for people who are going to use the data in the future.
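Even a lightweight, dataset-level record can capture that context for future users. The sketch below is purely illustrative; the fields are my own assumptions, not any particular metadata standard.

```python
# Illustrative dataset-level metadata record documenting collection context and
# known bias. Field names are assumptions, not a specific metadata standard.
metadata = {
    "dataset": "311_service_requests",
    "collection_method": "Resident-initiated reports via phone, app, and web form",
    "update_frequency": "daily",
    "known_limitations": [
        "Reflects who chooses to report, not the full distribution of need",
        "No demographic data is collected about reporters",
        "Complaint density correlates with gentrification and reporter privilege",
    ],
    "questions_for_users": [
        "Is this where conditions are worst, or where people report the most?",
        "What independent data (inspections, surveys) can validate these counts?",
    ],
}
```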
The CIVIC Data Library of Context is one project attempting to expose bias through metadata and encourage more ethical practices of data governance. In my team’s recent work for the Beeck Center’s Data Labs program, we brought experts from CIVIC in to speak with state government data officials about best practices for identifying and mitigating bias in their data.
State and local governments should be investing time and resources to ensure that data is well documented, that steps have been taken to reduce bias, and that data is used in ethical and equitable ways. When it comes to addressing the bias in 311 data, we have to go a step further and understand the real-world forces that create bias in the data.
Is 311 even working as a non-emergency helpline or has it been co-opted for neighbor-on-neighbor policing?
311 is unequivocally a tool that matters for people to have access to their local government. But there’s plenty of unpacking to do around why one person would trust a 311 helpline, while another wouldn’t.
People advocating for cleaner streets or better quality of life in communities of color might have lost trust in local governments’ responsiveness to their needs after decades of divestment. But people belonging to privileged groups like homeowners, campaign contributors, and gentrifiers might still use access to authority to wield power.
Take Legewie and Schaeffer’s research on contested boundaries, which shows how White residents use systems like 311 to get authorities to reinforce the social and structural boundaries between themselves and their non-White neighbors.
Society’s tacit support for monitorial citizenship through platforms like Nextdoor, despite its issues with racist reporting, has created an environment that is outright dangerous for people of color.
As scholar Apryl Williams noted in a discussion of her work last year, this culture has allowed White complainants to “perform racialized fear” and create potential life-or-death consequences for Black people and communities. As a result, cities like San Francisco have passed legislation classifying racist calls to 911 as a hate crime to deter monitorial behavior that could end in police violence.
In New York, a report by the city comptroller found that the city’s use of 311 to enforce social distancing led to a majority of summonses and arrests being issued in low-income communities of color. And while most 311 websites don’t direct people to report issues like houseless encampments, many people report them anyway.
For example, in 2021, the City of Boston received 342 complaints about tent encampments, which the data shows were logged as “information” rather than as service requests. That information could then be used by law enforcement to clear encampments in the future.
But the same Boston 311 dataset from 2021 showed that while some “concerned citizens” were reporting their neighbors for parking their cars incorrectly or letting their dogs off leash, a majority of people were trying to report things like accessibility signage missing from public pathways or unsanitary living and working conditions.
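If you wanted to reproduce that kind of breakdown, the basic move is to map case categories to whether a report targets people or conditions. The labels below are illustrative assumptions; Boston’s actual case titles and codes differ and would need to be mapped by hand.

```python
import pandas as pd

# Sketch of splitting 311 requests into reports about people versus reports
# about conditions. File name and category labels are illustrative assumptions.
df = pd.read_csv("boston_311_2021.csv")

about_people = {"Tent Encampment", "Loitering", "Noise Complaint", "Unleashed Dog"}
about_conditions = {"Pothole", "Missing Accessibility Signage",
                    "Unsanitary Conditions", "Streetlight Out"}

def classify(case_title):
    if case_title in about_people:
        return "people"
    if case_title in about_conditions:
        return "conditions"
    return "other"

df["report_target"] = df["case_title"].map(classify)
print(df["report_target"].value_counts())
```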
I can’t say for sure what a more democratic 311 system would look like, but it could start with local governments stepping up to enforce rules against racist or discriminatory 311 reporting in the first place. It could start with making sure that 311 programs are actually helping the people who need them the most.
Our communities are in need of a serious reckoning to shift the meaning of public safety away from surveillance and toward community care in general. By beginning to address the roots of the biases influencing 311, the data might, maybe, possibly, one day become all it’s cracked up to be.
Have a relevant data project to share? Want to be connected with others working on this issue?
LINKED THINKING
When Race Leads to a Call to 311… - Coverage in the Atlantic on the demarcation of boundaries between racially homogenous neighborhoods emerging in 311 data.
White Surveillance and Black Digital Publics - Dr. Apryl A. Williams and Dr. Allissa V. Richardson on White vigilante-style surveillance of Black people in public spaces and the digital artifacts that contribute to collective action in response to this surveillance.
Peoples Tech Assemblies: Democracy and Access - Fireside chat on engaging in local government, hosted by BetaNYC in collaboration with the NYC Office of the Public Advocate.
From Data Criminalization to Prison Abolition - “A dominant mode of our time, data analysis and prediction are part of a longstanding historical process of racial and national profiling, management and control in the US…”
This "neighbor-on-neighbor policing" is a real problem. In general I'm a big supporter of online anonymity – people being able to speak their piece in public without disclosing their identities.
But when it comes to reporting other people's activities to government officials in ways that could get them arrested, cited, towed, or have other negative and potentially life-threatening consequences, I think we need to draw the line.
The right to confront your accuser in court is a cornerstone of the justice system for a reason, and setting up a system that allows people to make harmful and sometimes false accusations without any personal consequence is ripe for abuse.
This is super thorough work! Thank you for bringing this information to the forefront. I've been an advocate for using 311 to report on lack of heat or hot water, but didn't know how extensive the 311 reporting system was until I read your newsletter.