From same-day delivery access to internet availability, algorithms built with problematic data are influencing our daily lives and exacerbating structural racism and inequality. The Undesign the Redline @ Barnard Symposium aimed to shed light on a serious and often-overlooked issue: digital redlining.
On November 19, the Undesign the Redline @ Barnard Symposium hosted “Digital Redlining,” a discussion about data science, inequality, and the impacts of redlining. The panel was moderated by Saima Akhtar, Associate Director for the Vagelos Computational Science Center at Barnard.
Akhtar first introduced Greta Byrum, program director for the Social Science Research Council’s Just Tech program and co-creator of the New School’s Digital Equity Laboratory. The next speaker was Dr. Chris Gilliard, a writer, professor, and speaker whose work focuses on the intersections of race, class, and technology. Finally, Emmanuel Martinez, a data reporter for The Markup, spoke about mortgage discrimination.
The term “redlining” refers to a neighborhood-ranking system created by the Home Owners’ Loan Corporation (HOLC) in the 1930s. The HOLC constructed color-coded maps to illustrate mortgage lending risk, and “high risk” neighborhoods were outlined in red. These neighborhoods tended to have a larger population of low-income families and people of color. This practice has since become illegal, but the effects of redlining are still seen today in a variety of ways, from the location of hazardous waste facilities to the availability of parks and other green spaces.
As the world becomes increasingly digital, the effects of redlining have infiltrated algorithms and data. As Byrum explained, the “underlying racism of code” contributes to “algorithmic oppression,” and this is not a glitch.
“The algorithms themselves are built using problematic data, and they go on to predict problematic risk assessments,” Byrum said.
During a pandemic, the effects of digital redlining can be matters of life and death. Gilliard stated that disparities in who can get medicine delivered to their doorstep, who can access information about healthcare, and who can schedule a vaccine appointment online show that the legacy of the HOLC’s redlining practices is still very much alive.
“The placement of surveillance in communities often falls along these same lines,” Gilliard said. “It maps directly onto a lot of the same maps that we would see if we looked at HOLC maps on a particular area.”
Martinez also emphasized that, even more than 50 years after the passage of the Fair Housing Act of 1968, people of color are still more likely to be denied mortgages than white people. The risk calculations used in these lending decisions rely on algorithms that repeat the practices of the past rather than accurately predicting the future.
“What all these algorithms are doing is compounding the systemic inequities, the systemic and structural redlining that was once legal, and repeating it back,” Martinez said.
During the Q&A portion of the panel, the speakers were asked what gives them hope when it comes to the problem of digital redlining.
For Byrum, seeing members of communities in Detroit and the Bronx working together to install solar power in their neighborhoods and claim ownership of their local infrastructure is a source of inspiration.
Martinez said that, for him, hope lies in the fact that conversations about how data and algorithms are not objective are happening more often than ever before.
Gilliard, who advocates for large-scale action by a collective to motivate companies to change their practices, said, “The ways in which people are banding together, whether that’s people who live in an apartment building, civil society groups, or collective bargaining and unions at tech companies, are really important and really good.”
Just like the HOLC’s redlining practices, digital redlining is rooted in structural racism and inequality, and it influences the opportunities, amenities, and services that individuals can access. Data scientists are typically not experts in people’s lived experiences, and as long as problematic data is used to inform risk assessments and other algorithms, the results will continue to compound the harmful legacy of redlining.
“To undesign this problem,” Byrum said, “we need to turn toward the messy and the real, and where things are felt on the ground… We need to actually build a different internet.”