Introduction
A social issue exacerbated by digital media is the ostracism of people from marginalized racial and financial backgrounds as big tech companies expand and cater to upper-class audiences. The expansion of digital media targets people of lower socioeconomic status through the removal of public goods, policing via surveillance, and misrepresentation in artificial intelligence-based algorithms. This can be examined through how companies such as Meta and Amazon distribute their products across different groups of users in our society. The growth of these multibillion-dollar corporations has come at the cost of lower-income families losing access to public libraries, public healthcare, and even ownership of small businesses. Moreover, as big tech companies infiltrate society, they gain access to data on all users, introducing the issues of surveillance and invasion of privacy. People of color and those from marginalized backgrounds fall victim to surveillance because algorithms and security companies target them as outsiders or antagonists to the rest of society. In order to mitigate the reinforcement of systemic racism and bridge the informational gap between upper and lower economic classes, the public should create more opportunities for people of color to participate in the development of equitable technology by introducing specialized job roles and providing more free access to emerging technologies.
The Loss of Public Goods and the Expansion of Big Tech
The removal of public goods has a disproportionately negative impact on people of lower socioeconomic status. Public goods such as parks, libraries, and public transportation provide access to resources and opportunities that are essential for individuals and communities to thrive. Without these resources, people of lower socioeconomic status may have limited access to education, employment, and other opportunities, which can prevent them from reaching their full potential and deepen economic inequality. The lack of public goods can also make it difficult for them to participate fully in their communities, leading to social isolation and marginalization. Especially as public goods become privatized, it is worth noting that constitutional law does not yet address the legality of removing public goods from marginalized communities; instead, there are mainly legislative measures designed to protect corporations that aim to expand their positions in the economy (Sullivan, 1987). The removal of public goods adversely affects people of lower socioeconomic backgrounds because they must now pay for resources they once accessed for free, further widening the informational gap between the upper and lower financial classes.
Big tech companies find opportunities to profit off of crises, and at times they create some of these crises themselves. For example, they are known for selling surveillance wares to law enforcement, capitalizing on AI, and profiting from contact tracing technology in response to the COVID-19 pandemic (Noble, 2020). In “The Loss of Public Goods to Big Tech,” Safiya Noble argues that tech companies are undermining public institutions and the spread of reliable information, instead promoting false and harmful content. Noble then calls for investment in public health, education, and media, and for the dismantling of technologies that exacerbate inequality and injustice. Noble’s argument about the spread of misinformation and its contribution to the information gap is further supported by a Pew Research Center study, which found that Americans who primarily get their news on social media are less informed about current events and view them in a skewed manner due to racial and political bias ingrained in social media algorithms (Mitchell et al., 2020). The study includes the following:
“One specific example is exposure to the conspiracy theory that powerful people intentionally planned the COVID-19 pandemic, which gained attention with the spread of a conspiracy video on social media. About a quarter of U.S. adults who get most of their news through social media (26%) say they have heard “a lot” about this conspiracy theory, and about eight-in-ten (81%) have heard at least “a little” – a higher share than among those who turn to any of the other six platforms for their political news.” (Mitchell et al., 2020)
A deeper analysis of this data suggests that the audiences most susceptible to this conspiracy theory’s misinformation fall into lower socioeconomic statuses, because they lack access to sources of information that guarantee accuracy.
Manuel Castells continues this conversation in his piece “An Introduction to the Information Age.” He posits that the rise of the information age has increased social polarization and exclusion: while access to information and technology has expanded across society, it is geared mainly towards upper-class audiences, thus creating the informational gap and informational capitalism (Castells, 1997). He argues that informational capitalism has led to long-term unemployment and higher rates of incarceration, crime, and illness, because audiences below the poverty line are less likely to have access to vetted and reliable resources that can help them, as highlighted in the preceding example. Castells also proposes the concept of the network society, a byproduct of growing capitalism and economic change in which power dynamics have shifted significantly in a manner that puts racial minorities at a disadvantage (Castells, 1997).
A sense of the modern network society can be gathered from “Introduction: The Digital Edge” by S. Craig Watkins. In this piece, Watkins explores the digital divide and the toll it takes on economically disadvantaged students as the digital sphere continues to grow. The article focuses on Freeway, Texas, and its potential to be a technological exemplar for the future, or rather, a network society. Freeway’s history of housing youths from marginalized backgrounds also presented an opportunity to develop a more detailed understanding of the media practices forming in the daily lives of Black and Latino teens (Watkins, 2019). Additionally, Watkins explores the implications of this evolving digital media ecology for learning, opportunity, and social mobility. He also discusses the digital edge, which refers to the institutions, practices, and social relations that make up the daily and mediated lives of Black, Latino, and lower-income youth. Much of this was dictated by the media platforms that lower-income youth interacted with on a day-to-day basis. A similar notion is discussed in “Circuits and Consequences of Dispossession: The Racialized Realignment of the Public Sphere for US Youth” by Michelle Fine and Jessica Ruglis. They highlight that African American and Latino students are rarely accommodated by the American school system because, as a result of systemic bias, the government provides more public education resources to communities with primarily White or Asian students. They also argue that these discriminatory policies produce cumulative disadvantage for students of color and privileges for white, especially wealthy, students. These policies also have negative consequences in other sectors, including economics, health, and criminal justice, and create a racialized geography of youth development and dispossession that is often seen as the norm.
(Fine & Ruglis, 2009) In conjunction with that, a key takeaway from Watkins’s text is that the deprivation of public resources and facilities significantly impacted Freeway students’ ability to flourish in their trajectory towards scientific and technological advancement. According to the study in Watkins’s piece, students in Freeway were introduced to two new technology-based classes as part of a new curriculum promoting tools literacy. The implementation of the new curriculum was a source of learning and adaptation to newer technologies not just for the students, but also for the teachers responsible for the classes, because a lack of public resources had not previously given them affordable access to such amenities (Watkins, 2019). It can be concluded that the expansion of big tech has disadvantaged many people of all ages, because they lost affordable access to public resources, such as libraries, that once allowed them to learn from reliable sources.
Racial Policing and Data Surveillance
Another way in which people of color and lower socioeconomic backgrounds fall victim to big tech companies is through increased racial policing and surveillance, which limits their opportunities for housing and careers and increases their risk of being falsely incarcerated. This is discussed briefly by Noble, but expanded upon in “Prison Tech Comes Home” by Erin McElroy and others. The text encapsulates how surveillance technologies have been widely used to target and discriminate against people of color in their own homes. Throughout the COVID-19 pandemic, new surveillance systems—used by landlords, educational institutions, and employers—have converged, capturing new forms of data and exerting new forms of control in domestic spaces. Landlord tech is being used to discriminate against prospective and existing tenants who are people of color or previously incarcerated (McElroy et al., 2021). According to the text,
“As of March 2021, over 10 million renters couldn’t pay their rent due to COVID-19-related hardships; by June 2021, 5.8 million renters (14 percent of all renter households) were behind on rental payments, which added up to $20 billion national rental debt, according to the National Equity Atlas. Meanwhile, corporate landlords, including Blackstone, are following their familiar disaster-capitalist playbook to amass huge profits and develop new markets, for instance in commercial real estate. And landlord-tech companies are aiding them in this pursuit, extending corporate landlords’ reach, with catastrophic examples near and far.” (McElroy et al., 2021)
From this example, it can be gathered that capitalist gain in the landlord-tech economy comes at the expense of safe, affordable, and secure housing for people of color and lower socioeconomic statuses. Mass evictions took place as a result of invasive surveillance technology that tracked the activity of seemingly “disruptive” tenants and used baseless evidence to deem them unworthy residents. For instance, the authors note that “[...] companies such as Naborly added new features to their tenant-screening software, offering to track tenants unable to pay rent during and after COVID-19. Naborly’s private screening bureau aggregates data into reports of ‘delinquent tenants,’ which then get sold back to landlords. This service allows landlords to effectively blacklist people experiencing financial hardship and prevent them from securing future shelter.” (McElroy et al., 2021) This demonstrates that the expansion of surveillance technology puts people of color and low socioeconomic statuses at a disadvantage due to algorithmic and product-marketing bias.
While this form of discrimination exists where facial recognition technology detects people of color, there is also the problem of what happens when artificial intelligence misreads facial data from people of color. Some AI technologies do not recognize the faces of people of color as well as they recognize White features. And even when these faces are recognized, the algorithms rarely work in their favor, instead posing them as a threat or a target. This is presented in “Gender Shades,” a study examining how AI facial recognition technologies often misidentify and misgender people of color as a result of algorithmic bias. The consequences include misidentifying a suspect for a crime or inadvertently reinforcing harmful racial and gender stereotypes. The study evaluated facial recognition systems from Microsoft, IBM, and Face++ and revealed that all performed most accurately on subjects with lighter complexions, while showing higher error rates for subjects with darker complexions. The study declares: “The intersectional error analysis that targets gender classification performance on darker female, lighter female, darker male, and lighter male subgroups provides more answers. Darker females have the highest error rates for all gender classifiers ranging from 20.8% − 34.7%. For Microsoft and IBM classifiers lighter males are the best classified group with 0.0% and 0.3% error rates respectively. Face++ classifies darker males best with an error rate of 0.7%.” (Buolamwini & Gebru, 2018) It is important that facial recognition technology improve its accuracy and eradicate the bias in its algorithms, both to present factual information and to allow for fair representation of people of color. People of color and those of lower socioeconomic backgrounds have particular reason to be concerned about how much they are surveilled, because surveillance directly shapes how much they are oppressed or policed.
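To make the study’s method concrete, the intersectional analysis described above amounts to computing error rates separately for each skin-tone/gender subgroup rather than for the dataset as a whole, since an aggregate error rate can mask large disparities between subgroups. The following is an illustrative Python sketch; the function name and toy data are invented for demonstration and are not drawn from the Gender Shades dataset.

```python
from collections import defaultdict

def subgroup_error_rates(records):
    """Compute classification error rate per subgroup.

    records: iterable of (subgroup, predicted_label, true_label) tuples.
    Returns a dict mapping each subgroup to its error rate.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for subgroup, predicted, actual in records:
        totals[subgroup] += 1
        if predicted != actual:
            errors[subgroup] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical toy data: the overall error rate here is 25%, but the
# per-subgroup breakdown reveals that all errors fall on one group.
sample = [
    ("darker_female", "male", "female"),
    ("darker_female", "female", "female"),
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
]
print(subgroup_error_rates(sample))
# → {'darker_female': 0.5, 'lighter_male': 0.0}
```

This is the sense in which disaggregated evaluation “provides more answers”: the same accounting that produces a single headline accuracy number can, when grouped, expose exactly which populations a classifier fails.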
Bridging the Informational Gap and Building Equitable Technology
Issues such as the removal of public goods and racial policing via data surveillance are incredibly harmful to people of color and those of low socioeconomic status because they reinforce the cyclical framework of racial discrimination and social divide. In order to bridge the informational gap and integrate upper and lower economic classes into the growing digital atmosphere, the public should create more opportunities for people of color to participate in the development of equitable technology. In his piece “Racial Formation, Inequality and the Political Economy of Web Traffic,” published in Information, Communication & Society, Charlton McIlwain discusses the urgent need to understand how racial inequality is produced and systematically propagated on the internet, rather than focusing only on how people use online platforms for race-based or racist goals. He proposes using racial formation theory, which looks at the historical context and circumstances that shape how race is understood and represented, to understand how the internet produces tangible forms of race-based inequality (McIlwain, 2017). This proposal is imperative to the creation of future technologies and the reform of current ones, because it allows for a deeper understanding of the racial bias embedded in current algorithms so that it can be removed in later iterations. McIlwain also states,
“[...] once we know that segregation and disparate value exist in the online environment, we know what questions remain for research to ask and answer to fully determine whether and how racial inequality may get produced on the web. These remaining necessary, and most significant questions are these: what actual value do site rankings possess? What traffic advantage(s) are to be gained from having higher site rankings? What disadvantages(s) are there to having lower site rankings? Finally, what are the real implications of, or–differently stated–what is the tangible ‘harm’ for a site (and, presumably its human owner(s)) that is disadvantaged in this traffic network?” (McIlwain, 2017) This is insightful because once algorithmic bias is acknowledged and researched, it will be easier for future developers to redirect the narrative and produce more equitable technology.
Another beneficial solution would be to reintroduce public goods such as libraries, along with additional resources, to more communities in order to encourage the use of technology by people of all backgrounds. This would minimize the informational gap currently present in society as a result of the expansion of big tech and its direct correlation with the removal of free access to educational materials. Julia DeCook, in her piece “Tech Will Not Save Us: The Subjugation of Politics and Democracy to Big Tech,” discusses the relationship between digital infrastructure and politics and its impact on several racial groups. She discusses the harms inflicted on Asian communities as a result of misinformation spread during the COVID-19 pandemic, as well as the fact that big tech companies such as Apple, Alphabet, and Meta actively avoid taking responsibility for extremist content circulating on their platforms. She therefore argues that big tech companies play a significant role in eroding democratic rights in the United States, one of those rights being access to reliable means of information (DeCook, 2020). While many public libraries throughout the United States have been replaced by private institutions, it would be beneficial to reintroduce public libraries in a different manner. For example, these libraries could provide workshops for low-income students to educate them on new technological tools for adapting to the evolving digital sphere. Another key part of this proposal would be increasing access to free or affordable WiFi. While this can be seen in New York City through the addition of LinkNYC hubs for free WiFi access, it would be far more influential if something like this were available to low-income people in non-urban areas throughout the United States.
Lastly, it is important to incorporate the ever-evolving technology industry into our nation’s laws and legislation. For example, as mentioned earlier, surveillance technology’s invasive nature has resulted in several false incarcerations, while also displacing people of color from their homes through landlord tech-related evictions. Policymakers should propose a bill that protects the rights of those who are surveilled, while also being transparent about what kind of data is being collected. Subjects of surveillance technology should have the right to refuse having their data collected for the sake of their privacy, especially if they are being targeted due to discrimination. Policymaking is also needed to mitigate the monopolization of big tech companies as they profit from social and global crises. Iason Gabriel’s piece “Toward a Theory of Justice for Artificial Intelligence” provides discourse on the relationship between artificial intelligence and principles of distributive justice. The central argument holds that the basic structure of society should now be understood as an entity that coexists with evolving technology. Accordingly, the norms and laws of justice that apply to society in general should also be applied to digital infrastructure and artificial intelligence (Gabriel, 2022). These norms should require that AI systems meet a certain standard of public justification, support citizens’ rights, and promote fair outcomes for all users, especially those already at a disadvantage at the hands of the law.
Therefore, in order to mitigate the cyclical reinforcement of discrimination against marginalized communities, we should promote increased engagement of people of color in technological development to allow for diverse inputs grounded in racial formation theory. In addition, reintroducing public goods that have been privatized back into communities will help bridge the informational gap between the lower and upper financial classes. Ultimately, the goal is to ensure that every community, regardless of financial status, has access to and knowledge of pivotal technological developments in order to promote equity. Lastly, changes in policymaking need to take place to accommodate growing technologies and the societal shifts that occur as a result of them.
Conclusion
The expansion of big tech companies and the removal of public goods have had a negative impact on marginalized communities, particularly those of lower socioeconomic statuses and people of color. This has contributed to severe economic inequality, a widening informational gap, the reinforcement of systemic racism, and the issues of data surveillance and racial policing. To address these issues and bridge the informational gap between upper and lower economic classes, it is important to create more opportunities for people of color to participate in the development of equitable technology and to provide more free public access to emerging technologies. It is also necessary to invest in public institutions such as health, education, and media in order to dismantle technologies that exacerbate inequality and injustice. This can be done through changes in United States policymaking that keep pace with expansions in the big tech industry while also ensuring the safety and security of those from marginalized racial and socioeconomic backgrounds. By addressing these issues, we can work towards creating a more equitable and just society in conjunction with the growing digital sphere.
Bibliography
Brown, Sara. 2020. “3 Ways to Make Technology More Equitable.” MIT Sloan. https://mitsloan.mit.edu/ideas-made-to-matter/3-ways-to-make-technology-more-equitable
Buolamwini, Joy, and Timnit Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Proceedings of Machine Learning Research 81:1–15. Conference on Fairness, Accountability, and Transparency.
Castells, Manuel. 1997. “An Introduction to the Information Age.” City.
DeCook, Julia R. 2020. “Tech Will Not Save Us: The Subjugation of Politics and Democracy to Big Tech.” International Journal of Critical Diversity Studies 3(2):73–79. DOI: 10.13169/intecritdivestud.3.2.0073
Fine, Michelle, and Jessica Ruglis. 2009. “Circuits and Consequences of Dispossession: The Racialized Realignment of the Public Sphere for US Youth.” Transforming Anthropology.
Gabriel, Iason. 2022. “Toward a Theory of Justice for Artificial Intelligence.” Daedalus 151(2):218–231. https://www.jstor.org/stable/48662037
McElroy, Erin, Meredith Whittaker, and Nicole E. Weber. 2021. “Prison Tech Comes Home.” Public Books.
McIlwain, Charlton. 2017. “Racial Formation, Inequality and the Political Economy of Web Traffic.” Information, Communication & Society 20(7):1073–1089. DOI: 10.1080/1369118X.2016.1206137
Noble, Safiya. 2020. “The Loss of Public Goods to Big Tech.” Noēma Magazine.
Sullivan, H. J. 1987. “Privatization of Public Services: A Growing Threat to Constitutional Rights.” Public Administration Review 47(6):461–467. https://doi.org/10.2307/975887
Watkins, S. Craig. 2019. “Introduction: The Digital Edge.” In The Digital Edge: How Black and Latino Youth Navigate Digital Inequality. NYU Press.