Internet: The Machinery of Global Division


RYAN HOOVER


In the spring of 2014 in Fort Lauderdale, Brisha Borden and Vernon Prater were charged, in separate incidents, with petty theft of roughly $80 worth of goods. Both were assigned algorithmically generated risk scores on a scale of 1 to 10, used by the local criminal justice system to estimate the likelihood of re-offending. One-time misdemeanor offender Borden received a 9 (high risk), while convicted armed robber Prater received a 3 (low risk) (Angwin et al.). If the internet gives us such innovative technologies as predictive algorithms, why are different groups’ experiences with those technologies so starkly unequal? In her 1985 piece “A Cyborg Manifesto,” Donna Haraway examines the tensions between technology and race, gender, and social status. In her 2016 article “How an Archive of the Internet Could Change History,” Jenna Wortham considers the internet’s potential as an equalizer for marginalized groups. Both Haraway and Wortham believe that the internet and similar technologies are an impetus for the enrichment of human, and cyborg, kind. Their hopes, however, are overly idealistic because they overlook two key obstacles. First, they neglect the reality that internet access is asymmetrical, constrained by both physical and societal barriers. Second, they overlook the fact that internet infrastructure itself is racially biased, because people of color are underrepresented in the development and testing of its algorithms.

Donna Haraway’s manifesto claims that technology will equalize people across lines of race, gender, and social status. She introduces and develops the idea of the cyborg, a postmodern and postgender being in which humanity and technology merge. In doing so, Haraway challenges the suffocatingly restrictive social constructs of gender, race, and class, proposing the cyborg as a “powerful infidel heteroglossia” (Haraway 68). Her use of the term “heteroglossia” implies that the cyborg symbolizes a new language, one that usurps the sexism, racism, and classism of conventional society. Interestingly, the Encyclopedia of Postmodernism defines “heteroglossia” as the understanding of language as governed by two opposing forces: one dominant, central, and centripetal; the other dominated, peripheral, and centrifugal (Taylor and Winquist). This definition is noteworthy because it connects the structure of language to the structures of oppression at work in present-day society. Haraway’s “infidel heteroglossia” envisions the new cyborg language, one that intermingles humanity and technology, drawing all people together onto common ground. In Haraway’s ideal reality, this cyborg language desegregates core and periphery so that everyone exists on one equal plane. The cyborg supposedly dissolves the societal barriers facing women and people of color, so that everyone can obtain equal opportunities through technology. However, Haraway’s vision of a future in which traditional institutions are challenged by a unified cyborg language is an idealized one; she can only speculate about the societal effects these emerging technologies will actually have.

Likewise, Jenna Wortham, a present-day media writer and woman of color, argues that the internet can equalize marginalized groups by allowing underrepresented voices to be heard in the cacophony of history. Wortham examines how archiving the internet, as done by Wikipedia and Rhizome, provides a voice for previously peripheral groups. In Wortham’s eyes, the idea of a single, unified history is deeply flawed; she subscribes instead to a “multidimensional ledger” view of history (Wortham). The term “multidimensional ledger” refers to the many differing perspectives that the internet allows to be seen and heard in an increasingly interconnected, postmodern society. She uses Wikipedia as an example of how previously underrepresented people gain a platform to voice their perspectives on the internet by freely editing articles and historical chronicles. The multiple identities of these groups, and the overlaps between them, create an “entangled histor[y]” told somewhere between fact and fiction, weaving a more nuanced view of history (Wortham). The past, for Wortham, is not easy to understand, which is why she believes a diverse chorus of voices must be heard: a melding of core and peripheral languages, much like Haraway’s, must take place. Wortham compares this entanglement of histories to quantum entanglement. Just as two particles may depend on one another’s state, she argues, so too do the “facts” of history depend on the people telling the stories and on the stories being told (Wortham). In other words, if a teller of historical “fact” is biased in any way, then the recorded reality of history becomes warped, entangled with that person’s biases. The tellers of the internet’s history are algorithms, and algorithms are inherently biased by the fallible humans who create them.

Although Haraway and Wortham may be right about the internet’s potential as an equalizer, they fail to consider the barriers to reaching that potential. First, they disregard the physical inaccessibility that disproportionately affects people in developing countries. Western pop culture often imagines the future as a distant prospect, all shiny flying cars out of The Jetsons, but “[t]he future is already here; it just isn’t evenly distributed” (Chen and Wellman). Deeply ingrained racism, sexism, and classism pervade global society. While great strides have been made in recent decades, the internet is widening the split between the interior (white, rich, male) and the exterior (people of color, the poor, women). The split between these social strata directly mirrors the definition of heteroglossia cited above. While the information age has made internet access commonplace in affluent Western countries, many developing countries cannot afford to enter the digital conversation. In many developing African countries, for example, internet access, while technically available, is often too costly; and even when a family can afford access, it is often of such low quality that it is functionally useless (Gillwald). The technological infrastructure in these countries is simply inadequate, and these peripheral populations are thus blocked from the internet and from the wider world it connects. Moreover, where inequalities already exist in developing nations, “these inequalities may increase as the Internet becomes more central for acquiring information about employment, health, education, and politics” (Chen and Wellman). The internet is exacerbating existing divisions between social strata. Because the internet is so deeply ingrained in Western society, and because people in less affluent countries cannot buy into the system, the digital divide keeps widening.

Even if marginalized groups, such as women, had equal physical access to the internet, societal stigmas would still keep them from using it on equal terms. According to Stein and Ionescu in their article “Social Inequality,” in developing African nations where some families do have internet access, the technology is “used almost entirely by the husband, and women, the majority of them illiterate housewives, lack opportunities for training in computer skills” (Stein and Ionescu). Not only is the internet dividing global communities, it is also separating local communities along gender lines; many of the social stratifications that Haraway’s shining future would render irrelevant continue to bar people from entering this “postmodern” world. How are we to reconcile Haraway’s dream of a post-gender world, in which the cyborg usurps existing societal constructs, with this reality? Women in rural areas, both in Africa and in America, are bound by “cultural restrictions on mobility, reduced income, and the frequent lack of relevance of computer technologies to their existence,” which “exclude these women … from the information sector” (Stein and Ionescu). It is not only a physical inability to access the internet that holds these women back, but also societal reservations about women entering technological fields. This discrimination shows that, more than three decades after the publication of “A Cyborg Manifesto,” we have still not entered the post-gender utopia Haraway imagined.

Societal stigmas prevent outsider groups from equally accessing the internet, but the development and execution of algorithms prevent them from equally interacting with it. Even the most basic internet infrastructure is biased against minorities, because the groups on which these technologies are built and tested are unrepresentative of the people who use them. As Safiya Noble, the author of “Algorithms of Oppression,” found, Google’s autocomplete suggestions were shockingly racist and sexist: the suggested completions for “why are black women so …?” included “angry,” “loud,” and “mean” (Noble). These plainly racist perceptions of black women are perpetuated by the racially skewed programming of the algorithms, and they expose the internet’s underlying alienation of black women. Biases born of long-term societal oppression and socialization manifest in ways ranging from the seemingly trivial to the dangerously racist. On the more trivial end of the spectrum (though being unable to use a technology in the twenty-first century is hardly trivial), consider the gap in how well facial recognition software performs on lighter versus darker skin. In Snapchat, for example, certain filters fail to register black users’ faces. This failure to recognize dark skin stems largely from building and testing these systems on predominantly white subjects, with too few dark-skinned subjects. Failures of facial recognition, however, can have far nastier societal implications. In one nefarious example, an AI judged an international beauty competition, and “out of 44 winners, nearly all were white, a handful were Asian, and only one had dark skin” (Levin). This asymmetrical assignment of beauty had several causes: algorithms trained to recognize “beauty” from biased sample pictures, a shortage of sample photos of dark-skinned people for the algorithms to learn from, and the biases of the humans who wrote the algorithms.
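To make that mechanism concrete, here is a minimal sketch, in Python, of how a classifier trained on data dominated by one group can end up far less accurate for an underrepresented group whose examples are distributed differently. Everything in it is an assumption made for illustration: the synthetic features, the group labels and sizes, and the simple logistic regression stand in for any real vendor’s face-detection pipeline.

```python
# Hypothetical illustration only: synthetic data, invented group sizes,
# and a simple classifier standing in for a real face-detection system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic 'face' (1) vs 'not a face' (0) examples for one group."""
    faces = rng.normal(loc=shift, scale=1.0, size=(n, 5))
    non_faces = rng.normal(loc=shift + 3.0, scale=1.0, size=(n, 5))
    return np.vstack([faces, non_faces]), np.array([1] * n + [0] * n)

# Group A dominates the training set; group B is barely present, and its
# examples sit in a different part of feature space.
XA, yA = make_group(1000, shift=0.0)
XB, yB = make_group(20, shift=5.0)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([XA, XB]), np.concatenate([yA, yB])
)

# Equal-sized held-out sets reveal the accuracy gap.
for name, shift in [("group A", 0.0), ("group B", 5.0)]:
    X_test, y_test = make_group(500, shift)
    accuracy = (model.predict(X_test) == y_test).mean()
    print(f"{name}: accuracy = {accuracy:.2f}")  # group B scores far lower
```

The toy example shows only that when one group supplies almost all of the training data, the decision boundary is fit to that group and accuracy for the underrepresented group collapses; the same logic underlies the filter and beauty-contest failures described above.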

The creators of algorithms, and thereby the algorithms they produce, are unrepresentative of their users because the creators are overwhelmingly white and male. According to a study by the National Urban League, the high concentration of white men in the tech industry and the scarcity of people of color, with black workers making up under five percent of the industry, mean that the technologies emerging from tech companies are inherently biased (Lockhart). If people of color are absent from the development of these emerging technologies, how can the internet and related technologies serve marginalized groups? The absence of minority groups from the development, sampling, and execution of algorithms is emblematic of the unequal access outsider groups face. Algorithms in particular leave little room for non-core peoples. Predictive algorithms do not allow for deviation from the past, they leave no room for change, and they reinforce the racist patterns of history. Many people do not realize how prevalent algorithms are in daily life or how much they influence the outcomes of major life choices. Algorithms are used, for example, to assign interest rates on home loans, and, unsurprisingly, neighborhoods where people of color are the majority receive, on average, higher interest rates (Noble). Rather than pretending the internet is a neutral entity, we must address the ugly truth: internet algorithms are pervasive in society and systematically biased against minority groups.
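How past data locks in past discrimination is also easy to see in miniature. The sketch below is a hypothetical illustration, not a description of any actual lending system: the invented historical interest rates are biased against one group, the model is trained only on features that look neutral (an income figure and a “neighborhood” variable that happens to correlate with group membership), and it reproduces the old gap for new applicants anyway.

```python
# Hypothetical illustration only: invented loan data and an invented rate gap.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 5000

group = rng.integers(0, 2, size=n)                 # hypothetical group label
neighborhood = group + rng.normal(0, 0.2, size=n)  # proxy correlated with group
income = rng.normal(60, 15, size=n)                # same distribution for both groups

# Historical rates: identical incomes, but group 1 was charged ~1.5 points more.
past_rate = 5.0 - 0.02 * income + 1.5 * group + rng.normal(0, 0.3, size=n)

# Train only on features that look "race-blind": income and neighborhood.
model = LinearRegression().fit(np.column_stack([income, neighborhood]), past_rate)

# Two applicants with the same income but different neighborhoods:
applicants = np.array([[60.0, 0.0], [60.0, 1.0]])
print(model.predict(applicants))  # the second predicted rate comes out higher
```

Nothing in the code mentions race, yet the correlated proxy is enough for the model to carry the historical disparity forward; this is the sense in which predictive algorithms do not allow for a deviation from the past.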

The problem of internet inequality and division is thus two-pronged: a physical lack of access and a societally cultivated culture of oppression. Haraway and Wortham see an emancipatory future in technology, but they need to go a step further. Technology alone is not enough; societal relationships and institutions must be reimagined. Instead of pretending that we live in a postmodern or post-gender world and ignoring what divides us, we should embrace the intersectionality of our identities to build better technologies for the future. Rather than letting past data be the sole engine of our algorithms, we should make room in these programs for change. A two-pronged problem requires a two-pronged solution. First, people in power need to invest in internet infrastructure, both in rural areas of wealthy nations and in developing countries, so that physical access becomes possible. Second, and more challengingly, privileged groups must raise awareness of the pervasive and largely invisible problem of unequal internet access across genders, races, and classes in order to lift up underprivileged groups. On an individual level, we must demand greater inclusion from corporations, in their hiring processes and in their technologies. On a societal level, cultures that have traditionally oppressed must foster more inclusive communities that encourage people of all creeds, shapes, and colors to participate in technological development. Finally, none of us should accept the internet and its algorithms at face value; instead, we must look past the code toward greater societal empathy and understanding.

Works Cited 

Angwin, Julia, et al. “Machine Bias.” ProPublica, 9 Mar. 2019, www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. 

Chen, Wenhong, and Barry Wellman. “Minding the Cyber-gap: the Internet and Social Inequality.” Blackwell Companions to Sociology: The Blackwell Companion to Social Inequalities, edited by Mary Romero and Eric Margolis, Blackwell Publishers, 1st edition, 2005. Credo Reference, http://proxy.library.nyu.edu/login?url=https://search.credoreference.com/content/entry/bkcsi/minding_the_cyber_gap_the_internet_and_social_inequality/0?institutionId=577. Accessed 7 Apr. 2019.

Gorski, Paul C. “Digital Equity.” Encyclopedia of Activism and Social Justice, edited by Gary L. Anderson, Sage Publications, 1st edition, 2007. Credo Reference, http://proxy.library.nyu.edu/login?url=https://search.credoreference.com/content/entry/sageact/digitalequity/0?institutionId=577. Accessed 18 Apr. 2019.

Haraway, Donna. “A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century.” Simians, Cyborgs and Women: The Reinvention of Nature. New York: Routledge, 1991.

Levin, Sam. “A Beauty Contest Was Judged by AI and the Robots Didn’t like Dark Skin.” The Guardian, Guardian News and Media, 8 Sept. 2016, www.theguardian.com/technology/2016/sep/08/artificial-intelligence-beautycontest-doesnt-like-black-people. 

Lockhart, P.R. “The Digital Revolution Is Leaving Black People Behind.” Vox, 4 May 2018, www.vox.com/technology/2018/5/4/17318522/state-of-black-america-2018-national-urban-league-silicon-valley-race.

Noble, Safiya U. “Algorithms of Oppression.” NYU Press, Feb. 2018, nyupress.org/9781479837243/algorithms-of-oppression/.

Stein, Richard A., and Ana-Cristina Ionescu. “Social Inequality.” The SAGE Encyclopedia of World Poverty, edited by Mehmet A. Odekon, Sage Publications, 2nd edition, 2015. Credo Reference, http://proxy.library.nyu.edu/login?url=https://search.credoreference.com/content/entry/sagewpova/social_inequality/0?institutionId=577. Accessed 7 Apr. 2019.

Taylor, Victor E., and Charles E. Winquist. Encyclopedia of Postmodernism. Routledge, 2002. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=70607&site=eds-live.

Wortham, Jenna. “How an Archive of the Internet Could Change History.” The New York Times, 21 June 2016.
