
Book Review: More Than a Glitch

In Feminist Text Analysis this semester I had the opportunity to read “More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech” by Meredith Broussard. I thought I knew a fair amount about bias in the technology space, but my eyes were opened to problems I didn’t even know existed. Many people assume the fix for these biases is simply more or better technology, as if technology were the only answer. Broussard calls this mindset “technochauvinism.” It is akin to what Catherine D’Ignazio and Lauren Klein, the authors of Data Feminism, call “Big Dick Data”: the belief that more data is always better and that data can never be wrong1. Broussard argues that while technology, namely AI (artificial intelligence), can be helpful, it can also be detrimental to society. To promote equality and equity, the right answer may sometimes be not to use technology at all. As she asks, “Why use inferior technology to replace capable humans when humans are doing a good job?”

Societal issues cannot be solved by tech alone, and tech can even deepen them. These “glitches” in software, AI, and other systems look like simple fixes, but that is rarely the case. Broussard explains this as the difference between social and technological fairness. To put it simply, computers are just “machines that can do math”2. While they can compute at a high level to produce an answer, they do not have feelings or experiences, and therefore they cannot be the entire solution to these highly complex problems. We can align this idea with the concept of “resistant reading” we discussed this semester. What are the alternatives? What can we do to challenge the norm and produce better outcomes for our communities?

Humans write code, and code can contain faults; AI is therefore not a neutral technology. These faults often come at the expense of already marginalized groups. Examples Broussard mentions include, but are not limited to, predictive policing, AI facial recognition software, Google search results, testing technology for schools, lack of accessibility, the reinforcement of gender binaries, and even automated soap dispensers.
The intersection of race, gender, and technological advances is the central theme when critiquing these technologies. Systems that need, or claim to need, race or gender as a data point have traditionally stored it as a boolean or presented it as a select-one fixed list in the user interface. We know that gender, and even so-called biological sex, is socially constructed. Before this entered the cultural zeitgeist, people were not aware that you could change your gender, and many databases made these fields uneditable. These assumptions still persist today, in new code as well as in legacy systems. Programmers are taught to optimize code to save memory, and a boolean is cheaper in memory than a string of text. The concept of elegant code therefore ends up enforcing the gender binary and promoting cis-heteronormativity. Even the biggest names in tech, like Microsoft and Google, which promote themselves as LGBTQIA+ allies3, sometimes fail to recognize ze, hir, xie, etc. as acceptable words, or return no results for them in the dictionaries of their word processing software.
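To make the boolean-versus-string trade-off concrete, here is a minimal sketch of the two modeling choices. This is my own illustration, not code from the book, and the field names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

# Legacy modeling: gender stored as a boolean to save memory.
# It is cheap, "elegant" code -- and it hard-codes a binary.
@dataclass
class LegacyUser:
    name: str
    is_female: bool  # every person is forced into one of two values

# A more inclusive model: optional, editable, self-described text.
# The few extra bytes per record are negligible on modern hardware.
@dataclass
class InclusiveUser:
    name: str
    gender: Optional[str] = None    # self-described, e.g. "nonbinary"
    pronouns: Optional[str] = None  # e.g. "ze/hir"; can change over time
```

The point of the comparison is that the memory argument for the boolean barely holds anymore; the binary survives mostly as a coding habit.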

Race, medicine, and technology intersect in yet another example of where these glitches occur. As mentioned previously, much software only allows a user to check off one race from a list when identifying themselves. But multiracial people exist! What are they to do? How are they supposed to identify in these scenarios? People don’t fit into the neat little boxes decided on and created by software engineers. One place this happens is in electronic medical records (EMRs). As soon as race is entered into these charts, the type of care a patient receives is too often linked to the color of their skin. Historically, complaints of pain from Black women have often been ignored, whether from conscious or unconscious bias4. Social factors are also at play here, which is why so many more Black women die from birth-related events than other groups5; it’s not just the prejudice of doctors. Not all technology works equally, either. Pulse oximeters, very common devices that measure a person’s blood oxygen level, often give false readings for people with darker skin tones6. Why would the FDA or any governing body decide it’s okay to sell and distribute such technology? Most likely it wasn’t tested on these underserved populations, so no issue surfaced in the compliance process. You can’t produce results for something that was never tested. The same can be said of AI technology.
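The select-one race field is the same kind of modeling choice, and it can be sketched just as briefly. Again, this is my own hypothetical illustration of the data structure, not code from any real EMR:

```python
from enum import Enum

class Race(Enum):
    AMERICAN_INDIAN = "American Indian or Alaska Native"
    ASIAN = "Asian"
    BLACK = "Black or African American"
    PACIFIC_ISLANDER = "Native Hawaiian or Pacific Islander"
    WHITE = "White"

# Select-one, as in many legacy forms: one value per patient,
# so a multiracial patient cannot be represented accurately.
race: Race = Race.BLACK

# Multi-select plus an optional self-description: a small schema
# change that lets people identify the way they actually do.
races: set[Race] = {Race.BLACK, Race.WHITE}
self_description: str = "Black and white, multiracial"
```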

At its core, AI is a way of doing high-level statistics. Data scientists train algorithmic models on datasets; from those datasets, the model learns to make probabilistic predictions about new data it is fed. What happens when the training data is missing important information? The model will be flawed and can hurt the people affected by it. As an example, Broussard ran her own breast cancer scans through an AI to see whether she could detect the cancer herself, without a doctor. While she eventually could, it took immense trial and error, outside help, and hundreds of hours to get the right answer. Her doctor, on the other hand, could tell her in minutes from looking at a simple scan. Sometimes it just doesn’t make sense to use these predictive technologies to replace expert humans.
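Here is a minimal sketch of that failure mode with synthetic data: a classifier trained only on one group does fine on that group and little better than chance on a group missing from its training set. This is my own toy illustration, not Broussard’s experiment:

```python
# Train on group A only, then evaluate on an unseen group B.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Synthetic subgroup whose feature/label relationship depends on shift."""
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

X_a, y_a = make_group(1000, shift=0.0)  # group present in the training data
X_b, y_b = make_group(1000, shift=2.0)  # group absent from the training data

model = LogisticRegression().fit(X_a, y_a)  # trained on group A alone

print("accuracy on group A:", model.score(X_a, y_a))  # high
print("accuracy on group B:", model.score(X_b, y_b))  # near chance
```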

Finally, we need to be more careful about how AI is created, and its makers need to be transparent about how it works; AI is often described as a black box. Broussard therefore suggests further governmental action as well as action by individuals. Even as a single person, it is possible to be critical of these systems. Broussard calls this “Bullshit Detection”7. We can use the following three questions to evaluate AI, or whatever software is being advertised to us:

Who is telling me this?

How do they know it is possible to achieve results?

What are they trying to sell me?

Additionally, all tech companies need to be held accountable for their actions, which is where algorithmic auditing comes in handy. Similar to accounting audits, there are organizations dedicated to understanding algorithms and giving companies feedback so they can manage their risk. Major players in the field include, but are not limited to, Cathy O’Neil and Julia Angwin.

O’Neil is the author of Weapons of Math Destruction and the founder of ORCAA8, a more traditional auditing and advisory firm focused on understanding big tech’s algorithms in order to help companies mitigate risk. Angwin founded The Markup9, a news organization dedicated to watching and investigating big tech. What I found most interesting about The Markup is that it provides readers with documentation on how to replicate its studies. This is exactly what is meant by increasing transparency in the tech space, especially around algorithmic issues.
Ultimately, I agree with Broussard that we should challenge new technologies to make sure they are suited for the common good. She sums this up nicely: “And if we must use inferior technology, let’s make sure to also have a parallel track of expert humans that is accessible to everyone regardless of economic means”10.


Footnotes

1 D’Ignazio, Catherine, and Lauren F. Klein. “6. The Numbers Don’t Speak for Themselves.” Data Feminism, MIT Press, 2020. https://data-feminism.mitpress.mit.edu/pub/czq9dfs5

2 Broussard, Meredith. More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech. MIT Press, 2023.

3 https://unlocked.microsoft.com/pride/, https://pride.google/

4 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4843483/

5 https://www.cdc.gov/healthequity/features/maternal-mortality/index.html

6 https://hms.harvard.edu/news/skin-tone-pulse-oximetry

7 Broussard, More Than a Glitch.

8 https://orcaarisk.com/

9 https://themarkup.org/

10 Broussard, More Than a Glitch.