Feminist Technology: Data Bias, Artificial Intelligence, and How Computers Work Against Women
January 13, 2022 by Noor Amanullah
Every day we grow more reliant on artificial intelligence, algorithms, and computers to read, analyze, compile, and categorize for us. In theory, this is a good thing: we remove the faulty human who miscounts, who expresses racist or sexist attitudes in a selection process, or who misses a grammar mistake while editing. But what happens when the data that fuels an algorithm or a computer is itself biased? Artificial intelligence, the technology of the future, relies on data sets to become better and more efficient. But data, no matter how quantitative and accurate it appears, can hide its own biases. Technology built on that data inherits the bias, producing discriminatory technology.

A popular example of this phenomenon is the hiring process. Large employers often use computers to sift through piles of applications, employing artificial intelligence to screen resumes from potential candidates with data meant to identify the strongest, best-suited individuals. While using AI might mitigate the bias a human exhibits upon seeing an ethnic or feminine name, it can hold bias against the same individuals for other reasons. Societal and workplace discrimination have kept women and people of color out of specific industries and positions for centuries, so certain skills and jobs remain associated with gender or race. Thus, the keywords and profiles associated with a business manager position at a large corporate office might lead a computer away from certain candidates. The pattern the computer relies on is itself built on decades of racism and sexism.

Companies today compound this problem of biased data. Tech companies in particular are hesitant to disaggregate data on their employees by sex.
Google has hidden gendered data, refusing to hand over information on pay imbalance even after the US Department of Labor found “systemic compensation disparities against women pretty much across the entire workforce.” Denying complete data to the world allows technology companies like Google, themselves built on data, to hide current problems and further perpetuate data bias.

Data bias exists in other ways as well. Take the gender pay gap, for example. Data shows that in many countries and companies this gap is closing; Ireland is often cited as the “best country to be a working woman,” where the pay gap is the smallest. The data behind such claims, however, is inadequate. Data concerning women’s work is often not collected at all. Women work, but they are not always paid for their work. While gaps are accounted for in areas where women are paid for their hours, there is a deficit of data on women’s unpaid labor. Women do 75% of unpaid labor globally, outpacing men’s unpaid labor by one to six hours a day, beginning as early as the age of five. More often serving as caretakers of children and the elderly and taking charge of domestic and community work, women contribute significantly to economic growth and efficiency without being personally compensated. When the pay gap is calculated, then, the data is insufficient: it accounts only for the hours women work according to a payroll, not for how they meet local and familial needs.

Data bias is a safety issue as well. In the car industry, for example, a refusal to collect diverse, disaggregated data means that drivers and passengers rely on data collected from a reference based on 150-pound young white men. Women, people of color, pregnant people, and disabled people whose bodies differ from this reference cannot rely on seatbelts and airbags to protect them.
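Returning to the pay-gap point above: a toy calculation shows how the statistic shifts depending on whether unpaid hours are counted. Every number here is invented purely for illustration, not drawn from any real survey.

```python
# Toy pay-gap calculation with invented weekly figures.
# Shows how excluding unpaid labor flatters the headline statistic.

def hourly_rate(pay, paid_hours, unpaid_hours=0):
    """Effective hourly rate across all hours actually worked."""
    return pay / (paid_hours + unpaid_hours)

# Invented numbers: similar payroll, very different unpaid workloads.
her_pay, her_paid, her_unpaid = 950, 40, 25
his_pay, his_paid, his_unpaid = 1000, 40, 10

# Payroll-only view: she appears to earn 95% of his hourly rate.
payroll_gap = hourly_rate(her_pay, her_paid) / hourly_rate(his_pay, his_paid)

# Counting unpaid hours: her effective rate falls much further behind.
full_gap = (hourly_rate(her_pay, her_paid, her_unpaid)
            / hourly_rate(his_pay, his_paid, his_unpaid))

print(round(payroll_gap, 2), round(full_gap, 2))  # → 0.95 0.73
```

Which denominator you choose is a data-collection decision, and as long as unpaid hours simply are not recorded, the flattering version is the only one anyone can compute.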
This issue of the “reference man” spans numerous industries and issues, such as chemical safety standards that disproportionately harm women. The list of ways data bias harms individuals is long. Among the most concerning examples are law enforcement’s use of artificial intelligence to identify criminals, which risks charging and incarcerating the wrong people, and the use of inadequate data in public planning, which produces cities built for able-bodied men and makes daily tasks extremely difficult for women, disabled people, and caretakers, especially those at the intersection of these identities. Data bias is a pressing issue because it pervades every area of life, every part of the world, and the future. There is no doubt that artificial intelligence is the way of the future. Still, data bias must be overcome before we build a world even more inequitable and inaccessible than the one we already have.