
Nassau Standard

Sunday, December 22, 2024

Examining How Gender Bias Is Built Into AI



Over the past two decades, devices powered by artificial intelligence (AI) have steadily worked their way into everyday life.

It began with computers and smartphones, then digital voice assistants like Amazon’s Alexa and Apple’s Siri, and now applications like ChatGPT that generate content on topics as broad as the human imagination.

With AI so ubiquitous in our lives, Subadra Panchanadeswaran, PhD, professor in the Adelphi University School of Social Work, and her colleague Ardra Manasi of the Center for Women’s Global Leadership at Rutgers University, along with Adelphi social work doctoral student Seung Ju Lee and Rutgers University graduate student Emily Sours, were curious about the impact of AI’s pervasiveness on society and how gender figures into it. The four co-authors published their commentary on the topic, “Mirroring the bias: gender and artificial intelligence,” in the journal Gender, Technology and Development.

“AI devices are all around us—from our homes and workplaces to our pockets and purses as we travel from place to place,” Dr. Panchanadeswaran said. “The four of us who worked on this project are passionate about working on and researching issues relating to gender equity, which ignited our interest in learning more about the AI landscape from a feminist perspective.”

In developing this commentary, the authors delved into the interplay of gender and the AI behind virtual assistants and robots, with the goal of inspiring conversations about how societal biases are baked into the AI applications that continue to be developed and incorporated into our lives.

Bias In, Bias Out

AI operates through algorithms created by humans; although they are written in programming code, they can reflect the beliefs or biases of their programmers. The authors state that “largely, decision-making algorithms in AI are influenced by the kind of data that gets inputted.” Therefore, data bias can occur due to “subjective choices made when selecting, collecting and preparing data.”

According to Manasi and Dr. Panchanadeswaran, examples of this data bias include using a data set that lacks diversity and information on certain demographic categories like women of color, which might skew results. They also say that research has repeatedly shown that AI models are often trained on male-centric data, which in turn yields results that misidentify women and people of color.
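To make the “bias in, bias out” dynamic concrete, the following is a minimal Python sketch of the kind of audit this critique implies: it measures how well each demographic group is represented in an evaluation set and compares per-group accuracy. The records and group labels below are invented for illustration and are not drawn from the study.

    from collections import Counter

    # Hypothetical evaluation records: (demographic group, was the model correct?).
    # In a real audit these would come from a held-out test set.
    evaluations = [
        ("men", True), ("men", True), ("men", True), ("men", True), ("men", False),
        ("women", True), ("women", False), ("women", False),
        ("women_of_color", False), ("women_of_color", False),
    ]

    # 1. Representation audit: how much of the data covers each group?
    counts = Counter(group for group, _ in evaluations)
    total = sum(counts.values())
    for group, n in counts.items():
        print(f"{group}: {n}/{total} samples ({n / total:.0%})")

    # 2. Per-group accuracy: a model trained mostly on one group often
    #    performs noticeably worse on the groups it rarely saw.
    for group in counts:
        results = [correct for g, correct in evaluations if g == group]
        print(f"{group}: accuracy {sum(results) / len(results):.0%}")

On a skewed sample like this one, the under-represented groups show sharply lower accuracy, which is precisely the pattern of misidentification the authors describe.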

The roots of biases found in AI programming are commanding attention. The 2020 Global Dialogue on Gender Equality and Artificial Intelligence organized by the United Nations Educational, Scientific and Cultural Organization (UNESCO) highlighted the need to address the issue.

“We need more normative frameworks that take on the ethical consequences of using AI,” Manasi and Dr. Panchanadeswaran agreed. “There is a huge policy gap in developing instruments or principles that successfully address gender equality. Different actors need to come together and explore joint solutions—corporate entities, tech companies, international organizations like the U.N., and civil society organizations will need to address this issue head on, or the bias will only grow.”

Hey, Siri!

Since the launch of Apple’s Siri in 2011, followed by Amazon’s Alexa and Microsoft’s Cortana in 2014, billions of digital voice assistants have become fixtures in homes worldwide. Through voice commands, users can ask these devices to complete tasks from playing music and checking the weather to ordering household supplies.

The authors note that most virtual assistants are given female names, feminine voices and submissive personas.

“The use of female names is another way to feminize and domesticate AI,” Manasi and Dr. Panchanadeswaran said. “This form of gendering comes with certain expectations and stereotypes, and has the potential to perpetuate existing gender inequalities and biases in society.”

They added, “The virtual assistant Siri’s name comes from a Nordic name that translates to ‘beautiful woman who leads you to victory.’ This is a clear form of objectification that is being normalized.”

The authors argue that these devices perpetuate gendered divisions of labor by associating women with “affective labor”—work that involves producing, managing or modifying emotions in people—and with tasks like taking notes and managing calendars. They also note the growing “robotization” of fields traditionally seen as women’s work, including hospitality and tourism, retail, healthcare and education.

Manasi and Dr. Panchanadeswaran said that advanced robotics is projected to add trillions of dollars to global economic output by 2030. On this topic, they noted, “While acknowledging its economic contribution, it is important to consider the ethical concerns that arise from its use.”

Looking Ahead

Despite concerns about built-in gender, race and ethnicity biases in AI, there is growing awareness not only of how to avoid these pitfalls but of how AI can be used to break gender norms.

Dr. Panchanadeswaran offers some examples of how AI is being used to address existing biases and gender inequalities. “AI-powered gender decoders can be used in the hiring process to eliminate gender bias in job descriptions,” she said. “Apple and Google have also developed alternatives to female voices in their applications, with the aim of diversification and neutrality.”
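As an illustration of how such a gender decoder might work, here is a simplified Python sketch in the spirit of tools built on the gender-coded word lists of Gaucher, Friesen and Kay (2011). The word lists below are tiny illustrative excerpts rather than the full published lists, and the majority-count scoring rule is an assumption.

    import re

    # Small illustrative excerpts of masculine- and feminine-coded word lists;
    # a production decoder would use the complete research-based lists.
    MASCULINE_CODED = {"aggressive", "ambitious", "competitive", "dominant", "independent"}
    FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal", "loyal"}

    def decode(job_description: str) -> str:
        """Flag gender-coded words so a posting can be reworded neutrally."""
        words = set(re.findall(r"[a-z]+", job_description.lower()))
        masc = sorted(words & MASCULINE_CODED)
        fem = sorted(words & FEMININE_CODED)
        if len(masc) > len(fem):
            verdict = "masculine-coded"
        elif len(fem) > len(masc):
            verdict = "feminine-coded"
        else:
            verdict = "neutral"
        return f"{verdict} (masculine: {masc}, feminine: {fem})"

    print(decode("We seek an ambitious, competitive candidate who is independent."))

A flagged posting can then be reworded with neutral or balanced language before it is published.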

She cites the importance of identifying potential biases during the development of algorithms, in the data sets used to train AI models, and in the decisions those models generate.
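One simple way to act on this advice during development is to check a model’s decisions for demographic parity, that is, whether different groups receive favorable outcomes at similar rates. The sketch below uses invented decision records; in practice they would come from the model under audit.

    # Hypothetical decisions from an AI screening model: (group, favorable outcome?).
    decisions = [
        ("men", 1), ("men", 1), ("men", 0), ("men", 1),
        ("women", 1), ("women", 0), ("women", 0), ("women", 0),
    ]

    # Favorable-outcome rate per group.
    rates = {}
    for group in {g for g, _ in decisions}:
        outcomes = [d for g, d in decisions if g == group]
        rates[group] = sum(outcomes) / len(outcomes)

    # A large gap between groups is a signal to revisit both the training
    # data and the algorithm before the model is deployed.
    gap = max(rates.values()) - min(rates.values())
    print(rates, f"demographic parity gap: {gap:.0%}")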

Bringing more women and people from diverse backgrounds into the field of AI would also likely help reduce bias.

“UNESCO data shows that only 12 percent of AI researchers are women, and they represent only 6 percent of software developers. To combat bias in AI, there must be a conscious, deliberate effort to ensure not only that more women enter the field, but that data represents the diversity of our population,” Manasi and Dr. Panchanadeswaran concurred.

