Library Lecture Explores Artificial Intelligence in DEI

Artificial Intelligence (AI) scholars try to soothe skeptics by saying that AI is “just math,” not a heartless automaton born of science fiction. We need not be afraid of math, the argument goes; we just need to hire diverse data scientists to develop better, less biased models.

For Matthew Salzano, the solution isn’t so simple.

Salzano, a researcher who explores the intersections of social identities and digital technology, is an IDEA Fellow in Ethical AI, Information Systems, and Data Science and Literacy applied to Complex Structures and Networks. He holds a joint appointment with the School of Communication and Journalism/Alda Center for Communicating Science and the Program in Writing and Rhetoric. His recent Stony Brook University Libraries talk, “AI in DEI: Thinking Beyond Bias,” explored the concepts of technochauvinism and technoliberalism, offered findings from his research on AI and communication, and shared practices for reckoning with diversity, equity, inclusion and accessibility (DEIA) and AI.

‘Technochauvinism,’ a term coined by NYU data journalism professor Meredith Broussard, describes a way of thinking that treats digital technologies as the solution to a wide range of social problems. These “solutions” rest on blind optimism about the transformational power of digital technologies and a lack of concern for their impact. They also reflect the white male bias pervasive in technology more broadly.

“Technochauvinism is clearly embedded in these systems,” said Salzano. “First, there’s the assumption that the technological fix is always the best thing and that ‘we’ll be OK if we just fix the system.’ The second thing is that it assumes the people who should ‘fix’ societal issues like racism are technologists without any broader, collective reckoning with the failings of systemic inequality. Technochauvinism like this is everywhere.”

Salzano cited an example from a May 2023 report from the US Office of Educational Technology, which said that many workers may eventually use AI assistants to make their jobs easier, and that teachers are among the most deserving of such relief.

“I completely support making teachers’ jobs easier, but we already know another way we can help with this: hiring more teachers and teacher’s aides,” Salzano said. “AI assistants aren’t the only option. But that’s exactly the sort of technochauvinist response that assumes the best way to solve the problem is with computation. You also see this in the mission statements of big tech firms that see themselves as sort of the ‘saviors of humankind.’”

So how did technochauvinism become so powerful and pervasive?

“It’s a feature of a bigger political, economic governing rationality that I and others have theorized as ‘technoliberalism,’” he said, adding that “technoliberalism takes the worst of neoliberalism and makes it even worse by computationally intensifying it.”

“Now it’s not just the market that is used as the rubric for all societal and political needs,” he said, “it’s the digital market.”

Salzano said technoliberalism is made possible by a shift in how capitalism works: a new economic system in which human experience is the raw material.

“Social media companies collect vast amounts of data on our behavior, preferences and thoughts,” he said. “That’s why they’re all free, because they’re getting all sorts of information from us.”

Salzano said these companies then sell that data to what Katherine Johnston, assistant professor in the Department of English and the Program in Writing and Rhetoric, calls the ‘profile industry’ in her book Profiles and Plotlines: Data Surveillance in Twenty-first Century Literature.

Consumer data is sold so marketers can predict future actions and influence behavior with the goal of developing profitable consumers. But it’s deeply intertwined with racism and underlying disparities and injustices that “evaluate people against a disproportionately white, male, able-bodied, middle class or wealthy US citizen norm that is depicted as universal and unbiased,” said Salzano, quoting from a Data for Black Lives report.

That report said what’s at stake is not simply consumer privacy or a question of who sees which digital ad for a pair of shoes, but core questions of self-determination. “Technoliberalism describes the governing rationality that justifies this shift in capitalism, which in turn holds up systems like hetero-patriarchy and white supremacy,” said Salzano. “It directs our attention towards specific computational solutions for specific market-determined problems, while presenting itself as universal, as if it were just common sense.”

The AI Empire Tree, a graphic created by Syracuse University researchers, illustrates the interlocking systems of oppression in generative AI’s global order.

“This ‘global order is rooted in hetero-patriarchy, racial capitalism, white supremacy and coloniality’ that extends from pre-existing systems of power, but specifically makes AI technology possible and dominant through ‘mechanisms of extractivism, automation, essentialism, surveillance and containment,’” Salzano said, quoting the research.

Salzano cited the AI Empire Tree to argue that DEIA practice must go ‘beyond’ bias to address the ‘roots’ of inequality.

“When we think about AI bias, it’s like looking only at the fruit that grows off these branches, like technological apartheid between big tech and the people subject to the tools big tech deploys,” he said. “The actual tree gets disguised. You see the ‘leaves,’ but not the flawed processes that lead to the branches.”

Salzano encouraged attendees not to ignore or minimize AI crises that arise, but rather to embrace them and participate in responses to them as much as possible.

“That could be just sending a screenshot onto social media or facilitating more serious argumentation regarding AI initiatives,” he said. “These small local modes of resistance may not feel satisfying while standing under the looming tree of AI Empire. But perhaps it is one practice that at least gets up the trunk a little, or maybe even plants a new seed that will grow and choke out the poison.”

— Robert Emproto