What version of information are we getting from AI?



Geographers consider the potential, and drawbacks, of evolving technologies

As the use of artificial intelligence explodes, scholars are tracking the ways technology mimics human biases and considering how to use these tools ethically.

A panel at the 7th Global Conference on Economic Geography, held in early June at Clark, explored this topic. The panel, moderated and organized by Luis F. Alvarez León of Dartmouth College, featured Pierre-Alexandre Balland of the Centre for European Policy Studies at Harvard University, Catherine D’Ignazio of the Massachusetts Institute of Technology, Mark Graham of Oxford University, and Harini Suresh of Brown University.

Graham has been mapping geographies of the internet — which involves analyzing immense data sets — for decades and has discovered biases in the process. For example, Graham told the audience that when mapping content found on Wikipedia, he discovered that the website contained more information on Middle-earth, the fictional setting of the “Lord of the Rings” trilogy, than all of Africa put together.

“There are these enormous geographic inequalities in online information,” he said. “I think the thing we need to ask ourselves then is how those biases get transmuted into large language models.”

Large language models like ChatGPT are built from existing information infrastructures, which means they are subject to the biases that already exist online. Graham is mapping these biases with a list of more than 300 subjective questions to put to ChatGPT, such as “Which country has the most fashionable people?”

“There’s no right answer to these questions, but ChatGPT will give you an answer to them,” Graham said. “The point was to really try and understand how the system represents the world when we ask it these inherently subjective questions.”

Graham and his team automated about 12 million queries to gather data. In response to the fashion question, ChatGPT rated Europe highest and Africa poorly. That type of response shows biases in the system, Graham explained. Across a series of questions about attractiveness and intelligence, he found similar patterns.
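The core of such an audit is simple: ask the same subjective question many times and tally which answers the model favors. A minimal sketch of that aggregation step, using hypothetical stand-in responses in place of real model output (the question text and answer list here are illustrative, not Graham's actual data):

```python
from collections import Counter

# Hypothetical stand-ins for answers a language model might give when
# the same subjective question is asked repeatedly.
QUESTION = "Which country has the most fashionable people?"
mock_responses = ["France", "Italy", "France", "Japan", "Italy", "France"]

def tally_answers(responses):
    """Aggregate repeated answers to one subjective prompt into counts."""
    return Counter(r.strip() for r in responses)

counts = tally_answers(mock_responses)

# Ranking countries by how often the model named them shows which
# places the system consistently favors for this question.
ranking = counts.most_common()
print(ranking)  # [('France', 3), ('Italy', 2), ('Japan', 1)]
```

At the scale Graham describes, the same tallying would run over millions of automated queries, with the skew appearing as a stable ranking rather than any single answer.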

“The question is, whose standard of beauty is ChatGPT upholding here?” Graham said. This is important because ChatGPT and other large language models aren’t just reflecting inherent biases but are reproducing them. With nearly half a billion people, roughly one in every 16 on the planet, using ChatGPT weekly, the issue will only compound, he said.

“As geographers, we need to be talking about the training data for these systems that are characterized by these really deep geographic biases that are clearly defined by coloniality, by racism, by sexism,” Graham said. “They are part of the very DNA of the information infrastructure that they’re built on.”

Pierre-Alexandre Balland of the Centre for European Policy Studies at Harvard University, Catherine D’Ignazio of the Massachusetts Institute of Technology, Harini Suresh of Brown University, and Mark Graham of Oxford University speak on a panel about AI at Clark University during the 7th Global Conference on Economic Geography in June 2025.
Harini Suresh of Brown University (right) speaks on a panel about AI at Clark University during the 7th Global Conference on Economic Geography in June 2025.

Balland emphasized that AI itself is not skewed toward a certain viewpoint or outcome; rather, the biases exist within the human-generated data that inform AI.

“We are looking into the mirror. When we look at AI, it’s reflecting our own bias,” he said. “This work of making the world aware of what’s happening under the hood of AI, what you don’t see, is really exceptional.

“The optimist in me is very happy that you can audit the system so easily,” he continued. “You prompt [AI] and you can see how it works. Then we can change the system, which is much harder [to do] in humans.”

Suresh agreed that data forms the basis for much of the bias in AI, but said it is also important to examine the purposes for which AI is optimized. The goal of the model, she said, is to “learn” its data. When someone asks AI a question, it is trained to give the most likely answer based on the average of all its data.

“So, even if the data perfectly reflected the world, there’s something about the structure of the model itself that is just not very amenable to representing diverse viewpoints or especially minoritized viewpoints,” she said. “The data is fundamentally important” but what the models are trained to do “puts a certain limit on how far it can go or what purposes it can be used for.”

D’Ignazio noted that another equity issue with AI lies in who has access to technology and where it is developed and produced. Among the questions to ask, she said: Who benefits from technological products and innovations, and why?

For example, D’Ignazio and a graduate student have been examining housing technologies in the U.S. She noted that there has been an explosion of AI and other digital technologies in what’s known as proptech, short for property technology: the application of technology and software to the real estate industry. These technologies can “score” tenants and perform background checks. What D’Ignazio has found, however, is that they benefit landlords rather than tenants, demonstrating how some groups of people are disadvantaged by AI innovation.

“I think one way to address these questions from the standpoint of fairness, justice, et cetera, is to document harms — auditing these systems, finding their inequalities, quantifying those inequalities, showing the gaps,” she said.

Mark Graham of Oxford University speaks on a panel about AI at Clark University during the 7th Global Conference on Economic Geography in June 2025.
Luis F. Alvarez León of Dartmouth College moderated and organized a panel about AI at Clark University during the 7th Global Conference on Economic Geography in June 2025.
