'Code Dependent': Madhumita Murgia reveals how AI is targeting the marginalised

Her new book on AI has been shortlisted for the Women's Prize for Non-Fiction

Decoding AI: Madhumita Murgia.

Over a decade ago, when Madhumita Murgia was starting out in journalism (she had studied biology and clinical immunology at Oxford and then worked on developing vaccines for HIV), she became curious about the world of data brokering—where shadowy companies “collect data about our online lives and turn them into saleable profiles of who we are today, and who we will one day become”. To make the story more tangible, she decided to track down the profile of someone she was intimately familiar with—herself. She contacted an ad-tech startup to decode the information collected about her from her own web browser. The report they sent her—an ‘anonymised version’ of herself—included a profile put together by Experian, a credit-rating agency that doubled as a data broker.


The report was shocking, not just because of the personal details it contained—from where she worked and lived and how she spent her money to the holidays she had taken in the past year—but also because it detailed her opinions, interests, and personality traits, from her TV-watching habits and food preferences to her level of ambition and political leanings.

This was really her baptism into the murky world of data colonialism, big tech, surveillance, and artificial intelligence. “My life—and yours—is being converted into such a data package that is then sold on,” she writes in her new book, Code Dependent: Living in the Shadow of AI. “Ultimately, we are the products.”

One of Murgia’s greatest strengths—she currently works as the AI editor of the Financial Times—is her skill for humanising her stories. So, just as she told the story of data brokering through her own life, she tells the story of AI through the lives of ordinary people at the ‘back end’ of the technology. These are usually individuals and communities that are already “othered, floating in society’s blurry edges, fighting to be seen and heard”—women, black and brown people, migrants and refugees, religious minorities, the poor and the disabled.

So her story careens from the lives of data labourers in Kenya (those who help train AI software by tagging and labelling big datasets); to the victims in the UK of nude deepfakes (hyper-real fake images and videos of a person created through AI) uploaded to pornographic websites; to minorities and Dalits in the slums of Hyderabad monitored via facial recognition technology; to teenagers in Amsterdam listed by the government as potential future criminals with the help of machine-learning algorithms.

When we think of AI, we think of tech geniuses like OpenAI’s Sam Altman and Meta's Mark Zuckerberg sitting in the air-conditioned confines of Silicon Valley, working towards a future where artificial intelligence percolates through everything—from health care to robotics to farming to retail. We hardly think of people like Hiba—a data labourer for Humans in the Loop, a Bulgarian startup whose workers are primarily refugees and migrants from the Middle East displaced by political conflict and war. For each image she tags, Hiba is paid the equivalent of 60 cents, and she makes a minimum of four euros per hour of work. She has also trained her family to tag data for AI companies in the west, and between them, they earn anywhere from $600 to $1,200 a month. Their expenses are roughly $1,600 a month, so they just about make ends meet by supplementing their earnings through helping run a nearby beauty salon.


“Giving people work is not charity,” Murgia quotes a Kenyan lawyer in the book. “It is not enough to simply pay people. You may be lifting them out of poverty, while still disenfranchising them and treating them as ‘pawns with no agency’.” The irony is that these people are working on something that might ultimately put them out of work, and they are not even aware of it. Hiba, for example, has no views on the impact of what she is helping to build. She just wants steady work.

Code Dependent is gripping because it shows us an aspect of AI that very few of us grapple with—far removed from flashy driverless cars and chatbot therapists. Murgia is deeply empathetic towards—and perhaps a tad angry on behalf of—those disempowered by this technology and at the mercy of those profiting from it. And that is why she reiterates the importance of accountability.

“One of the things that struck me [while researching this book] was that I had written a lot about how AI as a technology can be flawed,” she tells THE WEEK. “It is a statistical predictive system so it is never going to be 100 per cent correct. And it makes errors all the time. But what was surprising to me was that in the stories I tell, what was hurting people was not the flaws in the technology necessarily, but the human failures in how we implement it. Like trusting AI too much and therefore not building any backups for when it goes wrong. Or the lack of accountability in introducing a system like facial recognition or a criminal justice prediction algorithm.”

Murgia gives the example of a computer scientist she cites in the book, who took on some work with Uber Eats. What he learnt was that there was no human he could reach at the other end to whom he could show what the technology was getting wrong and how to correct it. That failure turns up again and again when we trust AI too much, says Murgia. We don’t build guardrails for when it makes errors.

And then there is of course the question that she must have fielded time and again since becoming an AI editor. Does she think AI is going to become sentient? Murgia smiles in response. No, she says. She does not think so. However, when people interact with ChatGPT, they tend to humanise or anthropomorphise it, she says. They tell it secrets they don’t even tell their friends or family. And so, currently, it is a lot more about us viewing it as sentient, and what that means for our society, than it actually becoming sentient. “If we start to treat this as human-adjacent in some way, how is this going to change our relationship with each other and how we function as a society?” she asks.

Code Dependent: Living in the Shadow of AI

By Madhumita Murgia

Published by Picador India

Price: Rs 699; Pages: 336
