AI: Hey Alexa! – liberalism = social media malaise = “i can’t abandon facebook because it connects me to… = no chance of real change = locked in to liberalism’s ‘self-destruct’ laziness = ‘power’ in the foucauldian sense = the few control the outcome for everyone else

wow! the brilliance of the paradoxical contradictions here is truly epic!

What should people know about how AI products are made?
We aren’t used to thinking about these systems in terms of the environmental costs. But saying, “Hey, Alexa, order me some toilet rolls,” invokes into being this chain of extraction, which goes all around the planet… We’ve got a long way to go before this is green technology. Also, systems might seem automated but when we pull away the curtain we see large amounts of low paid labour, everything from crowd work categorising data to the never-ending toil of shuffling Amazon boxes. AI is neither artificial nor intelligent. It is made from natural resources and it is people who are performing the tasks to make the systems appear autonomous.

Problems of bias have been well documented in AI technology. Can more data solve that?
Bias is too narrow a term for the sorts of problems we’re talking about. Time and again, we see these systems producing errors – women offered less credit by credit-worthiness algorithms, black faces mislabelled – and the response has been: “We just need more data.” But I’ve tried to look at these deeper logics of classification and you start to see forms of discrimination, not just when systems are applied, but in how they are built and trained to see the world. Training datasets used for machine learning software that casually categorise people into just one of two genders; that label people according to their skin colour into one of five racial categories; and which attempt, based on how people look, to assign moral or ethical character. The idea that you can make these determinations based on appearance has a dark past, and unfortunately the politics of classification has become baked into the substrates of AI.

as with ‘climate change’ – the false liberal scenario is that ‘technology’ will save us. but ‘technology’ is massively a cause of the demise of humans and the non-human natural world.

capitalism and ‘save the planet’ are fundamentally opposed. it’s not possible to have both. period.

therefore: the planet, and humans, are doomed. period.

so, facebook away, world!

while you still have time.

add more confessional crap that no one cares about except alexa to the pile of detritus like plastic at the bottom of the mariana trench and the top of everest! to, in fact, the sperm and ovum banks. to our very pores – we sweat plastic…

oh… and how the image/word worlds are absolutely complicit in producing our sweaty plastic labor…

You single out ImageNet, a large, publicly available training dataset for object recognition…
Consisting of around 14m images in more than 20,000 categories, ImageNet is one of the most significant training datasets in the history of machine learning. It is used to test the efficiency of object recognition algorithms. It was launched in 2009 by a set of Stanford researchers who scraped enormous amounts of images from the web and had crowd workers label them according to the nouns from WordNet, a lexical database that was created in the 1980s.
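The labelling pipeline described here – crowd workers matching scraped images to WordNet noun categories – can be sketched in miniature. This is an illustrative toy, not the actual ImageNet tooling; the synset entries and image filenames are hypothetical examples (though WordNet really does key noun categories to stable synset IDs of this form):

```python
# Toy sketch of an ImageNet-style labelling step (hypothetical data).
# WordNet organises nouns into "synsets" (synonym sets), each with a
# stable ID; ImageNet categories are keyed to such IDs.
wordnet_synsets = {
    "n02084071": "dog, domestic dog, Canis familiaris",
    "n02121620": "cat, true cat",
}

# Crowd workers are shown a scraped image and a candidate synset and
# confirm or reject the match; only confirmed pairs become labels.
raw_judgements = [
    ("img_0001.jpg", "n02084071", True),   # worker confirmed: dog
    ("img_0002.jpg", "n02121620", False),  # worker rejected the match
]

# Keep only confirmed image/category pairs as training labels.
dataset = {
    image: wordnet_synsets[synset_id]
    for image, synset_id, confirmed in raw_judgements
    if confirmed
}
print(dataset)
```

The point the critique below turns on is visible even at this scale: whatever categories the lexical database happens to contain become the lens through which every image – and every person in one – is classified.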

Beginning in 2017, I did a project with artist Trevor Paglen to look at how people were being labelled. We found horrifying classificatory terms that were misogynist, racist, ableist, and judgmental in the extreme. Pictures of people were being matched to words like kleptomaniac, alcoholic, bad person, closet queen, call girl, slut, drug addict and far more I cannot say here. ImageNet has now removed many of the obviously problematic people categories – certainly an improvement – but the problem persists because these training sets still circulate on torrent sites [where files are shared between peers].

And we could only study ImageNet because it is public. There are huge training datasets held by tech companies that are completely secret. They have pillaged images we have uploaded to photo-sharing services and social media platforms and turned them into private systems.

the political response should be: shut down ALL social media, now.

but that’s not happening, is it?

What do you mean when you say we need to focus less on the ethics of AI and more on power?
Ethics are necessary, but not sufficient. More helpful are questions such as, who benefits and who is harmed by this AI system? And does it put power in the hands of the already powerful? What we see time and again, from facial recognition to tracking and surveillance in workplaces, is these systems are empowering already powerful institutions – corporations, militaries and police.

so, even this critic is hopelessly, hopeful…

What’s needed to make things better?
Much stronger regulatory regimes and greater rigour and responsibility around how training datasets are constructed. We also need different voices in these debates – including people who are seeing and living with the downsides of these systems. And we need a renewed politics of refusal that challenges the narrative that just because a technology can be built it should be deployed.

And giving me as much optimism as the progress on regulation is the work of activists agitating for change.


that can’t happen, will not happen, because ‘facebook addicts’ will not become anti-liberal = anti-themselves.
