Beyond the Stereotypes of AI
Microsoft’s Kate Crawford, in an interview with the Guardian, gave an insider’s look at how AI is being integrated into our lives, and at whether the sustainability and seamless automation we perceive in these systems are, in reality, the opposite.
“We aren’t used to thinking about these systems in terms of the environmental costs. But saying, ‘Hey, Alexa, order me some toilet rolls,’ invokes into being this chain of extraction, which goes all around the planet...
Also, systems might seem automated but when we pull away the curtain we see large amounts of low-paid labour, everything from crowd work categorising data to the never-ending toil of shuffling Amazon boxes.
AI is neither artificial nor intelligent. It is made from natural resources and it is people who are performing the tasks to make the systems appear autonomous.”
The issue, however, runs deeper than underpaid workers: the classification systems themselves are flawed. The intuitive fix, that gathering more data would eventually solve the problem, in fact entrenches the discrimination built into the system itself. Ms. Crawford states it best:
“...I’ve tried to look at these deeper logics of classification and you start to see forms of discrimination, not just when systems are applied, but in how they are built and trained to see the world.”
She went on to describe these forms of discrimination as categorizing boxes we teach our machines to sort people into, such as just two genders or five races, which then go on to infer, from the way you look, your age, and your gender, even your moral character. These assumptions of character based on appearance unfortunately follow a dark political history of classification that now seeps into AI.
Later in the interview Ms. Crawford cites ImageNet specifically, one of the largest datasets used to train algorithms. What she found in her study with Trevor Paglen was that “Pictures of people were being matched to words like kleptomaniac, alcoholic, ... slut, drug addict and far more I cannot say here.” Since then, those labels have been removed from ImageNet for obvious reasons, but Ms. Crawford warns, “There are huge training datasets held by tech companies that are completely secret,” a worrying theme in AI.