ESR 2 Blog May 2021: Piu-Wai Chan

May is certainly a good month for good books. A lot of exciting reads came out, and I found myself particularly enjoying the one written by the social scientist and AI scholar Kate Crawford. Most readers probably encountered her through ‘The Anatomy of an AI System’ (co-produced with Vladan Joler), a mapping-driven case study of the Amazon Echo which has been extensively exhibited in galleries and museums internationally.[1] However, it was never intended to be an ‘art project’, but a transdisciplinary investigation, consolidating ‘a big picture’ of the planetary network that is embedded within an AI system. This transdisciplinary approach is exactly the backbone of Crawford’s new book, which ‘…embraces different stories, locations and knowledge bases to better understand the role of AI in the world’, as a means to counter ‘colonial mapping logics.’[2]

I first came across Crawford’s work through a now-classic article, ‘Artificial Intelligence’s White Guy Problem’, featured in the New York Times in 2016.[3] It consolidated some unfortunately classic examples of what is now known as ‘algorithmic bias.’ (I would also recommend checking out media theorist Florian Cramer’s ‘Crapularity Hermeneutics: Interpretation as the Blind Spot of Analytics, Artificial Intelligence and Other Algorithmic Producers of the Postapocalyptic Present,’ in which Cramer draws on Crawford’s research to demonstrate how data is always qualitative in nature.)

However, in Atlas of AI, Crawford argues that we must look deeper than just bias within a dataset. (I would also like to add that bias is a quantifiable, statistical concept which can be ‘absorbed’ through various mathematical methods; see the sketch below.) Crawford uses the term ‘Statistical Ouroboros’ (a phrase I am really fond of) to describe the inherent recursiveness within the ecology of AI systems and data mining. Hence, one must trace the trail of Epistemic Machinery (a concept from Karin Knorr Cetina’s Epistemic Cultures) and its operation across modern history, understanding how ‘pattern discrimination’ has been shaping society, which in turn shapes data, which again shapes us via an increasingly ‘artificial’ environment. (Such horror is most aptly demonstrated by Jeff Bezos’ notion of ‘artificial artificial intelligence’, a phrase he used to describe Amazon Mechanical Turk.)
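To make that parenthetical point concrete, here is a minimal sketch (plain Python with NumPy, using made-up toy labels) of how a purely statistical notion of bias, in this case class imbalance, can be measured and ‘absorbed’ through inverse-frequency reweighting. This is exactly the kind of fix that stays inside the dataset; Crawford’s argument is that the deeper problems sit outside it.

```python
import numpy as np

# Hypothetical toy dataset: labels drawn from an imbalanced distribution
# (90% class 0, 10% class 1), standing in for a "biased" training set.
rng = np.random.default_rng(0)
labels = rng.choice([0, 1], size=1000, p=[0.9, 0.1])

# The statistical bias here is directly measurable: the empirical class frequencies.
freq = np.bincount(labels) / labels.size
print("class frequencies:", freq)  # roughly [0.9, 0.1]

# One common way to 'absorb' it: inverse-frequency sample weights, so each
# class contributes equal mass to a loss function during training.
weights = 1.0 / freq[labels]
weights /= weights.sum()
print("re-weighted class mass:",
      [weights[labels == c].sum() for c in (0, 1)])  # roughly [0.5, 0.5]
```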

In short, AI simplifies everything for the sake of algorithmic ‘accuracy.’ As Bernard Stiegler puts it, we are heading toward an ‘epochless’ era — ‘Digital reticulation penetrates, invades, parasitizes and ultimately destroys social relations at lightning speed, and, in so doing, neutralizes and annihilates them from within, by outstripping, overtaking and engulfing them.’[4]

Computer Fun Fact 101: Did you know that machine learning uses ‘epoch’ as the unit for each full pass an algorithm makes over its training dataset ;)?
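Since I brought the term up, here is a minimal sketch (plain Python with NumPy, on made-up toy data) of what an epoch looks like in practice: the outer loop below runs once per full pass over the training set, with mini-batch gradient steps inside.

```python
import numpy as np

# Toy linear-regression data (hypothetical), just to show the role of an epoch.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)

w = np.zeros(3)
lr, batch_size, n_epochs = 0.1, 20, 5

for epoch in range(n_epochs):            # one epoch = one full pass over X
    order = rng.permutation(len(X))      # reshuffle the dataset each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
        w -= lr * grad                   # mini-batch gradient step
    loss = np.mean((X @ w - y) ** 2)
    print(f"epoch {epoch + 1}: mse = {loss:.4f}")
```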

[1] See https://anatomyof.ai

[2] Kate Crawford, Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, Yale University Press, 2021

[3] See https://www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html

[4] Bernard Stiegler, The Age of Disruption: Technology and Madness in Computational Capitalism, Polity Press, 2019
