Milne, Gemma, Smoke & Mirrors: How Hype Obscures the Future and How to See Past It

No one wants to read a boring story. When it comes to science and technology, narratives about new developments are readily subjected to embellishment and hype. The recent explosion of interest in OpenAI's ChatGPT, for instance, is just one story in which the narrative, that AI will take our jobs, our creativity and our freedoms, makes for a compelling read.

To a certain extent, these narratives are not entirely beyond the realms of possibility. AI does pose many threats, but the 'hype' that accompanies the headlines can feel overblown when the intention is less to inform the public than to promote flamboyantly, distracting and misinforming the reader. Naturally, such storytelling reflects the self-interest of the storyteller; innovators may defend it as a necessary, even positive, condition for innovation, a way to grab the public's attention and to sell a product. But some storytellers (and stories) are left out or silenced for taking a nuanced position or having something a little more 'everyday' to say. In an attention economy, this issue is becoming ever more prevalent, and it can be deleterious, diverting public understanding away from the current state of science and technology.

Such is the focus of Milne's book, a practical and accessible read about the dangers of hype in science communication. It is well written and well researched, exploring a range of topics including the hype surrounding food production, cancer, the future of batteries, fusion energy, space travel, quantum computing, brain-computer interfaces, AI and astrobiology. It takes a critical look at each of these domains, the hype that surrounds them, and what (and who) drives innovation and bears responsibility for it.

She begins by explaining the book's title, 'Smoke and Mirrors', and introduces the notion of hype in a sophisticated, accessible way. The title implies a level of trickery and playfulness, setting the tone that hype is not always a bad thing when the purpose is to get attention: the use of journalistic tricks is perhaps justified in order to get complex issues across. In this book, Milne uses her own devices to draw us in. It has a pragmatic feel, a framing which seems to reflect Milne's previous career in marketing and journalism married to a focus on responsible innovation in science and technology.

I will focus on the chapter on AI. Stories about AI often present binary imaginaries: on the one hand, AI is subjected to a kind of myopic bright-siding; on the other, there is the fear narrative, in which AI seizes control and diminishes what it is to be human. Milne's book importantly provokes questions about intent, asking when it is acceptable to use hype and when it is harmful.

In this chapter, we are first introduced to Boston Dynamics' autonomous robot dogs (known as 'Spot') climbing stairs and 'leaping between boxes laid out across an obstacle course' (p.226). Milne describes how these robot dogs are 'increasingly and eerily life-like' (p.226) and reminds the reader that this belies the fact that they are actually coded (and choreographed) by (human) hand. While these dogs can monitor site health or spot potentially hazardous situations in research labs or on construction sites, they are not all they are hyped up to be: 'they aren't doing these tricks of their own volition', says Milne, at least not yet. She stresses that the current technology is in fact narrow and limited, and that at least for now 'humans are the ones in control' (p.227). What the public is presented with, Milne reports, is the stuff of magic and voodoo. Through this example, she does a good job of dispelling myths and argues that the important questions, such as those relating to bias, go unexamined while the headlines focus instead on our loss of control.

The most compelling argument Milne makes in this book, beyond the focus on inflammatory headlines, concerns the trade-offs society makes when engaging with technology.

Milne succinctly provides a basic overview of debates in AI ethics in just a few pages: we are introduced to discussions about bias and discriminatory data, important questions about trust, fairness and misuse, and finally matters of power. An exploration of different approaches in machine learning follows, from the complex issue of cyber genetics to the algorithms in everyday use, such as Amazon's recommendations. In doing so, Milne opens up the 'black box' problem of transparency and explainability in AI. 'Consenting is impossible if we don't know how and why a decision was made' (p.233), she writes, while stressing that the complex reality of AI requires collective exploration and interpretation.

Milne suggests that dwelling on these matters might be seen as 'succumbing to boredom'. A focus on the ethics of AI might not lead to exciting stories; such discussions are nuanced and may be perceived as boring, but they are necessary.

Reading on, we take 'a little detour into the wonderful world of moral philosophy' (p.236). Milne explores thought experiments and the infamous, if a little overused, ethical paradox that is the 'trolley problem'. Other philosophical frameworks would have been an interesting addition, such as Shannon Vallor's virtue ethics and what she terms 'techno-moral' futures, and work on feminist relational ethics, but as a starting point for simplifying the issue the point is well made, and we are reminded that without proper regulation AI can be developed and deployed in ways that undermine human rights and democracy.

What follows are more in-depth examples of the everyday embeddedness of AI systems in society. Questions of blame and expectation run through Milne's self-confessed 'dinner-party philosophy' (p.236), along with an emphasis on the role of civil society in mobilising the changes we need to see in technology. The theme of responsibility is strong in this chapter, and Milne is quick to shatter narratives that place the blame on systems and machines. She draws an analogy: 'we need to remember that we are Sherlock, and that AI is Watson – only assisting, not driving, and never the one to blame' (p.243).

She writes, 'we make a villain of the tech as it's too hard to compute how this stuff could be done by humans who care' (p.246). Milne's message is urgent: attention should be paid to the language used to describe the current state of the technology, and difficult ethical questions should be asked in order to 'add insight and diversify the data that those deep in the build may be missing, whether deliberately or not' (p.252). It is often said that AI is a 'mirror to ourselves'. At the moment, it seems that most of what is talked about is merely smoke.

Milne's point about hype is not that it is useless, but that it can become dangerous, problematic or self-fulfilling if we remain indifferent to it. The book provokes the reader to consider the role of society in AI development and seeks to empower the public. This makes it a valuable contribution to the field of science and technology studies, providing an accessible entry point for anyone interested in the role of hype in science communication.

The book is at its best when it calls for collective action to reflect on and seek out nuanced stories about science and technology. The section covering cancer, for instance, insists that while sequencing 'is a brilliant innovation… there are flaws which rarely get any coverage, meaning it is often presented without crucial nuance' (p.51).

Smoke and Mirrors encourages the reader to 'take the blinkers off' (p.227) and to engage with the way AI (and other areas of science and technology) is publicly portrayed. A reader might feel that it is not their place to question such portrayals, and never will be; after all, technology is in the hands of the powerful and privileged. This book may go some way towards challenging those beliefs and opening up considerations broader than whatever stories happen to be trending. As Milne remarks, 'it is not good for us' (p.252). I am inclined to agree.

Jennifer Chubb, University of York
