Dystopian Sci-Fi and the Fear of Today and Tomorrow’s Technology

Article In The Thread
Oct. 5, 2021

When Minority Report hit theaters in 2002, it was both the best and worst thing to happen to predictive policing technologies.

When algorithms intended to help police anticipate crime burst onto the scene more than a decade ago, people quickly began invoking the classic Tom Cruise sci-fi thriller to point out the dangers. It was an excellent example of what has become something of a cliche in the futures space: arguing that science fiction gives us a vocabulary to talk about policy around emerging technologies. Critics of predictive policing pointed to Minority Report as an example of how the technology could be flawed, how it could entrap the innocent.

But there was one problem: In Minority Report, the predictive policing system worked by magic, thanks to three people who lived in baths of water and somehow could see misty images of future crimes. In the real world, predictive policing was about data — often flawed data that reflected the biases of law enforcement's past and present. In a magisterial 2016 report, ProPublica detailed how risk assessment software, intended to help courts during criminal sentencing, was biased against Black people. Algorithms intended to help law enforcement agencies identify particular neighborhoods where crime may be committed end up sending police to the same neighborhoods where they have always hyper-focused their attention. But if you tried to use Minority Report to have a serious conversation about predictive policing, people might come away only thinking "that's really creepy," rather than thinking through how the technology works — the most important part.

It’s something that often happens when we try to use dystopias and fear to get people to engage in discourse about the future and technology. Chilling tales of tomorrow can scare people — but do they actually inspire useful action?

When I tell people that my job is the future — specifically, editing articles that look at the policy and social questions raised by emerging technologies and science — they often ask me whether I'm optimistic or pessimistic about where we're headed. My answer is always: Yes, I'm both. The optimist in me says that talking about the future is the best way to make sure we end up with the one we want. The pessimist in me says that climate change, the automation of jobs, the power of Big Tech, and endless surveillance creep are all unquestionably terrifying if left unchecked. It's the job of editors like me and people who write dystopian sci-fi to do that checking, to highlight the fears so that they might not come to pass. The pragmatist in me believes that when the future gets here, it's just the present — it doesn't feel strange and scary in that unsettling sci-fi way. I once edited a piece in which a futurist wrote that she had a professor who liked to say that once the future gets here, "It will feel like Tuesday." And once a technology or practice "feels like Tuesday," it's harder to combat, because it feels inevitable.

But while trying to educate with sci-fi, we also need to recognize the danger here. If we scare people about the future too much, it might frighten them into complacency or stunned indecision. In a recent preprint paper (meaning it has not yet been subject to peer review) from the medical journal the Lancet, 10,000 people between the ages of 16 and 25 from 10 countries were surveyed about climate change. Seventy-five percent of respondents agreed with the statement, “Future is frightening.” The authors of the paper write, “Over 50% felt sad, anxious, angry, powerless, helpless, and guilty. Over 45% said their feelings about climate change negatively affected their daily life and functioning, and many reported a high number of negative thoughts about climate change.” (emphasis mine). These are reasonable feelings, particularly, as the paper notes, in the face of governmental inaction. But they also demonstrate that scaring people — particularly those who lack power because of their age, economic status, country of origin, gender, etc. — comes with its own set of dangers. 

We want people to be able to take part in the conversation about the big questions we all face going forward: how to live in a climate-changed world, how to hold Silicon Valley accountable, how to embrace biotechnology’s potential to reduce human suffering without getting into ethically dubious territory, and how to constrain government surveillance. Dystopias offer us endless visions of futures that have failed to address these challenges in a just way that listens to everyday citizens of the world: The Water Knife on climate change, The Circle on Silicon Valley, Gattaca on genetic editing, 1984 on surveillance. These are all stories that, like Minority Report, give us heavy moralizing about what’s “creepy” and therefore scary. But as Evan Selinger has written for Future Tense, creepy is a word that lets us dance around what is actually objectionable about a technology (or future). Warnings about creepiness are not enough to help us avoid the futures we don’t want. Fear can, as the Lancet preprint study suggests, make us freeze into inaction. Maybe it can even inspire us to live more in the moment than ever: I’ve certainly found myself thinking, “Might as well enjoy unlimited air conditioning while I can.” 


To try to avoid that kind of thinking — that a terrible future is inevitable — I have put limits on the kinds of invocation of dystopias that I allow to appear in articles I edit about the future. I try to avoid Minority Report because, once again, it doesn’t offer us tools to talk about what the problem is with the predictive policing it envisions. I try to limit invocations of Orwell and Big Brother, because they don’t quite capture the very particular questions of surveillance and privacy that we face today. The words “Orwell” and “Big Brother” have become so overused as to become meaningless, as people lob them at their political opponents. They’re so freighted with fear that they end discussion rather than encourage it. 

This is not to say that sci-fi has no value here — it certainly does. But we need to be careful to embrace visions of the future that can both scare and inspire, in similar fashion to Kim Stanley Robinson's recent book The Ministry for the Future. It opens with a horrifying image of a climate change-induced deadly heat wave — but then begins to imagine possible ways that intergovernmental efforts can make an actual difference. We must ensure that these dystopian visions of the future don't scare us into inaction, but rather give us a chance to examine the effects of technology and spur change.

You May Also Like

The Rise of Dismal Science Fiction (New America Weekly, 2018): Fear-based politics surrounding economics have become the basis for many mainstream forms of sci-fi media—these fictional issues can inflate anxiety levels among the general population, but might also equip us to solve our real-world issues.

Prototyping a Better Tomorrow (Open Technology Institute, 2017): Americans’ consumption of dystopian fiction as a coping mechanism can act as a roadmap that may help us avoid a dark future. But we can’t just rely on dystopian stories — such as The Handmaid’s Tale — we must use science fiction to envision a positive future.

Afrofuturism's Reimagined Tomorrows (Future Tense, 2021): In a time of growing awareness and rejection of systemic racism, imagining a future where Black people are surviving, thriving, and leading technological and social change through works of science fiction is more urgent than ever. Black Lives Matter, and so do Black futures — they matter to all of our futures.

