
Dan Houser Novel Explores a Dark AI-Driven Future Beyond Gaming


Dan Houser is one of the masterminds who helped create the ground-breaking video game series Grand Theft Auto.

Now, having left Rockstar Games to start his own company, he has released his debut novel — about a very different kind of game.

A Better Paradise is a near-future dystopian tale of an AI computer game gone rogue.

It’s set in a polarised world where Mark Tyburn is trying to create a virtual refuge where people can take shelter and rediscover their equilibrium, free of an all-consuming social media hellscape.

Unfortunately, it all goes wrong when the network accidentally releases into society an algorithm that creates a mysterious and sentient AI bot called NigelDave – “a hyper-intelligence constructed by humans” (albeit with all of our flaws).

Readers see his internal struggle and thought process, as he attempts to come to terms with “infinite knowledge and zero wisdom”.

“What would the first words be of a very, very precocious child who will never forget anything that he’s thought (because computers don’t do that)? What is it going to feel like when he just starts speaking?” Houser says.

Interestingly, A Better Paradise has proved somewhat prophetic.

Originally published as a podcast, the book arrives at a time when AI’s growth continues to explode — the sector’s “big seven” companies are now worth more than China’s entire economy.

But Houser says he started writing the book “a good year” before OpenAI’s ChatGPT became publicly accessible in 2022, complete with a logo chillingly similar to his fictional invention.

Instead, the primary stimulus to his thinking was humankind’s technological dependence during Covid, at a scale he had underestimated.

In this novel — which can occasionally be monologue-heavy — Houser imagines a super-digitized, alienating world where people choose to escape deepening political difficulties into an ever more recursive loop of social media and generative AI.

Meet Mark, CEO of Tyburn Industria, whose ambitious dream is to construct the Ark – a place within the immersive gaming world where people can relearn what it means to be human. It creates a world and a mission custom-built to each player’s deepest wants and desires.

But on the testing grounds, the Ark is a Pandora’s box of addiction. Some players find delight, others dread. One of them even gets to talk to his dead sister.

The only way to “drift”, after all, is to go off-grid, hiding from a thousand algorithms by never stopping and putting your faith in the idea that paranoid, maddening thoughts do not belong in your head.

For the reader, NigelDave is what a ChatGPT nightmare might look like.

Recent company data cited by OpenAI boss Sam Altman show the tool is used by 800 million people every week, and Houser says some people crave the validating “human veneer” created by AI.

Microsoft’s head of AI, Mustafa Suleyman, has spoken out about the rise in AI psychosis – a non-clinical term for scenarios where people come to rely on chatbots like Claude, Grok and ChatGPT and start to believe that something imagined is real.

Last year national police chiefs spoke of the “quite terrifying” systematic misogynistic radicalisation of boys and young men. And in 2014, Facebook acknowledged that it had manipulated the news feeds of nearly 700,000 users without their knowledge or permission to control the emotional content to which they were exposed.

“As a parent, you do worry about anything that you’re introducing your kids to being something that’s either going to give them false information or is just sort of going to overwhelm with too much information,” Houser says.

But is it brave for a video game creator to be warning of these risks, bearing in mind the very long history of video games themselves being accused of breeding violence among the young?

Houser insists there’s a difference.

“It’s always been about the data on video game violence, and it just was not consistent with: ‘You play more video games, you get violent.’”

AI models and social media are another thing altogether – a “new paradigm of behaviour” which gaming never came close to matching, says Matt Navarra, author of the Geekout newsletter and a former social media director.

Describing concerns as a GTA-style moral panic “understates what is changing”, he says.

“We’re thinking about external systems that can potentially shape people’s beliefs, manipulate attention, personalise experiences, nudge behaviour or even affect identity and emotional states.”