For decades, video games have relied on scripted, stilted interactions with non-player characters to guide players on their journeys. But as artificial intelligence technology improves, game studios are experimenting with generative AI to help build environments, assist designers in crafting NPC dialogue, and give video games the improvisational spontaneity once reserved for tabletop role-playing games.
In the multiplayer game “Retail Mage,” players help run a magical furniture store and assist customers in the hopes of earning a five-star review. As a salesperson – and wizard – they can pick up and examine items or tell the system what they want to do with a product, like disassembling a chair for parts or ripping a page out of a book to write a message to a buyer.
A player’s interactions with the shop and the NPCs around them – from game mechanics to content creation and dialogue – are driven by AI rather than a predefined script, opening up more options for chatting with characters and using items in the store.
“We believe generative AI can enable a new kind of gameplay where the world is more responsive and better able to respond to players’ creativity and the things they think up and the stories they want to tell in a fantasy environment that we create for them,” said Michael Yichao, co-founder of Jam & Tea Studios, which developed “Retail Mage.”
The typical NPC experience often leaves a lot to be desired. Canned interactions with someone tasked with passing on a quest usually involve a handful of chat options that lead to the same result: players get the information they need and can move on. Game developers and AI companies say they want to create a richer experience by using generative AI technology that allows for more nuanced relationships with the people and worlds the designers create.
Generative AI could also give players more opportunities to deviate from the script and create their own stories if designers can create environments that feel more alive and can respond to players’ choices in real time.
Technology companies continue to develop AI for gaming, even as developers debate how and whether they will use AI in their products. Nvidia has developed its ACE technologies to bring so-called “digital humans” to life with generative AI. Inworld AI provides developers with a platform for generative NPC behavior and dialogue. Gaming company Ubisoft announced last year that it was using Ghostwriter, an internal AI tool, to help write some NPC dialogue without replacing the video game writer.
A January report from the Game Developers Conference found that nearly half of developers surveyed said their workplace uses generative AI tools, with 31% saying they personally use them. Developers at indie studios were the most likely to use generative AI, with 37% saying they use the technology.
Still, about four in five developers said they were concerned about the ethical use of AI. Carl Kwoh, CEO of Jam & Tea, said AI should be used responsibly alongside developers to enhance stories – not replace them.
“That’s always been the goal: How can we use this tool to create an experience that connects players more?” said Kwoh, who is also one of the company’s founders. “They can tell stories that they couldn’t tell before.”
Using AI to give NPCs endless things to say is “definitely an advantage,” Yichao said, but “content without meaning is just endless noise.” That’s why Jam & Tea uses AI – via Google’s Gemma 2 model and its own servers hosted on Amazon – to give NPCs the ability to do more than just react, he said. They can search for objects as they shop or react to other NPCs to add “more life and responsiveness than a typical scripted encounter.”
“I watched players turn our shopping experience into a kind of dating simulation, flirting with customers and having the NPCs give very realistic responses,” he said. “It was really fun to see how the game dynamically responded to what players brought to it.”
Ike Nnole demonstrated a conversation with an NPC in “Mecha BREAK,” a game in which players battle war machines, and said Nvidia has made its AI “humans” more responsive than before by using small language models. Using Nvidia’s AI, players can interact with the mechanic, Martel, by asking her to do things like customize the color of a mech.
“Normally, a player would have to go through menus to do this,” said Nnole, senior product marketing manager at Nvidia. “Now it could be a much more interactive and faster experience.”
Artificial Agency, a Canadian AI company, has developed an engine that allows developers to integrate AI into every part of their game – not just NPCs, but also companions and “overseer agents” that can guide a player toward content they’re missing. The AI can also create tutorials to teach players skills they’re missing so they can get more out of the game, according to the company.
“The way we like to put it is that we’re giving every person who plays the game their own game designer,” said Alex Kearney, co-founder of Artificial Agency. The company’s AI engine can be integrated at any stage of the game development cycle, she said.
Brian Tanner, CEO of Artificial Agency, said scripting every possible outcome of a game is tedious and difficult. The company’s system lets designers act more like directors, telling characters about their motivations and backgrounds, he said.
“These characters can improvise on the fly depending on what’s happening in the game,” Tanner said.
It’s easy to hit the limits of today’s games, Tanner said, where NPCs repeat the same phrase over and over regardless of how players interact with them. But as AI advances, that will change, he added.
“It will really feel like the world is alive and like everything is reacting to what is happening,” he said. “That will add tremendous realism.”