Non-playable characters, or NPCs, are characters in games that players cannot control.
Instead, they are controlled by the computer.
NPCs can be support characters, enemies, or neutral parties, and their presence can shape gameplay. Traditionally, though, making NPCs work meant the game's developers scripted them with a predetermined set of behaviors.
This means the so-called "AI" behind NPCs isn't really a product of true artificial intelligence.
This time, Nvidia is changing that.
According to a blog post, at Computex 2023 in Taipei, Taiwan, Nvidia CEO Jensen Huang gave the world a glimpse of what happens when games and real AI collide.
In a graphical recreation of a cyberpunk-style world, set inside a ramen shop, Nvidia demoed how a player can actually talk to an NPC.
Traditionally, interacting with NPCs requires clicking on dialogue options. But Nvidia's version allows players to simply hold down a button, say something with their own voice, and have the NPC respond.
And the whole process happens in real time.
Nvidia calls it a "peek at the future of games."
The demo shows the player, named Kai, striking up a conversation with Jin, the ramen shop's owner.
"Hey Jin, how are you," the person asks. "Unfortunately, not so good," replies Jin.
"How come?"
" I am worried about the crime around here. It's gotten bad lately. My ramen shop got caught in the crossfire."
"Can I help?"
"If you want to do something about this, I have heard rumors that the powerful crime lord Kumon Aoki is causing all kinds of chaos in the city. He may be the root of this violence."
"I’ll talk to him, where can I find him?"
"I have heard he hangs out in the underground fight clubs on the city’s east side. Try there."
"OK, I’ll go."
"Be careful, Kai."
Long story short, the technology is "bringing game characters to life."
The demo was built by Nvidia and partner Convai to help promote the tools that were used to create it.
More specifically, Nvidia showcased Nvidia ACE (Avatar Cloud Engine) for Games, a suite of middleware that can run both locally and in the cloud.
The Omniverse ACE for Games technology allows NPCs with no pre-written dialogue to converse with the player in real time, complete with properly voiced lines.
According to the Nvidia CEO, it's "basically a large language model" that has been repurposed for video game integration.
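To make the "repurposed large language model" idea concrete, here is a minimal sketch of how a game might keep a chat-style model in character as an NPC. Everything in it is hypothetical: the function names and message format are generic chat-API conventions, not Nvidia's actual ACE interface.

```python
# Minimal sketch of the "repurposed LLM" idea: an NPC persona is injected
# as a system prompt so a generic chat model stays in character. Every name
# here is a hypothetical stand-in, not Nvidia's actual ACE interface.

NPC_PERSONA = (
    "You are Jin, the owner of a ramen shop in a cyberpunk city. "
    "Crime has gotten bad lately and your shop was caught in the crossfire. "
    "Answer the player in one or two short spoken lines, in character."
)

def generate_reply(messages: list[dict]) -> str:
    """Stand-in for a call to an LLM endpoint (local or cloud-hosted)."""
    # A real integration would send `messages` to a model server here.
    return "Unfortunately, not so good."

def npc_respond(history: list[dict], player_line: str) -> str:
    history.append({"role": "user", "content": player_line})
    # Prepend the persona on every turn so the model never drifts out of role.
    reply = generate_reply([{"role": "system", "content": NPC_PERSONA}] + history)
    history.append({"role": "assistant", "content": reply})
    return reply

history: list[dict] = []
print(npc_respond(history, "Hey Jin, how are you?"))
```

Because the running conversation history is passed back in on every turn, the model can answer follow-ups like "How come?" in context rather than from a scripted dialogue tree.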
Our CEO, Jensen Huang, delivered a live keynote at #COMPUTEX2023 unveiling generative #AI platforms for every industry. Catch up on new systems, software and services enabling more efficient workflows. https://t.co/8LpM1QWh2F
— NVIDIA (@nvidia) May 30, 2023
ACE for Games combines natural language understanding, text-to-speech, and facial animation so NPCs can listen to and interact with players in real time.
This should create a more natural interaction than gamers usually get in such a situation.
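As a rough illustration of that pipeline, the sketch below wires the stages together in the order described: speech recognition, a language model, speech synthesis, and facial animation. All of the functions are placeholders assumed for illustration; Nvidia's actual components (Riva for speech, NeMo for the LLM) expose their own APIs.

```python
# Hypothetical end-to-end loop for a voice-driven NPC. Each stage is a
# placeholder, not a real ACE call. The control flow is the point: player
# audio in, voiced and animated NPC response out, fast enough to feel like
# real-time conversation.

def transcribe(audio: bytes) -> str:
    """Speech-to-text stage: convert the player's voice to text."""
    ...

def generate_reply(persona: str, player_text: str) -> str:
    """LLM stage: produce an in-character answer with no pre-written dialogue."""
    ...

def synthesize(text: str) -> bytes:
    """Text-to-speech stage: render the NPC's reply as voiced audio."""
    ...

def animate_face(speech_audio: bytes) -> None:
    """Facial-animation stage: drive the NPC's lip-sync from the audio."""
    ...

def on_push_to_talk(mic_audio: bytes, persona: str) -> None:
    """Runs once each time the player releases the talk button."""
    player_text = transcribe(mic_audio)
    npc_text = generate_reply(persona, player_text)
    npc_audio = synthesize(npc_text)
    animate_face(npc_audio)
```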
While OpenAI's ChatGPT is a more powerful generative AI by comparison, the idea here is to show that, with this technology, players could simply speak into their headset or microphone and have an NPC answer in the proper context.
As Huang said, "AI will be a very big part of the future of videogames."
The full ACE suite includes the company's NeMo tools for deploying large language models (LLMs), Riva for speech-to-text and text-to-speech, and other components.
And the demo, built using Unreal Engine 5, is also packed with ray-tracing effects.