Nvidia RTX AI PCs and generative AI for games — how the Blackwell GPUs and RTX 50-series aim to change the way we work and play

Nvidia RTX AI PCs and Generative AI
(Image credit: Nvidia)

Nvidia has a lot to say about artificial intelligence and machine learning. It's not all data center hardware and software either, though Nvidia's various supercomputers are doing most of the heavy lifting for training new AI models. At its Editors' Day earlier this month, Nvidia briefed us on a host of upcoming technologies and hardware, including the Blackwell RTX architecture, neural rendering, the GeForce RTX 50-series Founders Edition cards, Blackwell for professionals and creators, and Blackwell benchmarking. There were two sessions devoted to generative AI and Nvidia's RTX AI PC ecosystem, which we'll discuss here.

Generative AI came to the forefront with the rise of tools like Stable Diffusion and ChatGPT over the past couple of years. Nvidia has long been working on AI tools designed to change the way games and NPCs behave and the way we interact with them. We've heard about ACE (Avatar Cloud Engine) for a while now, and it continues to improve. With Blackwell and a batch of upcoming games, Nvidia has partnered with various game developers and publishers to leverage ACE and related technologies. The results range from interesting to pretty bad, so we'll let these videos speak for themselves and provide additional analysis below.

NVIDIA ACE | ZooPunk - TiGames Partner Spotlight - New Dimensions for In-Game Customizations (YouTube)
Black State | 4K RTX Showcase - Captured on GeForce RTX 5090 (Extended Cut) (YouTube)
NVIDIA ACE | inZOI - Create Simulated Cities with Co-Playable Characters (YouTube)
NVIDIA ACE | Introducing PUBG Ally - First Co-Playable Character (YouTube)
NVIDIA ACE | MIR5 - Wemade Introduces the First AI Boss (YouTube)
Game-Changing Livestreaming With GeForce RTX 50 Series (YouTube)

One of the key issues, as with so many things related to generative AI, is getting the desired results. Live demos of PUBG Ally had Krafton representatives talking to the AI player, which would respond verbally as expected. "Go find me a rifle and bring it to me." "Okay, I'll go do that..." At that point, the AI NPC would seemingly do nothing of the sort. It appeared to be caught in a loop, and the feature still looks far from ready for public consumption. But these things change fast, so perhaps it's really only a few months away from being great. Who knows? (We also question how AI NPCs will work out in a multiplayer game, but that's a different subject.)

Other use cases demonstrated include a raid boss in MIR5 that will supposedly learn from past encounters and adapt over time, requiring players to vary their tactics with each attempt. The high-level concept sounds a bit like the Omnidroid from The Incredibles, learning and becoming more powerful over time, though the raid boss won't gain new abilities, so it shouldn't become invincible. Because where's the fun in that?

ZooPunk lets the player repaint and decorate a spaceship by talking to an AI. Again, the live demo was lacking: there were lengthy pauses before each response, and a simple prompt like "Please paint my ship purple" kicked off a 20 to 30-second sequence that felt entirely unnecessary.

Fundamentally, the problem isn't just about using ACE and AI to create NPCs for games; it's about making those NPCs actually useful, interesting, and fun. These are games, after all, and if we're only adding voice interactions that aren't actually meaningful, what's the point? We're still waiting to see a demo of a game where the AI NPCs make for a better end result than traditional game development, but we're sure there are bean counters looking for ways to cut costs.

Nvidia Blackwell AI overview
(Image credit: Nvidia)

The other session, on RTX AI PCs, covered territory related to ACE, with more of a focus on the various tools Nvidia has created than on actual game demos. There are lots of new RTX NIMs (Nvidia Inference Microservices) coming, along with blueprints (code samples, basically) showing how they can be used.
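
NIMs are packaged as containers that expose industry-standard APIs (OpenAI-compatible in the case of LLM NIMs), so calling one from an application looks much like calling any hosted model. Here's a minimal sketch of what that looks like, assuming a NIM is already running locally; the endpoint, port, and model name are placeholder assumptions on our part, not anything shown at the session.

```python
# Minimal sketch: query a locally hosted LLM NIM through its OpenAI-compatible API.
# Assumptions (not from the article): the NIM container is already running and
# serving at http://localhost:8000/v1, and the model name below is a placeholder.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local NIM endpoint (assumed port)
    api_key="not-used",                   # a local NIM typically doesn't need a real key
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",   # placeholder model identifier
    messages=[
        {"role": "system", "content": "You are an in-game NPC who answers briefly."},
        {"role": "user", "content": "Go find me a rifle and bring it to me."},
    ],
    max_tokens=128,
)

print(response.choices[0].message.content)
```

The point is less the specific call and more that developers plug these microservices in through ordinary REST or SDK calls rather than wiring model weights directly into the game.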

One of the more interesting examples was a blueprint for converting a PDF into an AI-voiced podcast. The tool chains together several components: extracting the text, analyzing any images and tables in the PDF, and then generating a podcast script. The result still sounds very much like an AI reading a script, but you can fully edit the generated text, and you could even do the voiceover yourself, which would bring some humanity back into the equation.
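
As a rough illustration of the shape of that pipeline (not Nvidia's blueprint code, which we haven't seen), here's a stripped-down sketch that pulls the text out of a PDF with pypdf and asks an LLM endpoint to draft a two-host script. The image/table analysis and the text-to-speech stage are omitted, and the endpoint and model name are the same placeholder assumptions as above.

```python
# Stripped-down sketch of a PDF-to-podcast-script pipeline.
# Not Nvidia's blueprint: pypdf handles text extraction, and a local
# OpenAI-compatible endpoint (assumed, as above) drafts the script.
# Image/table analysis and text-to-speech are left out.
from openai import OpenAI
from pypdf import PdfReader

def pdf_to_text(path: str) -> str:
    """Concatenate the extractable text from every page of the PDF."""
    reader = PdfReader(path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)

def draft_podcast_script(document_text: str) -> str:
    """Ask the model to turn the document into a short two-host script."""
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")
    response = client.chat.completions.create(
        model="meta/llama-3.1-8b-instruct",  # placeholder model identifier
        messages=[
            {"role": "system",
             "content": "Turn the provided document into a conversational "
                        "podcast script for two hosts. Keep it under 800 words."},
            {"role": "user", "content": document_text[:20000]},  # crude length cap
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    script = draft_podcast_script(pdf_to_text("whitepaper.pdf"))
    print(script)  # editable text; a TTS stage would voice it from here
```

Because the intermediate script is plain text, the "edit it yourself" step the presenters highlighted drops in naturally before any text-to-speech pass.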

You can see the full slide deck below, and developers interested in these tools will want to get in touch with Nvidia to register for and use the various APIs.

Nvidia Blackwell AI overview
(Image credit: Nvidia)


Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
