NVIDIA's CES 2023 keynote in under 10 minutes

At CES 2023, NVIDIA announced new hardware, including updated GeForce GPUs and a new lineup of RTX 40 Series laptops, along with new and improved cloud-based productivity software and much more.



Video transcript


JEFF FISHER: Hi, everyone. Happy New Year. And welcome to CES 2023. AI will define the future of computing, and this has influenced much of what we are covering today. Last fall, we introduced a quantum leap in PC gaming, our Ada Lovelace architecture. Behind every RTX GPU is a supercomputer that trains AI networks to generate high-resolution frames from a lower-resolution base. The AI inference is then done in real time on RTX Tensor Cores.

This is DLSS. With Ada, we introduced the next breakthrough in AI-powered graphics, DLSS 3. DLSS 3 uses AI to generate entirely new frames outside of the graphics pipeline. I'm excited to announce that "Witchfire" will feature the AI performance boost of DLSS 3 when it launches in early 2023. "The Day Before" will launch March 1 featuring ray tracing and DLSS 3. "Warhaven" will be accelerated by NVIDIA DLSS 3 when it launches later this year.

"Throne and Liberty" is a brand new IP from NC Soft. Coming in 2023, RTX gamers will jump into boss raids with a performance multiplier of DLSS 3. "Atomic Heart" is one of the most anticipated games of 2023 where you take part in explosive encounters in a mad and sublime utopian world. Releasing February 21, RTX gamers will explore this twisted sci-fi world with DLSS 3.

We built GeForce Now to make high-performance gaming accessible to billions of gamers. Subscribing to GFN puts a GeForce RTX GPU in any client, including Chromebooks, phones, low-end laptops, even MacBooks. I'm excited to announce that the Ada Lovelace architecture is coming to GeForce Now. The new RTX 4080 SuperPODs will deliver an amazing 64 teraflops of graphics goodness to each gamer. For competitive gamers, we're also bringing NVIDIA Reflex to GFN.

The RTX 4080 SuperPODs can render and stream at 240 frames per second. The RTX 4080 will be available in our new Ultimate membership. All RTX 3080 members will receive the Ultimate upgrade; the price for existing 3080 members will remain the same.

Today, I'm excited to announce the next GPU in the family, the GeForce RTX 4070 Ti. The RTX 4070 Ti is packed with 40 teraflops of Ada shader cores, 93 teraflops of third-generation RT cores, and 641 teraflops of fourth-generation Tensor Cores. The RTX 4070 Ti will be available on January 5 with a starting price of $799.

Today, we are announcing GeForce RTX 40 Series laptops. They are three times more power efficient and bring the Ada architecture, DLSS 3, and new Max-Q technologies to the next generation of laptops. Today, we are introducing the new RTX 4070, 4060, and 4050 laptops. 40 Series laptops start at just $999 and will be available on February 22.

The Ada architecture has also enabled a new class of enthusiast laptops. I'm excited to introduce today the new RTX 4090 and 4080 flagship laptops. They start at $1,999 and will be available on February 8.

STEPHANIE JOHNSON: Gamers demand the absolute best performance, but they are not alone in those needs. With over 110 million professional and hobbyist PC creators, the market continues to grow, fueled by powerful technology. NVIDIA Studio is our platform for this new breed of content creators, supercharging workflows with RTX GPUs.

The heartbeat of the Studio platform is found in NVIDIA Omniverse, where creators can connect these accelerated apps and collaborate in real time. NVIDIA Omniverse is a collaboration platform enabling artists to connect their favorite tools from Adobe, Autodesk, Unreal Engine, SideFX, and more. Creators see the aggregated scene instantly come together without lengthy import-export cycles. Changes happen in real time across the connected apps.

NVIDIA Canvas allows creators to paint with materials, using simple brushstrokes and AI to quickly conceptualize a beautiful image. With the new Canvas 360 feature, artists can create panoramic scenes and export them into any 3D app or platform as an environment map. These can be created and iterated upon quickly to change the background and lighting in a 3D scene.

The Canvas 360 beta will be available to download for RTX users later this quarter. NVIDIA Broadcast enhances your mic and webcam. With the latest update, we are adding Eye Contact to the feature list. Eye Contact changes the position of the speaker's eyes so they appear to be looking at the camera, allowing for better audience engagement.

- Much of this presentation will focus on Isaac Sim and how innovators in the robotics industry are successfully using Isaac Sim to accelerate their development. All too often, 3D assets are built for visualization, not physically accurate simulation. SimReady assets address this problem. Replicator is the tool for synthetic data generation. Pre-trained models and the TAO Toolkit provide a great starting point to rapidly train, adapt, and optimize AI models.

NVIDIA cuOpt is used for layout planning and real-time route optimization. Finally, for the second category, runtime on physical robots, we are working with the ROS community to accelerate Isaac ROS. Recently, we enabled Isaac Sim in the cloud: any cloud, your choice. A developer is no longer limited by the capability of their PC or workstation. They can access all the features of Isaac Sim with a single click to the cloud from a browser.

The ability to simulate tens, hundreds, if not thousands of robots in parallel is one of the unequivocal benefits of simulation. Engineers can spawn multiple instances of a virtual robot in the cloud. Soon, Isaac Sim will also address simulation of multiple robots in a single instance. Today, we are announcing the next release of Isaac Sim.

The new features include improved camera and LiDAR support to more accurately model real-world performance, a new conveyor-building tool, a new utility to add people to the simulation environment, a collection of new simulated warehouse assets, and a host of popular robots that come pre-integrated.

For robotics researchers, we're introducing a new tool called Orbit, which provides operating environments for manipulator robots. We've also improved Isaac Gym for reinforcement learning and updated Cortex for collaborative robot programming.

ALI KANI: Developing self-driving cars is one of the most complex AI challenges of our time. It requires two computers: an AI factory in the data center that is used for software development and testing, and an AI computer in the car. OEMs need to process massive amounts of data collected from their fleets, and then curate, label, and train their AI self-driving software models. Using NVIDIA DRIVE Sim, they can then test and validate this self-driving software in digital twins of their cars across millions of scenarios every day.

Now, in the vehicle computer, NVIDIA DRIVE provides a suite of full-stack self-driving and cockpit application software. This includes the operating system, middleware, parking, self-driving, and various in-vehicle cockpit applications. The NVIDIA DRIVE platform is designed to simplify and centralize the architecture for software-defined vehicles, enabling a leap in performance and capability while reducing energy consumption and cost.

Today, I'm excited to announce our partnership with Foxconn, the world's largest technology manufacturer and service provider. We're also announcing that GFN is coming to the screens in your car. Powered by gaming supercomputers in the cloud, GeForce Now connects to the world's biggest digital game stores, so users can stream across any device, including internet-enabled cars. No special equipment required.

Today, I'm excited to announce that several DRIVE partners are integrating GeForce Now. DRIVE Sim, built on Omniverse, enables us to create a digital twin of our roadways to ensure that the AI car can safely navigate through complex scenarios. Today, I'm announcing that Mercedes-Benz is using this same digital twin technology to plan and build more efficient production facilities.

Using NVIDIA DRIVE Sim, automakers can design their vehicles and retail experiences entirely in the virtual world, streamlining a traditionally lengthy process. DRIVE Sim can also be used to validate that the vehicle design meets local safety standards. Potential car buyers can also benefit from the simulation platform, configuring and experiencing the car from the comfort of their homes.