Nvidia reportedly plans to release its Jetson Thor computing platform in 2025, aiming to provide advanced AI capabilities for humanoid robots. First presented by CEO Jensen Huang at the company's GTC conference in March 2024, Jetson Thor is designed to enhance robot autonomy and interaction through compact, adaptable hardware. The move positions Nvidia competitively as the industry accelerates development in robotics and AI.
Technical Specifications of Jetson Thor
Jetson Thor represents a significant upgrade within Nvidia's lineup. The robotics module will reportedly integrate a CPU cluster of 20 to 30 ARM64 cores, up from the 12-core configuration of the current Jetson AGX Orin.
The standout capability lies in its computing power: up to 800 teraflops (TFLOPS) of eight-bit floating-point (FP8) performance, enabled by Nvidia's new Blackwell GPU architecture and its transformer engine. Such power equips Jetson Thor to handle the complex AI operations essential for real-time robotic tasks.
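To put the 800-TFLOPS figure in perspective, a common rule of thumb estimates transformer inference at roughly two FLOPs per model parameter per generated token. A minimal sketch of the implied throughput ceiling (the 70B model size and the utilization factor are illustrative assumptions, not Nvidia figures):

```python
def decode_tokens_per_second(tflops: float, params_billions: float,
                             utilization: float = 0.3) -> float:
    """Upper-bound decode throughput from the ~2 FLOPs per parameter
    per generated token heuristic for transformer inference."""
    usable_flops = tflops * 1e12 * utilization   # sustained FLOP/s
    flops_per_token = 2 * params_billions * 1e9  # approx. forward-pass cost
    return usable_flops / flops_per_token

# e.g. 800 FP8 TFLOPS against a hypothetical 70B-parameter model
print(round(decode_tokens_per_second(800, 70)))  # ~1714 tokens/s ceiling
```

In practice, single-stream decoding tends to be memory-bandwidth-bound rather than compute-bound, so real throughput would be lower; the sketch only bounds what the quoted FP8 figure could support.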
A central component of Jetson Thor is its 128GB shared memory, mirroring the unified memory architecture found in modern high-performance computing. This extensive memory pool supports applications requiring substantial data throughput, including large language models (LLMs) that demand high processing and memory efficiency.
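A back-of-envelope calculation shows why a 128GB pool matters for LLMs: weight storage is roughly parameter count times bytes per parameter. A sketch, using an arbitrary 70B-parameter example and ignoring activations and KV cache:

```python
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight-only memory footprint of a model in GB
    (1e9 parameters times bytes-per-parameter, divided by 1e9 bytes/GB)."""
    return params_billions * bytes_per_param

# A 70B-parameter model at common precisions:
for label, nbytes in [("FP16", 2.0), ("FP8", 1.0), ("INT4", 0.5)]:
    print(f"{label}: {weights_gb(70, nbytes):.0f} GB")
```

On these assumptions, FP16 weights alone (140 GB) would overflow a 128GB pool, while FP8 (70 GB) fits with headroom for activations, which lines up with the module's FP8-focused compute.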
The module’s connectivity is equally robust, with up to 100 gigabit Ethernet (100GbE) bandwidth. This high data transfer rate is critical for maintaining seamless communication within multi-functional robotic systems.
A built-in functional safety coprocessor sets Jetson Thor apart. It is designed to keep the module operating reliably in safety-critical environments, reinforcing its suitability for deployments where continuous performance is non-negotiable.
Applications and Industry Integration
Jetson Thor is positioned as an optimized module for high-level robotics, particularly in the domain of embodied AI—systems that interact dynamically with their environment. A focal point for Nvidia is Project GR00T, an AI initiative aimed at developing humanoid robots capable of natural language comprehension, observational learning, and autonomous execution of intricate tasks.
The module also benefits from integration within Nvidia’s Isaac robotics platform, which supports GPU-accelerated reinforcement learning through Isaac Lab and comes with pre-trained models aimed at optimizing robotic functions. These features are designed to accelerate development cycles and enhance robotic performance, from learning movements to task precision.
Industry Collaborations and Market Outlook
Nvidia’s Jetson Thor has garnered attention from established players in the robotics field, including Tesla, Siemens, and Boston Dynamics, signaling widespread interest in this new module. Partnerships with companies like Unitree Robotics and XPENG Robotics underscore the module’s potential to impact various sectors, including healthcare, manufacturing, and customer-facing services.
Forecasts indicate rapid growth in the humanoid robotics market, projected to expand from $3.28 billion in 2024 to an estimated $66 billion by 2032. This trajectory points to a rising demand for automation and efficient operational technology, areas where Jetson Thor’s specifications align well.
Tesla’s Optimus Robot
While Nvidia’s platform is designed to be adaptable across a variety of manufacturers, Tesla has demonstrated its own integrated approach with the Optimus robot. Introduced in 2021, Optimus has since demonstrated autonomous exploration of unfamiliar spaces, avoiding people and obstacles using neural networks.
The robot’s adaptability was highlighted when it adjusted its task after workers changed the setup, demonstrating the benefits of end-to-end neural training. Tesla’s holistic strategy contrasts with Nvidia’s model of providing key technology for other OEMs.
This reflects a shift in how tech giants approach robotics. Nvidia’s approach aligns with the role of chip providers in mobile tech, offering hardware that OEMs can use to power AI applications across different industries. Its partnerships with firms like Siemens and Universal Robots underscore its intent to be the backbone of diverse robotics solutions.
Amazon and Apple
Amazon continues to invest in robotics through acquisitions like Covariant, a startup specializing in AI-driven automation. Covariant’s “Brain” software, which facilitates complex tasks in warehouses, will bolster Amazon’s fulfillment capabilities. This acquisition follows the company’s recruitment of founders from Adept AI earlier in 2024, reinforcing its focus on integrating advanced AI into logistics. In 2022, the company also announced a $1 billion fund to invest in robotics and other technology for its facilities.
Meanwhile, Apple’s robotics plans are centered around a tabletop device set for release in 2026. Reports from August detailed that this smart assistant will have an iPad-like screen with 360-degree mobility, allowing it to function as a home hub and video conferencing tool. Production will involve Foxconn, which has expertise in robotic component manufacturing.
Meta and OpenAI
Meta has also emerged as a player in the robotics arena, focusing on tactile AI technology. In October, it revealed the Digit 360 sensor, developed with GelSight Inc. and capable of mimicking human touch. The device, expected to be part of Meta’s Allegro Hand in 2025, detects pressure and vibration to improve how robots handle objects. These advancements aim to replicate human-like touch sensitivity, opening new possibilities for applications such as robotic surgery.
OpenAI, known for its software-based AI, has signaled a shift toward physical technology by hiring Caitlin Kalinowski, who previously led AR hardware projects at Meta. The company’s ongoing work with Jony Ive’s design firm, LoveFrom, indicates plans to introduce a device that moves beyond traditional tech interfaces, hinting at an AI-first product that could redefine user interaction.