NVIDIA’s Isaac SDK and Isaac Simulator, which were announced last month, are now available for robotics developers to download.
The Isaac SDK toolbox offers developers access to Isaac applications, GEMs (prebuilt robot capabilities), the Isaac Robot Engine, and Isaac Sim. NVIDIA's goal with its Isaac portfolio is to make it easier for manufacturers, researchers, and startups to build AI for perception, navigation, and manipulation into next-generation robots.
At the core is the Isaac Robot Engine, which enables developers to build modular robotics applications that can be deployed to Jetson platforms, including NVIDIA's robot reference designs.
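To give a flavor of that modular model, below is a minimal sketch of an Isaac "codelet," the basic building block of Robot Engine applications, patterned on the ping tutorial in NVIDIA's Isaac SDK documentation. The header path and macro names match the 2019 SDK release and should be treated as assumptions that may differ in later versions.

```cpp
// Minimal Isaac codelet sketch, patterned on the SDK's "ping" tutorial.
// Header path and macro names follow the 2019 Isaac SDK docs and may
// differ in other releases.
#include "engine/alice/alice_codelet.hpp"

namespace isaac {

class Ping : public alice::Codelet {
 public:
  void start() override {
    // Schedule tick() at the period configured in the application graph.
    tickPeriodically();
  }

  void tick() override {
    LOG_INFO("ping");
  }
};

}  // namespace isaac

ISAAC_ALICE_REGISTER_CODELET(isaac::Ping);
```

Codelets like this are wired together in a JSON application graph and scheduled by the Robot Engine, which is what lets the same component run in simulation and on a Jetson-powered robot.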
Deepu Talla, NVIDIA's VP and GM of Autonomous Machines, is delivering the opening keynote of the Robotics Summit & Expo (June 5-6 in Boston), "Bringing AI-Powered Robots to Life." During the presentation, Talla will dive into the elements required to bring a product to life: embedded hardware, flexible software, and a strong ecosystem. He will also review some of the latest advancements in robotics research and provide examples of commercially successful robotics systems.
Isaac Sim is a virtual robotics lab and 3D simulator. Developers can use virtual robots with simulated sensors (RGB, stereo, depth, segmentation, LIDAR, IMU) in Isaac Sim to test applications in a high-fidelity simulation environment. The entire Isaac ecosystem plays nicely with other NVIDIA product lines, of course. So once tested in Isaac Sim, applications can be deployed to NVIDIA Jetson AGX Xavier, Jetson TX2, or Jetson Nano running on physical robots.
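As a rough illustration of how an application consumes one of those simulated sensor streams, here is a hedged sketch of a codelet that reacts to incoming camera frames. The ISAAC_PROTO_RX macro and ColorCameraProto message type follow conventions seen in Isaac SDK samples, but the exact names and channel wiring are assumptions that vary by release.

```cpp
// Sketch: a codelet that consumes camera frames published by Isaac Sim.
// ISAAC_PROTO_RX and ColorCameraProto follow Isaac SDK sample conventions
// (assumed here); check your SDK release for the exact types.
#include "engine/alice/alice_codelet.hpp"
#include "messages/camera.capnp.h"

namespace isaac {

class CameraLogger : public alice::Codelet {
 public:
  void start() override {
    // Run tick() whenever a new message arrives on the receive channel.
    tickOnMessage(rx_color());
  }

  void tick() override {
    // Acknowledge each frame; real code would run perception here.
    LOG_INFO("Received a simulated camera frame");
  }

 private:
  // Receiving channel for color camera messages (channel name is assumed).
  ISAAC_PROTO_RX(ColorCameraProto, color);
};

}  // namespace isaac

ISAAC_ALICE_REGISTER_CODELET(isaac::CameraLogger);
```

Because the same codelet interface is used on hardware, an application exercised against Isaac Sim's simulated RGB, depth, or LIDAR streams can then be pointed at real sensors on a Jetson-based robot.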
Isaac also features a couple of robot platforms. Carter is an autonomous mobile robot designed for more experienced robotics developers. NVIDIA says the Isaac SDK can help developers with localization, autonomous navigation, map editing, path planning, and state machines. Kaya is a mobile platform that appears to target entry-level developers; it can be built using off-the-shelf components and 3D-printed parts.
NVIDIA has made a slew of robotics-related announcements in recent months. In March 2019, the NVIDIA Jetson platform added support for AWS RoboMaker, a cloud robotics service from Amazon Web Services. RoboMaker, which runs on top of the Robot Operating System (ROS), can be used to build robots, add intelligent functions, simulate and test robots in a variety of environments, and manage and update robot fleets.
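Because RoboMaker builds on ROS, the robot-side pieces of a RoboMaker application are ordinary ROS nodes. The sketch below is a standard roscpp publisher; the topic name and message content are illustrative, not part of any RoboMaker API.

```cpp
// Minimal roscpp publisher: the kind of node a RoboMaker application bundles.
// The topic name and message content here are illustrative only.
#include <ros/ros.h>
#include <std_msgs/String.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "status_publisher");
  ros::NodeHandle nh;

  // Advertise a status topic with a queue size of 10.
  ros::Publisher pub = nh.advertise<std_msgs::String>("robot_status", 10);
  ros::Rate rate(1.0);  // publish once per second

  while (ros::ok()) {
    std_msgs::String msg;
    msg.data = "ok";
    pub.publish(msg);
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}
```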
Roger Barga, General Manager of AWS Robotics and Autonomous Services at Amazon Web Services, is also keynoting the Robotics Summit. His presentation, "The Role of the Cloud in the Future of Robotics," will examine how cloud services let developers partition functionality between a physical robot and the cloud, offloading compute-intensive workloads such as machine learning and artificial intelligence. He will also describe how AI can enhance collaborative, consumer, and personal assistant robots, with examples ranging from face and object recognition to voice command and response.
In conjunction with the Robotics Summit & Expo, AWS will hold a free pre-event RoboMaker Immersion Day on June 4. During the workshop, attendees will learn how to use AWS RoboMaker to develop, test, and deploy intelligent robotics applications.
In January, NVIDIA opened its first full-blown robotics research lab. Located in Seattle just a short walk from the University of Washington, NVIDIA’s robotics lab is tasked with driving breakthrough research to enable next-generation collaborative robots that operate robustly and safely among people. NVIDIA’s robotics lab is led by Dieter Fox, senior director of robotics research at NVIDIA and professor in the UW Paul G. Allen School of Computer Science and Engineering.