Facebook recently launched Habitat 2.0, a simulation platform for embodied AI research. The platform has been open-sourced under the MIT license. 

Compared to the previous version, the new platform is faster and adds interactivity. AI agents can efficiently accumulate the equivalent of many years of real-world experience (a billion or more frames), performing tasks like opening and closing doors and drawers and picking up items.


Comparison

Simulation platforms are typically used to train robots to accomplish tasks in the real world. They allow AI researchers to teach machines to navigate photo-realistic 3D virtual environments and to interact with objects just as they would in the real world. Besides Habitat, other robotic simulation platforms include NVIDIA Isaac and Unity.

The second-generation platform, Habitat 2.0, expands the capabilities of the Habitat-Sim simulation engine by supporting piecewise-rigid objects such as drawers and cabinets; articulated robots, including mobile manipulators like Fetch, fixed-base arms like Franka, and quadrupeds like AlienGo; and rigid-body physics (via the Bullet physics engine).
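To make those capabilities concrete, here is a minimal sketch of starting a physics-enabled simulator and loading an articulated object through habitat-sim's Python API. The scene and URDF paths are placeholders, and exact method availability depends on the habitat-sim version.

```python
import habitat_sim

# Physics-enabled simulator: rigid-body dynamics come from the Bullet
# backend habitat-sim is compiled against.
sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "path/to/scene.glb"  # placeholder scene asset
sim_cfg.enable_physics = True

sim = habitat_sim.Simulator(
    habitat_sim.Configuration(sim_cfg, [habitat_sim.agent.AgentConfiguration()])
)

# Articulated objects (e.g., a Fetch robot or a cabinet with drawers)
# are loaded from URDF; the path here is a placeholder.
ao_mgr = sim.get_articulated_object_manager()
robot = ao_mgr.add_articulated_object_from_urdf("path/to/fetch.urdf")

# Advance Bullet physics by one 60 Hz step.
sim.step_physics(1.0 / 60.0)
```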

Habitat 2.0 prioritises speed and performance over breadth of simulation capability, letting researchers test new approaches and iterate more effectively. For example, the researchers move the robot along a navigation mesh instead of simulating wheel-ground contact, as sketched below.
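The following hedged sketch shows that navigation-mesh approach using habitat-sim's pathfinder; the scene path and target point are placeholders, and the robot base would be translated along the returned waypoints rather than driven by wheel physics.

```python
import habitat_sim

# Minimal simulator setup (scene path is a placeholder).
sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "path/to/scene.glb"
sim = habitat_sim.Simulator(
    habitat_sim.Configuration(sim_cfg, [habitat_sim.agent.AgentConfiguration()])
)

# Query the navigation mesh instead of simulating wheel-ground contact.
start = sim.pathfinder.get_random_navigable_point()
goal = sim.pathfinder.snap_point([1.0, 0.0, 1.0])  # illustrative target

path = habitat_sim.ShortestPath()
path.requested_start = start
path.requested_end = goal
if sim.pathfinder.find_path(path):
    waypoints = path.points  # slide the robot base along these 3D points
```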

However, the platform does not support non-rigid dynamics such as liquids, films, cloth, and ropes, nor does it model tactile sensing or audio. This trade-off is much of what makes the Habitat 2.0 simulator two orders of magnitude faster than most 3D simulators available to academics and industry professionals. Even with these restrictions, researchers can use the platform for complex tasks such as clearing up the kitchen or setting the table.

For instance, the platform can simulate a Fetch robot interacting with ReplicaCAD (a 3D dataset) scenes at 1,200 SPS (steps per second), while existing platforms typically run at 10 to 400 SPS. It also scales well, achieving 8,200 SPS (273x real-time) multi-process on a single GPU and nearly 26,000 SPS (850x real-time) on a single node with 8 GPUs.
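As a quick sanity check on those multiples, the sketch below assumes "real-time" means 30 simulation steps per second, a common control rate that is not stated explicitly in the article:

```python
# Back-of-the-envelope check of the reported real-time multiples.
# Assumption: "real-time" means 30 simulation steps per second.
REALTIME_SPS = 30

reported = {
    "single process": 1_200,
    "multi-process, 1 GPU": 8_200,
    "single node, 8 GPUs": 26_000,
}
for setup, sps in reported.items():
    print(f"{setup}: {sps:>6} SPS ~= {sps / REALTIME_SPS:.0f}x real-time")

# multi-process, 1 GPU -> 273x, matching the quoted figure;
# 8 GPUs -> ~867x, close to the quoted ~850x (rounding).
```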

Facebook researchers said such speeds cut experimentation time significantly, from six months to as little as two days.

New benchmarks 

Currently, Habitat 2.0 includes a new fully interactive 3D dataset (ReplicaCAD) of indoor spaces and new benchmarks for training virtual robots in complex physics-enabled scenarios. 

With this, AI researchers can move beyond developing virtual agents in static 3D environments and closer to creating robots that can quickly and reliably perform tasks like loading dishwashers, stocking fridges, fetching objects on command, and more.

ReplicaCAD, the new dataset shipped with Habitat 2.0, supports the movement and manipulation of objects. It is an interactive recreation of Replica, a dataset of 3D scans Facebook previously released. In ReplicaCAD, the static 3D scans have been converted into individual 3D models with physical parameters, semantic annotations, and collision proxy shapes, enabling training for movement and manipulation for the first time.
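A hedged sketch of pointing habitat-sim at ReplicaCAD follows; the dataset-config path and scene name mirror the public ReplicaCAD release but should be treated as assumptions.

```python
import habitat_sim

sim_cfg = habitat_sim.SimulatorConfiguration()
# Assumed path to the ReplicaCAD scene-dataset config from the release.
sim_cfg.scene_dataset_config_file = (
    "data/replica_cad/replicaCAD.scene_dataset_config.json"
)
sim_cfg.scene_id = "apt_0"     # assumed name of one apartment layout
sim_cfg.enable_physics = True  # objects carry collision proxies + physical params

sim = habitat_sim.Simulator(
    habitat_sim.Configuration(sim_cfg, [habitat_sim.agent.AgentConfiguration()])
)
```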

ReplicaCAD features 111 unique layouts of a living space and 92 objects. The objects and layouts were created with the consent of, and compensation to, artists, and are shared under a Creative Commons license for non-commercial use with attribution (CC BY-NC).

The interactive recreations also capture physical properties such as size and friction, whether an object (such as a door or refrigerator) can open and close, and how those mechanisms work. Besides this, the researchers created several variations of each scene and developed a pipeline for introducing realistic clutter such as books, kitchen utensils, and furniture.

In terms of benchmarks, the ReplicaCAD dataset and the Habitat 2.0 simulator made it possible to create a new library of household assistive tasks called the Home Assistant Benchmark (HAB). HAB includes everyday household tasks like cleaning the fridge, setting the table, and tidying the house, built from individual robot skills such as pick, place, open cabinet drawer, and open fridge door.
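Running one such task episode through habitat-lab's gym-style Env looks roughly like the sketch below; the config path is illustrative (the actual HAB task configs ship with the habitat-lab repository), and a random policy stands in for a trained agent.

```python
import habitat

# Assumed config path; substitute a real HAB task config from habitat-lab.
config = habitat.get_config("configs/tasks/rearrange/pick.yaml")
env = habitat.Env(config=config)

observations = env.reset()
while not env.episode_over:
    # Random action as a placeholder for a learned policy.
    observations = env.step(env.action_space.sample())
env.close()
```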

Moreover, HAB requires robots to assume no prior knowledge of the environment and to operate exclusively from onboard sensing: egomotion, joint-position sensors, and RGBD cameras.
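For illustration, attaching that onboard RGBD sensing in habitat-sim looks roughly like this; class and field names follow habitat-sim's sensor API, and the resolutions are arbitrary.

```python
import habitat_sim

# RGB camera.
rgb_spec = habitat_sim.CameraSensorSpec()
rgb_spec.uuid = "rgb"
rgb_spec.sensor_type = habitat_sim.SensorType.COLOR
rgb_spec.resolution = [256, 256]

# Depth camera; together with RGB this gives the RGBD observations.
depth_spec = habitat_sim.CameraSensorSpec()
depth_spec.uuid = "depth"
depth_spec.sensor_type = habitat_sim.SensorType.DEPTH
depth_spec.resolution = [256, 256]

agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_spec, depth_spec]
```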

Road ahead

Next, Facebook plans to model living spaces in more places across the world, enabling more varied training that accounts for culture- and region-specific furniture layouts, furniture types, and objects.

The researchers are working to speed up Habitat 2.0 further by addressing potential bottlenecks, such as synchronising parallel environments and reloading assets when an episode resets. The team believes that reorganising the interplay between rendering, physics, and reinforcement learning is an exciting direction for future work. “We hope that the ability to perform more complex tasks in simulation will bring us closer to the AI that can help make our everyday lives easier and better,” said Facebook AI researcher Dhruv Batra.
