There are many reasons why we don't see many robots in our homes beyond the robot vacuum, the most important of which is the challenge of unstructured and semi-structured environments. From floor plans to lighting, surfaces, and even people and pets, no two homes are the same. Even if robots could effectively map each home, the space is in constant flux.
This week, researchers at MIT CSAIL announced a new way to train domestic robots in simulation: You can use your iPhone to scan parts of your home and upload them into the simulation.
Simulation has been a fundamental element of robot training for decades, allowing a robot to attempt and fail a task thousands, or even millions, of times in the same amount of time it would take to do it once in the real world.
The impact of failure in a simulation is also significantly smaller than in reality: Imagine if you had to teach a robot to put a mug in the dishwasher, breaking 100 real mugs in the process.
“Training in a simulated virtual world is very powerful because the robot can practice millions of times,” researcher Pulkit Agrawal says in a video accompanying the research. “If it breaks thousands of plates, it doesn't matter because it's all done in the virtual world.”
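To make the idea concrete, here is a minimal sketch of what that kind of simulated practice loop looks like, written against a gym-style reset/step interface. The `ScannedHomeEnv` class, its reward logic, and the random stand-in policy are hypothetical illustrations of the general approach, not the CSAIL team's actual code.

```python
# Sketch of simulated practice: the robot attempts the task over and over in a
# virtual scene, where a dropped "mug" costs nothing but compute.
# ScannedHomeEnv is a hypothetical, toy stand-in for a physics simulation
# built from a phone scan of a real kitchen.
import random

class ScannedHomeEnv:
    """Toy stand-in for a simulated, scanned kitchen."""
    def reset(self):
        self.steps = 0
        return {"mug_in_dishwasher": False}

    def step(self, action):
        self.steps += 1
        # Pretend the placement occasionally succeeds and otherwise fails.
        success = action == "place_gently" and random.random() < 0.1
        done = success or self.steps >= 50
        reward = 1.0 if success else 0.0
        return {"mug_in_dishwasher": success}, reward, done

env = ScannedHomeEnv()
episodes = 20_000          # real training would run far more attempts
successes = 0
for _ in range(episodes):
    obs = env.reset()
    done = False
    while not done:
        action = random.choice(["place_gently", "drop"])  # stand-in policy
        obs, reward, done = env.step(action)
    successes += int(obs["mug_in_dishwasher"])
print(f"success rate: {successes / episodes:.3f}")
```

In a real pipeline, the random policy above would be replaced by one that is updated from the rewards it collects, which is exactly where the "practice millions of times" advantage of simulation pays off.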
But like the robots themselves, simulations have limitations when it comes to dynamic environments like the home. Making a simulation as easy to create as an iPhone scan could greatly improve a robot's adaptability to different environments.
In fact, building a sufficiently rich database of such environments should make the system more adaptable when something inevitably ends up out of place, like furniture that has been moved or dishes left on the kitchen counter.
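One common way to get that kind of robustness, and a general technique rather than necessarily what the MIT team does, is to randomize the scanned scene between practice episodes so the learned behavior tolerates moved furniture and stray clutter. The sketch below assumes a simple object list for the scanned room; every name in it is illustrative.

```python
# Sketch of scene randomization over a scanned home layout: each practice
# episode starts from a slightly different arrangement, so a policy trained
# on these scenes learns to cope with moved furniture and stray dishes.
import random
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class SceneObject:
    name: str
    x: float   # position in metres within the scanned room
    y: float

BASE_SCENE = [
    SceneObject("kitchen_table", 2.0, 1.5),
    SceneObject("mug", 2.1, 1.6),
    SceneObject("chair", 1.4, 1.2),
]

def randomize_scene(scene, jitter=0.3, clutter_prob=0.5):
    """Return a perturbed copy of the scene for one training episode."""
    perturbed = [
        replace(obj,
                x=obj.x + random.uniform(-jitter, jitter),
                y=obj.y + random.uniform(-jitter, jitter))
        for obj in scene
    ]
    # Sometimes add the kind of clutter a real home produces.
    if random.random() < clutter_prob:
        perturbed.append(SceneObject("stray_plate",
                                     random.uniform(0.5, 3.0),
                                     random.uniform(0.5, 2.5)))
    return perturbed

for episode in range(3):
    layout = randomize_scene(BASE_SCENE)
    print([f"{o.name}@({o.x:.2f},{o.y:.2f})" for o in layout])
```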