

Character animation is hard at the best of times. Hollywood and games-industry animators long ago turned away from traditional “keyframe” animation for creating large quantities of realistic character movement, opting instead to motion-capture it directly from human actors. The challenge is even greater for roboticists, whose output must not only look natural but stay standing while doing so. Robot movement is a constant struggle between aesthetics and practicality, speed and balance, and painstakingly hand-coding every aspect of robotic locomotion has yet to produce an all-purpose bipedal robot. So, why not turn to motion capture once again?

Now, MIT researchers are making a bid to do just that, with a robot called HERMES. What sets HERMES apart from other bipedal robots is that its movement is controlled not by an AI program but directly, by a human pilot. It calls to mind recent movies like Pacific Rim, as well as just about every manga ever: at MIT, human beings are “driving” robots with their own arms and legs. What sets this idea apart from a simple piloting scheme is that the pilot’s control input is not only translated into movement by the robot, but the robot’s movements and internal forces are fed back to the pilot through a suit of small actuators.

The feedback mechanism lets the pilot feel large-scale forces acting on the robot, so they can use their natural human reflexes to offset those forces and maintain balance. The primary demonstration is punching through a wall: without human feedback control, the robot can punch through the wall but then falls forward from the violent shift in its weight, while a human pilot can push against that momentum with leg and back muscles, keeping both themselves and the robot upright. A pair of VR goggles and a camera mounted on the robot’s head let the pilot see their movements from HERMES’ perspective and stay oriented correctly. The robot always had the physical ability to stay balanced, but lacked an adequate set of instructions for actually using that ability in the real world.
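The bilateral loop described above can be sketched in a few lines. This is an illustrative toy, not MIT's actual control interface: the class, the gains, and the scaling factors are all invented for the sketch. The idea is just that a disturbance measured on the robot is scaled down to a human-safe torque on the pilot's suit, and the pilot's corrective lean is streamed back as a balance command.

```python
# Hypothetical sketch of the balance-feedback loop; all names and
# constants are assumptions for illustration, not HERMES' real API.
from dataclasses import dataclass

@dataclass
class RobotState:
    contact_force: float  # N, e.g. reaction force from a punch
    com_offset: float     # m, center-of-mass drift from the support point

def feedback_torque(state: RobotState, gain: float = 0.8) -> float:
    """Torque the pilot's suit applies, proportional to the disturbance
    the robot feels, scaled down to human-safe levels."""
    return gain * (state.contact_force * 0.01 + state.com_offset * 50.0)

def pilot_compensation(suit_torque: float, reflex_gain: float = 1.2) -> float:
    """The pilot leans against the felt torque; that corrective lean
    is sent back to the robot as a balance command (opposite sign)."""
    return -reflex_gain * suit_torque

# One cycle: a wall punch shifts the robot's weight forward, the pilot
# feels a forward torque and pushes back, the robot mirrors the push.
state = RobotState(contact_force=300.0, com_offset=0.05)
torque = feedback_torque(state)
command = pilot_compensation(torque)
```

The key property, true of the sketch and of the real system alike, is that the correction opposes the disturbance: the harder the robot is shoved forward, the harder the pilot (and hence the robot) leans back.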

MIT researchers develop robot that can learn human reflexes

PhD student Joao Ramos demonstrates the Balance Feedback Interface, a system that enables an operator to control the balance and movements of a robot, through an exoskeleton and motorized platform. Photo: Melanie Gonick/MIT

Unlike animated characters, however, robots need to learn the rules for movement, rather than the specific movements themselves. HERMES isn’t just mimicking the movements, or recording them to be replayed later, but taking note of which compensatory “muscle” movements were needed to offset which situations. It’s the sort of data-set that will be needed to build much more robust robotic movement suites, letting captured information inform and adjust movement algorithms: for instance, learning how much grip pressure to apply when an object offers a given resistance. Once they’ve collected enough information, the researchers say they want to start integrating human control with true autonomy, presumably with the goal of someday phasing out the human altogether.
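The data-set idea above can be made concrete with a toy example. The structure below is an assumption about what such a log might look like, not the researchers' actual format: each sample pairs a disturbance the robot felt with the compensation the pilot applied, and a crude "learned reflex" is then fit from the log and applied to an unseen disturbance.

```python
# Illustrative (disturbance, compensation) log; field names and the
# linear-reflex fit are invented for the sketch.
from typing import List, NamedTuple

class Sample(NamedTuple):
    impact_force: float     # N, measured at the robot's hand
    lean_correction: float  # rad, hip-pitch correction the pilot made

log: List[Sample] = []

def record(impact_force: float, lean_correction: float) -> None:
    log.append(Sample(impact_force, lean_correction))

def fitted_reflex(impact_force: float) -> float:
    """Crude learned reflex: average correction-per-newton seen so far,
    applied to a new impact."""
    if not log:
        return 0.0
    gain = sum(s.lean_correction / s.impact_force for s in log) / len(log)
    return gain * impact_force

# Replay three wall punches the pilot balanced through...
for force, lean in [(250.0, -0.050), (300.0, -0.060), (350.0, -0.070)]:
    record(force, lean)

# ...then predict the correction for an impact the robot has never seen.
predicted = fitted_reflex(400.0)
```

A real movement suite would fit something far richer than a single gain, but the shape of the problem is the same: captured human reactions become training data for an autonomous reflex.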

This is an innovative approach to robot movement, one that hasn’t been tried by the likes of Honda, with its famous Asimo robot, nor by Boston Dynamics with ATLAS. It lets the robot take the most useful elements of human instinct while applying those insights with inhuman strength and dexterity; the robot might only be able to safely punch through a wall thanks to the human pilot, but that pilot probably couldn’t punch through the wall with only their own physical strength.

What this essentially allows is for a human to do in real time, with natural physical instincts, what a programmer would otherwise have to do artificially, over many iterations of an experiment. A human being can dynamically adjust grip strength in response to feedback, applying the perfect level of tension without ever having picked up that object before. With an accurate enough accounting of human movement in response to robot sensation, robots could acquire that sort of versatility, too.
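The grip-adjustment behavior above is essentially a closed loop: squeeze a little harder until the slip signal clears, the way a hand tightens on an unfamiliar object. A minimal sketch, with a deliberately simplified friction model and made-up thresholds:

```python
# Toy grip controller; the friction model and step sizes are invented
# for illustration, not taken from the HERMES work.

def required_grip(weight: float, friction: float = 0.5) -> float:
    """Minimum normal force at which friction holds the object
    (simplified Coulomb model: friction * grip >= weight)."""
    return weight / friction

def adjust_grip(weight: float, friction: float = 0.5,
                step: float = 1.0, max_force: float = 100.0) -> float:
    """Ramp grip force up in small steps until the object stops
    slipping, capped at a safe maximum."""
    grip = 0.0
    while grip < required_grip(weight, friction) and grip < max_force:
        grip += step  # in the real loop, each step would await slip feedback
    return grip

# Grip a ~1 kg object (weight ~9.8 N) without knowing it in advance.
grip = adjust_grip(weight=9.8)
```

The point of the article is that a human pilot runs this loop instinctively from felt feedback, and logging those instinctive adjustments gives the robot data to run it autonomously later.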

Read more http://www.extremetech.com/extreme/212010-when-people-can-feel-robot-movement-they-can-teach-human-reflexes

