Controlling robots with your thoughts

Norwegian researchers are building new robots that take instructions from your thoughts or follow your movements. The results could change the way industrial robots are built and used.
Researcher Angel Perez Garcia (pictured above) can make a robot move as he wishes via electrodes attached to his head. “I use the movements of my eyes, eyebrows and other parts of my face”, he says. “With my eyebrows I can select which of the robot’s joints I want to move”, smiles Angel, who is a student at the Norwegian University of Science and Technology (NTNU).
Facial grimaces generate strong electrical activity (EEG signals) across the head, and the same happens when Angel concentrates on a symbol, such as a flashing light, on a monitor. In both cases the electrodes read the activity in the brain. The signals are then interpreted by a processor, which in turn sends a command to the robot to make it move in a pre-defined way.
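To make the idea concrete, here is a minimal Python sketch of such a pipeline, assuming a classifier that turns EEG windows into discrete events and a robot that executes pre-defined motions. Every name below is invented for illustration; the classifier and robot are simple stand-ins, not NTNU’s actual software.

    # All names here are invented: a stub classifier turns EEG windows into
    # discrete events, and each event triggers a pre-defined robot motion.

    PREDEFINED_MOTIONS = {
        "eyebrow_raise": "select_next_joint",  # choose which joint to move
        "look_left": "joint_minus",            # step the selected joint down
        "look_right": "joint_plus",            # step the selected joint up
    }

    class StubRobot:
        """Stand-in for the real robot: tracks joint angles in degrees."""
        def __init__(self, n_joints=6):
            self.joints = [0.0] * n_joints
            self.active = 0  # index of the currently selected joint

        def execute(self, command, step=5.0):
            if command == "select_next_joint":
                self.active = (self.active + 1) % len(self.joints)
            elif command == "joint_plus":
                self.joints[self.active] += step
            elif command == "joint_minus":
                self.joints[self.active] -= step

    def classify(eeg_window):
        """Stand-in classifier: a strong facial grimace shows up as a
        large-amplitude burst in the EEG window (values in microvolts)."""
        peak = max(abs(v) for v in eeg_window)
        if peak > 100.0:
            return "eyebrow_raise"
        if peak > 50.0:
            return "look_right"
        return None

    robot = StubRobot()
    for window in ([120.0, -80.0], [60.0, 40.0], [-55.0, 30.0]):
        event = classify(window)
        if event is not None:
            robot.execute(PREDEFINED_MOTIONS[event])
    print("joints:", robot.joints, "selected joint:", robot.active)

Running the script steps through three simulated EEG windows: the strong burst selects the next joint, and the two weaker bursts nudge it, mirroring the select-then-move control Angel describes.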
A SCHOOL FOR ROBOTS
Angel Garcia is not alone in developing new ways of manoeuvring robots; teaching robots is currently a major focus at NTNU. In the robotics hall, Signe Moe is guiding a robot by moving her arms, while researcher Ingrid Schjolberg is using a new training programme to try to get her three-fingered robot to grasp objects in new ways.
“Everyone knows about industrial robots used on production lines to pick up and assemble parts”, says Schjolberg. “They are pre-programmed and relatively inflexible, carrying out repeated, identical movements with specialised graspers adapted to the parts in question.”
“We can see that industries encounter major problems every time a new part is brought in and has to be handled on the production line”, she says.
“The replacement of graspers and the robot’s guidance programme is a complex process, and we want to make this simpler”, she says. “We want to be able to programme robots more intuitively, and not just in the traditional way, using a panel with buttons pressed by an operator.”
Signe Moe’s task has thus been to find out how a robot can be trained to imitate human movements. She has solved this with a system in which she guides the robot using a Kinect camera of the type used in gaming technology.
“The Kinect camera has built-in algorithms which can trace the movements of my hand”, says Moe. “All we have to do is to transpose these data to define the position we want the robot to assume, and set up a link between the sensors in the camera and the robot”, she explains.
“In this way the robot receives a reference command, and an in-built regulator then computes how it can achieve the movement and how much electricity the motor requires to carry it out”, says Moe.
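Moe’s description maps onto a classic sense-map-regulate loop. The following Python sketch assumes a simple proportional regulator and stubs out the Kinect’s skeleton tracking; the function names, scaling factors and gain are all assumptions for illustration, not details from NTNU’s system.

    # One cycle of an imitation loop: read a hand position (stubbed),
    # map it into the robot's workspace as a reference, and let a
    # proportional regulator command a velocity toward that reference.

    def hand_position_from_kinect():
        """Stub: real code would read the tracked hand joint from the
        Kinect SDK or a library such as OpenNI."""
        return (0.4, 0.1, 0.9)  # metres, in camera coordinates

    def to_robot_reference(hand_pos, scale=0.5, offset=(0.0, 0.0, 0.2)):
        """Transpose camera-frame hand data into a workspace reference."""
        return tuple(scale * p + o for p, o in zip(hand_pos, offset))

    def p_controller(current, reference, gain=2.0):
        """Proportional regulator: velocity proportional to position error."""
        return tuple(gain * (r - c) for c, r in zip(current, reference))

    effector = (0.0, 0.0, 0.5)  # current end-effector position
    reference = to_robot_reference(hand_position_from_kinect())
    velocity = p_controller(effector, reference)
    print("reference:", reference, "commanded velocity:", velocity)

In a real set-up this loop would run continuously, with the regulator translating the position error into motor commands on every cycle.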
Ingrid Schjolberg is demonstrating her three-fingered robotic grasper. Teaching robots new ways of grasping will greatly benefit the manufacturing industry, and this is why researchers are testing out new approaches.
“We are combining sensors in the robotic hand with Kinect images to identify the part which has to be picked up and handled”, says Schjolberg. “In this way the robot can teach itself the best ways of adapting its grasping action”, she says.
“It is trying out different grips in just the same way as we humans do when picking up an unfamiliar object. We’ve developed some criteria for what defines a good and bad grip”, she explains. “The robot is testing out several different grips, and is praised or scolded for each attempt by means of a points score”, smiles Schjolberg.
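The article does not name the learning algorithm, but the praise-and-scold scheme Schjolberg describes can be sketched as a simple multi-armed bandit in Python. The grip types, the scoring stub and the epsilon-greedy selection below are hypothetical stand-ins for the researchers’ actual criteria.

    # Trial-and-error grip selection: each candidate grip keeps a running
    # record of the points it has earned, and the robot mostly exploits
    # the best-scoring grip while occasionally exploring the others.

    import random

    grips = {"pinch": [], "wrap": [], "tripod": []}  # hypothetical grip types

    def score_attempt(grip):
        """Stub for the good/bad-grip criteria: a real system would score
        stability, slippage, contact forces, and so on."""
        base = {"pinch": 0.3, "wrap": 0.7, "tripod": 0.5}[grip]
        return base + random.uniform(-0.2, 0.2)

    def choose_grip(epsilon=0.2):
        """Try every grip once, then mostly exploit the best average score."""
        untried = [g for g in grips if not grips[g]]
        if untried or random.random() < epsilon:
            return random.choice(untried or list(grips))
        return max(grips, key=lambda g: sum(grips[g]) / len(grips[g]))

    for _ in range(50):  # 50 simulated pick-up attempts
        grip = choose_grip()
        grips[grip].append(score_attempt(grip))  # praise or scold via points

    best = max(grips, key=lambda g: sum(grips[g]) / len(grips[g]))
    print("learned best grip:", best)

Over repeated attempts the running averages steer the robot toward the grip that earns the most points, mirroring the trial-and-error behaviour described above.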