Raising robots

In ‘Skat vi skal huske at lære robotten at tale’ I wrote about researchers who had built a small robot, iCub, which was intended to learn language through a learning process, just as humans do: namely through active participation in the use of language. I must admit I did not look into the matter much further than that, but there is actually more of interest about this little robot.

It is not just supposed to learn language. It must also learn to move, learn to grasp objects, and even learn to interact. And one of the major goals of building this little robot is to understand the function in our own brains known as ‘mirror neurons’, that is, how we learn to understand things and develop our own abilities through observation and experience. Six different research centres have been given the task of ‘raising’ the robot and developing its cognition, and thereby its thinking:

The team behind the iCub robot believes it, like children, will learn best from its own experiences.

The technologies developed on the iCub platform – such as grasping, locomotion, interaction, and even language-action association – are of great relevance to further advances in the field of industrial service robotics.

The EU-funded RobotCub project, which designed the iCub, will send one each to six European research labs. Each of the labs proposed winning projects to help train the robots to learn about their surroundings – just as a child would.

The six projects include one from Imperial College London that will explore how ‘mirror neurons’ found in the human brain can be translated into a digital application. ‘Mirror neurons’, discovered in the early 1990s, trigger memories of previous experiences when humans are trying to understand the physical actions of others. A separate team at UPF Barcelona will also work on iCub’s ‘cognitive architecture’.

At the same time, a team headquartered at UPMC in Paris will explore the dynamics needed to achieve full body control for iCub. Meanwhile, researchers at TUM Munich will work on the development of iCub’s manipulation skills. A project team from the University of Lyons will explore internal simulation techniques – something our brains do when planning actions or trying to understand the actions of others.

Over in Turkey, a team based at METU in Ankara will focus almost exclusively on language acquisition and the iCub’s ability to link objects with verbal utterances.
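The kind of language-action association mentioned above can be illustrated with a minimal sketch: count co-occurrences between objects the robot sees and words it hears, and let the strongest association win. All class and method names here are invented for illustration; this is a toy model, not the METU team's actual approach.

```python
from collections import defaultdict

class UtteranceAssociator:
    """Toy co-occurrence learner: links heard words to seen objects."""

    def __init__(self):
        # counts[word][object] = times the word was heard with the object in view
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, visible_object, utterance):
        """Record that an utterance was heard while an object was in view."""
        for word in utterance.lower().split():
            self.counts[word][visible_object] += 1

    def guess_object(self, word):
        """Return the object most strongly associated with a word, if any."""
        candidates = self.counts.get(word.lower())
        if not candidates:
            return None
        return max(candidates, key=candidates.get)

learner = UtteranceAssociator()
learner.observe("ball", "look at the ball")
learner.observe("ball", "red ball")
learner.observe("cup", "here is the cup")
print(learner.guess_object("ball"))  # → ball
```

Even this crude statistic picks out the right object once a word has been heard in its presence more often than anywhere else, which is the basic intuition behind grounding words in perception.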

The process is meant to unfold in much the same way as our own human development and upbringing: the robot observes or ‘senses’ others grasping something, talking, walking, or interacting, and thereby learns how to do these things itself.

The iCub robots are about the size of three-year-old children, with highly dexterous hands and fully articulated heads and eyes. They have hearing and touch capabilities and are designed to be able to crawl on all fours and to sit up.

Humans develop their abilities to understand and interact with the world around them through their experiences. As small children, we learn by doing and we understand the actions of others by comparing their actions to our previous experience.

The developers of iCub want to develop their robots’ cognitive capabilities by mimicking that process. Researchers from the EU-funded RobotCub project designed the iCub’s hardware and software using a modular system. The design increases the efficiency of the robot, and also allows researchers to more easily update individual components. The modular design also allows large numbers of researchers to work independently on separate aspects of the robot.
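The modular approach described above can be sketched as small, swappable components behind a shared interface (the module names and data flow here are assumptions for illustration, not the actual iCub software architecture):

```python
class Module:
    """Base interface every component implements, so parts can be swapped."""
    def step(self, state):
        raise NotImplementedError

class VisionModule(Module):
    def step(self, state):
        # Pretend to detect an object position from a camera frame.
        return {"object_at": state.get("frame", (0, 0))}

class ArmModule(Module):
    def step(self, state):
        # Move the hand toward the reported object position.
        return {"hand_at": state.get("object_at", (0, 0))}

def run_pipeline(modules, inputs):
    """Run independent modules in sequence, merging their outputs."""
    state = dict(inputs)
    for m in modules:
        state.update(m.step(state))
    return state

state = run_pipeline([VisionModule(), ArmModule()], {"frame": (3, 5)})
print(state["hand_at"])  # → (3, 5)
```

Because each module only reads and writes the shared state through one narrow interface, a lab can replace, say, the vision component without touching the arm controller, which is the property the modular design is after.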

And just as humans develop in small steps while growing up, the researchers likewise anticipate small steps for iCub:

But the first and key skill iCub needs for learning by doing is an ability to reach towards a fixed point. By October this year, the iCub developers plan to develop the robot so it is able to analyse the information it receives via its vision and feel ‘senses’. The robot will then be able to use this information to perform at least some crude grasping behaviour – reaching outwards and closing its fingers around an object.
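The reach-then-close behaviour described above can be sketched as a simple sensor-driven loop (a toy model: the coordinates, threshold, and function names are assumptions, not the actual iCub control code):

```python
def crude_grasp(hand_pos, target_pos, touch_sensor, step=1.0):
    """Reach toward a fixed target, then close the fingers on contact.

    hand_pos/target_pos are (x, y) tuples; touch_sensor(pos) returns True
    when the palm feels the object at pos.
    """
    x, y = hand_pos
    tx, ty = target_pos
    # Reach: move the hand toward the target until it is close enough.
    while abs(tx - x) > 0.5 or abs(ty - y) > 0.5:
        x += step if tx > x else -step if tx < x else 0
        y += step if ty > y else -step if ty < y else 0
    # Grasp: close the fingers only if the touch sense confirms contact.
    if touch_sensor((x, y)):
        return "grasped"
    return "missed"

result = crude_grasp((0, 0), (4, 3), touch_sensor=lambda pos: True)
print(result)  # → grasped
```

The point of the sketch is the fusion Metta describes: vision supplies the target, proprioception drives the reach, and touch decides whether closing the fingers counts as a grasp.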

“Grasping is the first step in developing cognition as it is required to learn how to use tools and to understand that if you interact with an object it has consequences,” says Giorgio Metta. “From there the robot can develop more complex behaviours as it learns that particular objects are best manipulated in certain ways.”

Developing these systems not only makes it possible to build robots that could be of great use to society; in Korea and Japan the first robots are already in use in health care and elderly care, and for that matter in many other industries. It will also be of great value for understanding our own consciousness, our thinking, and what goes on in the various processes of the brain. The nightmare scenarios for the development of artificial intelligence might be HAL 9000 from ‘2001: A Space Odyssey’ or perhaps Marvin, the Paranoid Android, from The Hitchhiker’s Guide to the Galaxy.

Source:
ICT Results (2008, April 26). Next Step In Robot Development Is Child’s Play. ScienceDaily. Retrieved April 27, 2008, from: http://www.sciencedaily.com/releases/2008/04/080421162240.htm

Robots in use in Japan, Korea, and Norway:
http://news.bbc.co.uk/1/hi/sci/tech/1829021.stm
http://www.msnbc.msn.com/id/23438322/
http://www.msnbc.msn.com/id/21773646/
http://www.reuters.com/article/scienceNews/idUSL0719538520080207

~ by sorensvendsen on April 27, 2008.
