78.2 - Winter 2004 | News
Using Robots to End AI: Artificial Inference
By Hannah Yoon
Nico, an upper-torso humanoid robot, is designed to socially interact with children. (Credit: Andrew Lovett)
Andrew Lovett ’03 and Assistant Professor of Computer Science Brian Scassellati, recipients of the Best Paper Award for “Using a Robot to Reexamine Looking Time Experiments” at the third International Conference on Development and Learning (ICDL) in San Diego, CA, used robotic experiments to validate an alternative theory of why infants spend longer times looking at certain objects.
The researchers showed that infants' longer looking times at certain stimuli need not stem from a conscious decision to attend to a surprising event. Nico, the model robot, was instrumental in this demonstration, allowing the researchers to trace the computerized visual processing that occurred in Nico while the robot faced the same constraints and cues as actual infants.
Sixteen computers control Nico, which observes the world through four small cameras that provide both wide and narrow fields of view. Nico's detection of color, touch, and motion is incorporated into the data used to direct its social behavior.
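As a rough illustration of how a looking-time experiment can be modeled computationally, consider a simple habituation process: a novel stimulus starts out highly salient and holds attention, and repeated exposure makes it familiar and shortens subsequent looks. The Python sketch below is hypothetical, with illustrative names and parameters; it is not Nico's actual software.

```python
# Hypothetical habituation-based looking-time model (illustrative only;
# the parameters and structure are assumptions, not Nico's implementation).

def looking_time(stimulus, memory, decay=0.5, threshold=0.1):
    """Return how many time steps the model 'looks' before losing interest."""
    familiarity = memory.get(stimulus, 0.0)
    salience = 1.0 - familiarity       # novel stimuli start highly salient
    steps = 0
    while salience > threshold:
        steps += 1
        salience *= decay              # interest decays with continued exposure
    # Exposure makes the stimulus more familiar for future trials.
    memory[stimulus] = min(1.0, familiarity + 0.5)
    return steps

memory = {}
first_look = looking_time("red_ball", memory)    # novel stimulus
second_look = looking_time("red_ball", memory)   # now partly familiar
```

Under this toy model, the first presentation yields a longer look than the second, mirroring the habituation pattern that looking-time studies measure, without any appeal to a conscious decision by the infant.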
Scientists can test cognitive theories or inferences by observing how feedback from the real world interacts with a robot's cognitive operations. Unlike a human experimenter, Nico can also be controlled to interact objectively with real children, allowing scientists to observe how different children respond to Nico's behavior. Lovett observed that many researchers at the ICDL shared a profound interest in creating models of the human cognitive process, in hopes of better understanding human brain development. "The ideal final product," he said, "would be a humanoid robot capable of learning everything a human knows, and thus reaching human-level cognitive capabilities."
Copyright 2014 Yale Scientific Publications, Inc.