Of Machines and Men


Working to narrow the intelligence gap between men and machines, researchers are gaining a new perspective on the species after which they’re modeling their creations.


RoboCanes Andreas Seekircher and Justin Stoecker, B.S. ’09, program virtual soccer bots. “The vision,” says their faculty advisor, Ubbo Visser, “is to beat the human soccer world champion with a team of fully autonomous humanoid robots by the year 2050.”

Imagine a robot that can look into a mother’s eyes and learn like an infant. An android with the intellect and agility to compete on a real soccer team. Or an automaton that can navigate the perilous depths of the sea solo. The quest for artificial intelligence has long been a holy grail of science. The desire to create machines that look, think, and act human—even superhuman—has been the driving force behind a vast body of research. At the University of Miami, researchers and students from a wide range of fields are working on robotics projects that seek to accomplish these goals and more. And as they work to build better robots, they find they are unraveling the mysteries of human nature itself.

Building a Tot Bot

Although the bond between mother and baby may be pure intuition, understanding that connection in a scientific way is much trickier. For University of Miami researchers, unlocking that mystery is the key to building a baby robot that thinks and moves like a real infant.


Data captured at the University of Miami from babies like 3-month-old Audrey, wearing a motion capture suit and gazing at her mom, are helping scientists to program a “baby” robot.

Aided by a $350,000 National Science Foundation grant, Daniel Messinger, associate professor of psychology in the UM College of Arts and Sciences, and a team of graduate students are trying to understand how babies learn intricate social and motor skills well enough to apply those lessons to a robot. Their findings from live interactions between mothers and babies will help a team of computer scientists from the University of California, San Diego (UCSD) program Diego-San, a four-foot, 66-pound humanoid baby robot capable of learning social skills and mimicking human expressions.

“Infants do very complicated things,” says Messinger, “like figure out who other people are, how to pay attention to them, reach for objects, learn to walk, smile at other people. We have very little idea how we can get a robot to develop those kinds of skills. But infants seem to be able to handle those problems by the time they are a year old, in a very robust manner. So the question is, how are they doing it? And how can we learn about how they’re doing it by designing something that can do something similar?”

One day, says Messinger, the baby robot technology could be used to help diagnose children with autism or to interact with other children or the elderly.

The first phase of research, led by Messinger, focused on face-to-face interactions between 13 mothers and their babies, ages 1 to 6 months, as they played together during five-minute intervals. According to the study, babies and mothers develop a pattern to their play that becomes more predictable as the babies get older.

“One of the things we learned is that there is an increase in smile turn-taking as the baby develops,” explains Messinger, who has secondary appointments at the Miller School of Medicine and the College of Engineering. “When baby smiles, mom tends to smile. When mom smiles, baby can either smile or look away, and what they do changes with age. Babies become more predictable in spending less time looking at their moms.”
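To see what such smile “turn-taking” means computationally, consider a minimal sketch (not the lab’s actual analysis) that simply tallies how often one joint mother-infant state follows another in a coded play session. The state labels and toy data below are invented for illustration only.

```python
# Hypothetical sketch: estimating smile "turn-taking" transition probabilities
# from a coded mother-infant session. The states and data are invented for
# illustration; this is not the lab's actual analysis pipeline.
from collections import Counter, defaultdict

# Each entry is the dyad's joint state in one coding interval:
# (infant_smiling, mother_smiling)
coded_session = [
    (False, False), (False, True), (True, True),
    (True, True), (False, True), (False, False),
]

def transition_probabilities(states):
    """Count how often each joint state follows each other joint state."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(states, states[1:]):
        counts[prev][nxt] += 1
    return {
        prev: {nxt: n / sum(nxts.values()) for nxt, n in nxts.items()}
        for prev, nxts in counts.items()
    }

probs = transition_probabilities(coded_session)
# e.g. probs[(False, True)] describes what the infant tends to do next
# when the mother is smiling alone; tracking how these probabilities
# shift with age is one way to quantify growing predictability.
print(probs)
```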

The researchers are also conducting movement studies. “Solving the question of how the robot moves is akin to solving the question of how babies move,” he says.

To better understand infant motor skills, UM graduate assistants Whitney Mattson and Juan Artigas are using a custom-made baby motion capture suit to study how human infants interact with their mothers.

In a small studio on campus one recent afternoon, Mattson and Artigas gingerly dress 3-month-old Audrey Landoll in a specially designed onesie fitted with wires and about 40 small markers that light up. As little Audrey and her mom engage in playful activities, ten infrared motion-capture cameras track the baby’s every move. Tiny video cameras placed inside her hat and in a headband worn by her mom record the session from both of their viewpoints.

“We’re looking at when the baby gazes at her mom, and we’re looking at how the baby reaches,” explains Artigas as he watches the session from a video screen outside the studio.

The studies are critical to scientists at UCSD’s Machine Perception Laboratory who are building the baby robot in conjunction with Kokoro Dreams. Diego-San’s head has 40 “tendons” that make facial expressions, an audio speaker for a mouth, high-definition cameras for eyes, and an accelerometer for ears. It also has 47 joints in the body, and its hands can hold a water bottle.

Javier Movellan, the project’s principal investigator at UCSD, says Diego-San is being designed to recognize facial expressions, as well as to learn to point and reach for objects, smile, and recognize its caregiver. One of the most challenging tasks has been to create a face that’s realistic but not “creepy,” explains Movellan, because robots that look too much like humans can make people feel uncomfortable.

Movellan says Messinger’s studies at UM are helping him program the robot to move and think more like a human baby and “giving us ideas about what our questions should be.”

Messinger’s research, notes Movellan, indicates that babies move their entire bodies and smile when they want something their mother is holding and may do so because they are trying to influence the mother to move the object herself.

“So instead of programming our robot to reach,” Movellan says, “we are going to program the robot to learn to control the object—and we hope in the presence of human caregivers the robot would smile and coo and move its leg because that is also a way to move the object.”
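A minimal sketch can illustrate that idea of rewarding outcomes rather than scripted reaching. The function below is hypothetical, not Diego-San’s real software: it simply rewards the robot whenever the tracked object moves, however the robot made that happen.

```python
# Hypothetical sketch of an outcome-based reward: the robot is rewarded when
# the object it "wants" actually moves, not for executing a scripted reach.
# All names and numbers here are illustrative, not Diego-San's real software.
import numpy as np

def object_movement_reward(obj_pos_before, obj_pos_after, threshold=0.01):
    """Return 1.0 if the tracked object moved more than `threshold` meters."""
    displacement = np.linalg.norm(np.asarray(obj_pos_after) -
                                  np.asarray(obj_pos_before))
    return 1.0 if displacement > threshold else 0.0

# Under this reward, reaching, smiling at the caregiver, or kicking a leg are
# all equally valid strategies if they end up moving the object.
print(object_movement_reward([0.30, 0.10, 0.50], [0.30, 0.13, 0.50]))  # -> 1.0
```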

Ultimately, the data could be used to help mothers better understand their babies.

“These studies make real the enormity of what it’s like to become a person,” says Messinger.

Underwater Intelligence

A robot doesn’t have to be humanoid, but it does have to be able to perform some relatively complex tasks autonomously. One robot being researched by Shahriar Negahdaripour, a professor in the UM College of Engineering’s Department of Electrical and Computer Engineering and director of its Underwater Vision and Imaging Lab (UVIL), is an electrically powered underwater Remotely Operated Vehicle, or ROV. Based on technology developed at UM, it can be programmed to explore, survey, and map an undersea world independent of a human operator. The roughly 100-pound, two-and-a-half-by-two-foot ROV, explains Negahdaripour, resembles the type of equipment used to document the sunken Titanic in a televised National Geographic expedition. (Much larger and heavier versions help install and repair offshore oil rigs.)


This ROV, made for engineering professor Shahriar Negahdaripour, is the Ferrari of robots. Its stereovision sonar, echo sounder, and other subsea capabilities allow it to “see” in the dark and capture acoustic imagery of events such as fish spawning.

The group’s first ROV enabled a team of marine biology and geology researchers directed by Pamela Reid, Ph.D. ’85, associate professor at the Rosenstiel School of Marine and Atmospheric Science, to collect a large volume of reef video without having to employ human divers. Novel computer programs the two groups developed collaboratively were used to create spatially accurate mosaics of the reefs. These computerized mapping and analysis tools, which allowed them to assess damage to a reef community within the National Oceanic and Atmospheric Administration’s Florida Keys National Marine Sanctuary during the 2005 hurricane season, are used to study and monitor the health of a coral community as well as the seasonal and long-term impacts of external factors. “Our team won the 2009 Project of the Year Award from the Department of Defense’s Strategic Environmental Research and Development Program, which sponsored the project,” notes Negahdaripour.
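Building a spatially accurate mosaic from overlapping video frames typically means matching visual features between frames and warping one frame into the other’s coordinates. The sketch below, written with the open-source OpenCV library, is a generic illustration of that idea, not the UM and Rosenstiel teams’ actual software.

```python
# Generic sketch of feature-based frame stitching, the basic idea behind
# video mosaicking (not the UM/RSMAS teams' actual software).
import cv2
import numpy as np

def stitch_pair(frame_a, frame_b):
    """Warp frame_b into frame_a's coordinates using matched ORB features."""
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)

    # Match descriptors and keep the strongest correspondences
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_b, des_a), key=lambda m: m.distance)[:200]

    src = np.float32([kp_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Paste the reference frame over the warped frame to form a small mosaic
    h, w = frame_a.shape[:2]
    mosaic = cv2.warpPerspective(frame_b, H, (w * 2, h))
    mosaic[0:h, 0:w] = frame_a
    return mosaic
```

Repeating this pairwise alignment along a video sequence, frame by frame, is what gradually builds a large, spatially consistent map of the reef.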

While traditional ROVs are operated much like remote-control cars, Negahdaripour’s research is aimed at developing fully or highly automated ROVs. His team has already developed a multi-camera technology for the inspection of ship hulls for the U.S. Navy, UVIL’s largest research sponsor. Originally designed to find explosives, the technology also has applications in narcotics detection and regular maintenance of ships, bridges, dams, and offshore structures, thereby keeping human divers out of potentially hazardous conditions such as highly polluted waters.

He is now working on integrating information from optical and sonar imaging systems and other sensory capacities. “You might have good [underwater] visibility here in Florida,” he explains, “but if you go to many other U.S. ports and harbors, you may often have next-to-zero visibility. So we are working on capabilities based on sonar imaging because it has the ability to penetrate turbid and muddy waters.”
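One simple way to combine an optical and a sonar estimate of the same quantity is to weight each by how much it can be trusted, with the optical weight collapsing as visibility drops. The sketch below is purely illustrative; the noise figures are invented and do not reflect UVIL’s actual sensor models.

```python
# Illustrative sketch of fusing an optical and a sonar range estimate by
# inverse-variance weighting; the visibility-dependent noise figures are
# invented, not UVIL's actual sensor models.
def fuse_range(optical_range, sonar_range, optical_var, sonar_var):
    """Weight each measurement by the inverse of its variance."""
    w_opt = 1.0 / optical_var
    w_son = 1.0 / sonar_var
    fused = (w_opt * optical_range + w_son * sonar_range) / (w_opt + w_son)
    fused_var = 1.0 / (w_opt + w_son)
    return fused, fused_var

# In clear Florida water the camera estimate dominates...
print(fuse_range(2.00, 2.10, optical_var=0.01, sonar_var=0.09))
# ...in a turbid harbor the optical variance balloons and sonar takes over.
print(fuse_range(2.00, 2.10, optical_var=1.00, sonar_var=0.09))
```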

These novel technologies will be implemented in a newer ROV, a custom-built, electric-powered model from Teledyne Benthos equipped with a megapixel stereovision camera system and 2-D and 3-D sonar systems. This ROV could descend to about 500 feet, limited mainly by the length of its umbilical cable, which contains the electrical wires and fiber optics needed for power as well as for optical and sonar video transmissions. Its acoustic positioning system would enable it to be tracked as it conducts tasks ranging from seafloor mapping to underwater pipeline inspection to shallow-water bridge or ship inspection.


Professor Negahdaripour, left; students Reza Babaee and Murat Aykin, standing; and Darren Moss, of Teledyne Benthos, discuss the ROV’s pool performance in preparation for a breakthrough fish-tracking mission with a team of marine scientists.

In his lab, Negahdaripour and his students are developing software to automate data acquisition and online processing. They conduct experiments on the optical and sonar technology in a 6-foot-deep water tank strewn with sand, coral, and other objects intended to mimic the seafloor. Ph.D. candidate Murat Aykin, visiting M.S. candidate Reza Babaee, from the Technical University of Munich, and Gulliver Preparatory School intern Shayanth Sinnarajah help collect and analyze the data recorded by a computer connected to underwater cameras.

“We are developing new technologies that have real application for harsh environments where optics have limited use,” Aykin explains.

By applying computer vision technologies, adds Negahdaripour, “We are advancing the sonar technology that’s been around for decades, most notably used by the military. No one else in the world is currently doing the research we are doing.”

Quick-Thinking Kickers

At UM’s College of Arts and Sciences, Ph.D. candidates Andreas Seekircher, Saminda Abeyruwan, and Justin Stoecker, B.S. ’09, are programming virtual robot soccer players to compete with other teams around the world.

“You can easily spend eight hours a day on this,” says Seekircher, as he and his fellow grad students sit in front of a flat-screen television one afternoon and watch their computer-generated robots play soccer.

As entertaining as the game playing may be, it could one day lead to models for real-life intelligent humanoid robots. The students spend hours developing algorithms in an effort to program the virtual robots to act autonomously. “There are a lot of unsolved problems,” Seekircher admits. “You can’t just look in a book [for answers].”

The students are part of a team called the UM RoboCanes, one of two American groups that competed in last year’s RoboCup, an international robotics competition that seeks to advance artificial intelligence through soccer competitions featuring both virtual and real robots.

The team’s faculty advisor is Ubbo Visser, a research associate professor in the college’s Department of Computer Science who competed with teams at RoboCup for nearly a decade before coming to UM from Germany’s Universität Bremen in 2008. He conducts research in artificial intelligence and gaming, particularly complex sports games.

Visser insists soccer is an ideal game for developing and testing robots because the rules are clear, and researchers must program the robots, whether virtual or real, to make quick decisions, communicate with other players, and essentially behave like human beings.
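In the simulation league, that boils down to a decision loop each agent runs every cycle: read the world state, pick one action, repeat. The sketch below is an invented, much-simplified example of such a loop, not the RoboCanes code; its rules and thresholds are purely illustrative.

```python
# Minimal sketch of the kind of real-time decision loop a simulated soccer
# agent runs each cycle; the rules and thresholds are invented and are not
# the RoboCanes team's actual software.
import math

def decide(agent_pos, ball_pos, teammate_positions, kick_range=0.5):
    """Pick one action per simulation cycle based on the current world state."""
    dist_to_ball = math.dist(agent_pos, ball_pos)
    if dist_to_ball <= kick_range:
        # Close enough to act on the ball: pass to a better-placed teammate
        # (here, anyone farther upfield), otherwise shoot.
        open_mates = [p for p in teammate_positions if p[0] > agent_pos[0]]
        if open_mates:
            return ("pass", min(open_mates, key=lambda p: math.dist(p, ball_pos)))
        return ("kick_to_goal", None)
    # Otherwise, simply chase the ball.
    return ("walk_to", ball_pos)

print(decide((0.0, 0.0), (0.3, 0.1), [(2.0, 1.0), (-1.0, 0.5)]))
```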

Ultimately, Visser hopes to apply his students’ programming systems for the virtual robots to ten real robots they plan to purchase next year, pending funding. The technology used on the robots, in turn, has the potential for many other practical applications, he says, not unlike the resources that have emerged from RoboCup Rescue, a similar competition that has led to the creation of lifesaving rescue robot technology used in real-life disasters.

“Soccer is a test bed,” Visser says. “We’re not trying to develop an artificial soccer player. We would like to understand what kind of technologies we need when it comes to real-time, dynamic situations—such as a system in a car that helps a driver make decisions about whether to change a lane or to hit the brakes.”

If the technology works, Visser adds, it could be advanced in other fields such as space exploration. “If you have multiple robots on Mars, what do you do with them?” he asks. “They need to act together and make decisions [like the soccer players]. The problems are similar.”

Last year the RoboCanes made it to the RoboCup quarterfinals. This April, after the European Championships in Germany, they won the IranOpen, the largest such event in Asia. They hope to further their record at the 2011 RoboCup world championship in Istanbul (July 5 to 11).

One lesson the students have already taken from the experience: People can learn more about themselves from machines.

“Can a robot have emotions?” Stoecker says. “This all comes back to how we understand ourselves. Whether you’re a scientist or an art teacher, anyone can appreciate how our creations reflect how we perceive ourselves.”

(From Miami Magazine)

