Researchers teach 'Second Life' avatar to think
From Yahoo.com By MICHAEL HILL, Associated Press Writer
TROY, N.Y. - Edd Hifeng barely merits a second glance in "Second Life." A steel-gray robot with lanky limbs and linebacker shoulders, he looks like a typical avatar in the popular virtual world.
But Edd is different.
His actions are animated not by a person at a keyboard but by a computer. Edd is a creation of artificial intelligence, or AI, by researchers at Rensselaer Polytechnic Institute, who endowed him with a limited ability to converse and reason. It turns out "Second Life" is more than a place where pixelated avatars chat, interact and fly about. It's also a frontier in AI research because it's a controllable environment where testing intelligent creations is easier.
"It's a very inexpensive way to test out our technologies right now," said Selmer Bringsjord, director of the Rensselaer Artificial Intelligence and Reasoning Laboratory.
Bringsjord sees Edd as a forerunner to more sophisticated creations that could interact with people inside three-dimensional projections of settings like subway stops or city streets. He said the holographic illusions could be used to train emergency workers or solve mysteries.
But first, a virtual reality check.
Edd is not running rampant through the cyber streets of "Second Life." He goes only where Bringsjord and his graduate students place him for tests. He can answer questions like "Where are you from?" but understands only English that has previously been translated into mathematical logic.
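The pipeline Bringsjord describes — English rendered into mathematical logic before the agent can reason over it — might be sketched very loosely as follows. The predicate names, the tiny knowledge base, and the hand-written translation table are all invented for illustration; the actual RPI system is far more sophisticated.

```python
# Hypothetical sketch of the "English -> logic" step the article describes.
# All predicate names and facts are invented for illustration only.

# Knowledge base: ground facts in a simple predicate-logic style.
KB = {("from", "edd", "rensselaer")}

# Hand-written translations from fixed English questions to logical queries;
# None marks the unknown slot the query is asking about.
TRANSLATIONS = {
    "where are you from?": ("from", "edd", None),
}

def answer(question):
    query = TRANSLATIONS.get(question.lower())
    if query is None:
        return "I do not understand."   # English with no logical translation fails
    pred, subj, _ = query
    for fact in KB:
        if fact[0] == pred and fact[1] == subj:
            return fact[2]              # fill the unknown slot from the KB
    return "unknown"

print(answer("Where are you from?"))            # -> rensselaer
print(answer("What is your favorite color?"))   # -> I do not understand.
```

The point of the sketch is the limitation the article notes: anything outside the pre-translated vocabulary simply cannot be processed.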
"Second Life" is attractive to researchers in part because virtual reality is less messy than plain-old reality. Researchers don't have to worry about wind, rain or coffee spills.
And virtual worlds can push along AI research without forcing scientists to solve the most difficult problems — like, say, creating a virtual human — right away, said Michael Mateas, a computer science professor at the University of California at Santa Cruz.
Researching in virtual realities has become increasingly popular over the past couple of years, said Mateas, leader of the school's Expressive Intelligence Studio for AI and gaming.
"It's a fantastic sweet spot — not too simple, not too complicated, high cultural value," he said.
Bringsjord is careful to point out that the computations for Edd's mental feats have been done on workstations and are not sapping "Second Life" servers. The calculations will soon be performed on a supercomputer at Rensselaer with support from research co-sponsor IBM Corp.
Operators of "Second Life" don't seem concerned about synthetic agents lurking in their world. John Lester, operations manager for Linden Lab, said the San Francisco-based company sees a "fascinating" opportunity for AI to evolve.
"I think the real future for this is when people take these AI-controlled avatars and let them free in 'Second Life,'" Lester said, " ... let them randomly walk the grid."
That is years off by most experts' estimations. Edd's most sophisticated cognitive feat so far — played out in "Second Life" and posted on the Web — involves him witnessing a gun being switched from one briefcase to another. Edd was able to infer that another "Second Life" character who left the room during the switch would incorrectly think the gun was still in the first briefcase.
This ability to make inferences about the thoughts of others is significant for an AI agent, though it puts Edd on par with a 4-year-old — and the calculus required "under the hood" to achieve this feat is mind-numbingly complex.
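The essence of this false-belief test can be shown with a toy model: each agent keeps its own picture of the world, and only agents present in the room observe a change. The names and the dictionary-based belief tracking are invented for illustration; the RPI system works in formal logic, not this simplification.

```python
# Toy false-belief model of the "gun moved between briefcases" scenario.
# Agent names and the belief-tracking scheme are illustrative assumptions.

world = {"gun": "briefcase_1"}           # actual state of the world
beliefs = {
    "witness": dict(world),              # both agents see the initial state
    "edd": dict(world),
}

present = {"witness": False, "edd": True}   # the witness has left the room

def move(item, destination):
    """Update the world; only agents still present observe the change."""
    world[item] = destination
    for agent, in_room in present.items():
        if in_room:
            beliefs[agent][item] = destination

move("gun", "briefcase_2")

# Edd's belief matches reality; the absent witness holds a false belief,
# which is exactly what Edd must infer about the other character.
print(beliefs["edd"]["gun"])       # briefcase_2
print(beliefs["witness"]["gun"])   # briefcase_1
```

Passing this kind of test is the classic marker of theory-of-mind reasoning, which children typically acquire around age four — hence the comparison in the article.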
A computer program smart enough to fool someone into thinking they're interacting with another person — the traditional Turing test for AI researchers — has been elusive. One huge problem is getting computers to understand concepts imparted in language, said Jeremy Bailenson, director of the Virtual Human Interaction Lab at Stanford University.
AI agents do best in tightly controlled environments: Think of automated phone programs that recognize your responses when you say "operator" or "repair."
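The phone-menu style of constrained AI the article mentions amounts to matching a caller's utterance against a small fixed vocabulary. A minimal sketch (menu entries and responses invented for illustration):

```python
# Keyword spotting over a controlled vocabulary, the style of narrow AI
# the article contrasts with open-ended conversation. Entries are invented.
MENU = {
    "operator": "Connecting you to an operator.",
    "repair": "Transferring to the repair department.",
}

def route(utterance):
    for keyword, response in MENU.items():
        if keyword in utterance.lower():
            return response
    return "Sorry, I didn't catch that."   # anything off-menu fails

print(route("I need a repair, please"))   # Transferring to the repair department.
print(route("Tell me a joke"))            # Sorry, I didn't catch that.
```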
Bringsjord sees "Second Life" as a way station. He eventually wants to create other environments where more sophisticated creations could display courage or deceive people, which would be the first step in developing technology to detect deception.
The avatars could be projected at RPI's $145 million Experimental Media and Performing Arts Center, opening in October, which will include spaces for holographic projections. Officials call them "holodecks" in homage to the virtual reality room on the "Star Trek" television series.
That sort of visual fidelity is many years down the line, just like complex AI. John Kolb, RPI's chief information officer, said the best three-dimensional effects still require viewers to wear special light-polarizing glasses.
"If you want to do texture mapping on a wall for instance, that's easy. We can do that today," Kolb said. "If you want to start to build cognitive abilities into avatars, well, that's going to take a bit more work."