
NASA buys life-like humanoid as tour guide

A life-like robot, which speaks more than a dozen languages and has a pawky sense
of humour, has been bought by NASA to become a robotic tour guide.

The robot, named RoboThespian, is powered by compressed air and has sinews of aluminium. It was built by Engineered Arts, a British firm based in Cornwall with only seven members of staff, reports the Daily Mail. Unlike its more masculine counterpart, NASA's Robonaut 2 (R2), which will be sent to the International Space Station later this year, RoboThespian will remain firmly earth-bound.

The RoboThespian comes in three variations - the Lite, Standard and Deluxe
versions. It can also be hired for events.

On the firm's website, the Deluxe version comes in at 79,500 pounds, which gives
access to all of the robot's more advanced features such as powered legs and
customisable content.

NASA is to pay 70,520 pounds for the robot to guide visitors around its Kennedy
Space Centre at Cape Canaveral.

RoboThespian comes with standard greetings and impressions, to which you can add
your own recorded sequences or bespoke content. RoboThespian made its official
debut at the Association of Science-Technology Centres conference in Los Angeles
in 2007 where it performed in a series of film clips and interacted with the audience.
It comes with a touchscreen interface that lets users pre-programme a series of
movements.
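
The article does not say how these movement sequences are specified, so the sketch below is purely illustrative: it assumes a pre-programmed routine can be represented as timed keyframes of named joint targets. The joint names and the play_sequence helper are hypothetical stand-ins, not Engineered Arts' actual control interface.

```python
import time

# Hypothetical sketch only: RoboThespian's real programming interface is not
# described in the article. A pre-programmed sequence is modelled here as
# timed keyframes, each giving target positions (0.0-1.0) for named joints.
SEQUENCE = [
    {"t": 0.0, "joints": {"head_yaw": 0.5, "arm_right": 0.0}},
    {"t": 1.5, "joints": {"head_yaw": 0.2, "arm_right": 0.8}},  # raise arm to wave
    {"t": 3.0, "joints": {"head_yaw": 0.5, "arm_right": 0.0}},  # return to rest
]

def play_sequence(sequence, send_command=print):
    """Step through keyframes in order, waiting until each one's timestamp."""
    start = time.monotonic()
    for frame in sequence:
        # Sleep until this keyframe is due.
        delay = frame["t"] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        # A real controller would drive the pneumatic actuators here;
        # this sketch just prints the commands it would send.
        for joint, target in frame["joints"].items():
            send_command(f"{joint} -> {target:.2f}")

if __name__ == "__main__":
    play_sequence(SEQUENCE)
```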

Researchers Build Bionic Eye That Could Give You Superhuman Vision

Movie characters from the Terminator to the Bionic Woman use bionic eyes to zoom in on far-off scenes, have useful facts pop into their field of view, or create virtual crosshairs. Off the screen, virtual displays have been proposed for more practical purposes: visual aids to help vision-impaired people, holographic driving control panels and even a way to surf the Web on the go.

The device to make this happen may be familiar. Engineers at the University of Washington have for the first time used manufacturing techniques at microscopic scales to combine a flexible, biologically safe contact lens with an imprinted electronic circuit and lights.

“Looking through a completed lens, you would see what the display is generating superimposed on the world outside,” said Babak Parviz, a UW assistant professor of electrical engineering. “This is a very small step toward that goal, but I think it’s extremely promising.”

The results were presented today at the Institute of Electrical and Electronics Engineers’ international conference on Micro Electro Mechanical Systems by Harvey Ho, a former graduate student of Parviz’s now working at Sandia National Laboratories in Livermore, Calif. Other co-authors are Ehsan Saeedi and Samuel Kim in the UW’s electrical engineering department and Tueng Shen in the UW Medical Center’s ophthalmology department.

There are many possible uses for virtual displays. Drivers or pilots could see a vehicle’s speed projected onto the windshield. Video-game companies could use the contact lenses to completely immerse players in a virtual world without restricting their range of motion. And for communications, people on the go could surf the Internet on a midair virtual display screen that only they would be able to see.

The prototype device contains an electric circuit as well as red light-emitting diodes for a display, though it does not yet light up. The lenses were tested on rabbits for up to 20 minutes and the animals showed no adverse effects. A full-fledged display won’t be available for a while, but a version with just a few pixels could be operational “fairly quickly”.

Scientists Extract Images Directly From Brain

Researchers at Japan’s ATR Computational Neuroscience Laboratories have
developed a system that can “reconstruct the images inside a person’s mind and
display them on a computer monitor.”

According to the researchers, further development of the technology may soon make it possible to view other people’s dreams while they sleep.

The scientists were able to reconstruct various images viewed by a person by
analyzing changes in their cerebral blood flow. Using a functional magnetic
resonance imaging (fMRI) machine, the researchers first mapped the blood flow
changes that occurred in the cerebral visual cortex as subjects viewed various
images held in front of their eyes. Subjects were shown 400 random 10 x 10 pixel
black-and-white images for a period of 12 seconds each. While the fMRI machine
monitored the changes in brain activity, a computer crunched the data and learned
to associate the various changes in brain activity with the different image designs.

Then, when the test subjects were shown a completely new set of images, such as
the letters N-E-U-R-O-N, the system was able to reconstruct and display what the
test subjects were viewing based solely on their brain activity.
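
As a rough illustration of the two-stage decoding idea described above (learn an association between brain-activity patterns and known 10 x 10 pixel images during training, then map a new activity pattern back to pixels), here is a minimal sketch. It is not the ATR team's actual method, which is more sophisticated; it assumes a plain linear ridge-regression decoder and uses synthetic data in place of real fMRI recordings.

```python
import numpy as np

# Illustrative sketch only: a linear ridge-regression decoder trained on
# synthetic "voxel responses" standing in for real fMRI measurements.
rng = np.random.default_rng(0)

n_train = 400      # 400 training images, as in the article
n_pixels = 100     # 10 x 10 pixel black-and-white images
n_voxels = 200     # hypothetical number of visual-cortex voxels

# Training stimuli: random binary 10 x 10 images (flattened to 100 values).
train_images = rng.integers(0, 2, size=(n_train, n_pixels)).astype(float)

# Simulated fMRI responses: an unknown linear encoding of the image plus noise.
true_encoding = rng.normal(size=(n_pixels, n_voxels))
train_voxels = train_images @ true_encoding + 0.5 * rng.normal(size=(n_train, n_voxels))

# Training phase: fit one ridge-regression decoder per pixel (closed form).
lam = 10.0
gram = train_voxels.T @ train_voxels + lam * np.eye(n_voxels)
decoder = np.linalg.solve(gram, train_voxels.T @ train_images)  # (n_voxels, n_pixels)

# Test phase: "show" a new image, simulate its voxel response, and reconstruct
# the image from the brain activity alone.
test_image = rng.integers(0, 2, size=(1, n_pixels)).astype(float)
test_voxels = test_image @ true_encoding + 0.5 * rng.normal(size=(1, n_voxels))
reconstruction = (test_voxels @ decoder > 0.5).astype(int).reshape(10, 10)

print("viewed:\n", test_image.reshape(10, 10).astype(int))
print("decoded:\n", reconstruction)
```

The only point of the sketch is the structure: a training phase that associates activity patterns with known images, followed by a test phase that inverts that mapping for images the decoder has never seen.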

The researchers suggest a future version of this technology could be applied in the
fields of art and design — particularly if it becomes possible to quickly and
accurately access images existing inside an artist’s head. The technology might also
lead to new treatments for conditions such as psychiatric disorders involving
hallucinations, by providing doctors a direct window into the mind of the patient.

ATR chief researcher Yukiyasu Kamitani says, “This technology can also be applied
to senses other than vision. In the future, it may also become possible to read
feelings and complicated emotional states.”
