
The MIT Intelligent Wheelchair Project
Developing a voice-commandable robotic wheelchair


[Video: MIT Tech TV]

Vision: Designing Smart Wheelchairs of the Future


New engineering developments offer opportunities to build smart wheelchair assistive technology that can improve the lives of many people who use wheelchairs. In our work, we are designing tomorrow's intelligent wheelchairs: a voice-commandable wheelchair that is aware of its surroundings so that it can assist its user in a variety of tasks. The goal of this smart wheelchair project is to enhance an ordinary powered wheelchair with sensors to perceive the wheelchair's surroundings, a speech interface to interpret commands, a wireless device for room-level location determination, and motor-control software to effect the wheelchair's motion.

The robotic wheelchair learns the layout of its environment (hospital, rehabilitation center, home, etc.) through a narrated, guided tour given by the user or the user's caregivers. Subsequently, the wheelchair can move to any previously named location under voice command (e.g., "Take me to the cafeteria"). This technology is appropriate for people who have lost mobility due to brain injury or the loss of limbs, but who retain speech. It can also enhance safety for users of ordinary joystick-controlled powered wheelchairs by preventing collisions with walls, fixed objects, furniture, and other people.

We envision that a voice-commandable wheelchair could improve the quality of life and safety of tens of thousands of users. Moreover, considerable health improvements and cost savings could accrue through the reduction or elimination of collision-induced injuries such as wounds and broken limbs.

We are currently working closely with The Boston Home, a specialized-care residence for adults with multiple sclerosis and other progressive neurological conditions. Our efforts are inspired and motivated by the insights, feedback, and needs of The Boston Home's residents, staff, and family members.

Our team of faculty, students, and researchers comes from several departments (Aeronautics and Astronautics; Electrical Engineering and Computer Science; Engineering Systems Division) and laboratories (the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the MIT AgeLab) across MIT. Our work on this intelligent wheelchair spans multiple domains, including robotics, artificial intelligence, machine learning, human-computer interaction and user interface design, speech recognition systems, and the role of technology for people with disabilities and older adults.
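To make the tour-then-command workflow above concrete, here is a minimal, hypothetical Python sketch (not the project's actual software). It assumes a simple Pose record, a TourMemory store that associates spoken place names with poses captured during the guided tour, and a send_goal callback standing in for the navigation and motor-control layer; the keyword-spotting command handler is only a placeholder for a real speech-understanding pipeline.

# Hypothetical sketch of the narrated-guided-tour idea described above;
# not the project's actual code. During a tour, the chair records its pose
# under each place name the guide speaks; later, a command such as
# "Take me to the cafeteria" is resolved to the stored pose and handed to
# the navigation / motor-control layer as a goal.

from dataclasses import dataclass
from typing import Callable, Dict, Optional


@dataclass
class Pose:
    x: float        # metres, in the chair's map frame
    y: float
    heading: float  # radians


class TourMemory:
    """Maps spoken place names to poses recorded during a guided tour."""

    def __init__(self) -> None:
        self._places: Dict[str, Pose] = {}

    def label_current_place(self, name: str, pose: Pose) -> None:
        # Called when the tour guide says, e.g., "This is the cafeteria."
        self._places[name.lower()] = pose

    def resolve(self, name: str) -> Optional[Pose]:
        return self._places.get(name.lower())


def handle_command(utterance: str, memory: TourMemory,
                   send_goal: Callable[[Pose], None]) -> str:
    """Keyword-spotting stand-in for a real speech-understanding pipeline."""
    text = utterance.lower().strip().rstrip(".!?")
    prefix = "take me to the "
    if text.startswith(prefix):
        place = text[len(prefix):]
        goal = memory.resolve(place)
        if goal is None:
            return "I don't know where the {} is yet.".format(place)
        send_goal(goal)  # hand off to the navigation / motor-control layer
        return "Okay, heading to the {}.".format(place)
    return "Sorry, I didn't understand that."


if __name__ == "__main__":
    memory = TourMemory()
    # Tour phase: record where the chair was when the guide named the place.
    memory.label_current_place("cafeteria", Pose(12.3, 4.5, 1.57))
    # Command phase: the spoken command becomes a navigation goal.
    print(handle_command("Take me to the cafeteria", memory,
                         send_goal=lambda pose: None))

In the deployed system this lookup-and-go step would of course sit on top of the mapping, localization, and obstacle-avoidance components described above; the sketch only illustrates how named places learned during the tour can later anchor voice commands.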

Additional Files
Demonstration video of narrated guided tour capability
Presentation at The Boston Home Open House, April 12, 2010 (PDF)

Selected Publications
Hemachandra, S., Kollar, T., Roy, N., Teller, S., "Following and Interpreting Narrated Guided Tours," International Conference on Robotics and Automation (ICRA), Shanghai, China, 2011 (PDF)
Park, J., Charrow, B., Curtis, D., Battat, J., Minkov, E., Hicks, J., Teller, S., Ledlie, J., "Growing an Indoor Localization System," MobiSys 2010 (PDF)
Kollar, T., Tellex, S., Roy, D., Roy, N., "Toward Understanding Natural Language Directions," HRI 2010 (PDF)
Doshi, F., Roy, N., "Spoken Language Interaction with Model Uncertainty: An Adaptive Human-Robot Interaction System," Connection Science, November 2008 (PDF)
Doshi, F., Roy, N., "Efficient Model Learning for Dialog Management," HRI 2007 (PDF)

Selected Media Coverage


Wheelchairs that listen, The Boston Globe, April 26, 2010
MIT adds robotics, voice control to wheelchair, Mass High Tech, September 19, 2008

Associates
Sponsors: Microsoft, Nokia. We gratefully acknowledge the support of Intel's University Program Office and Intel's Software & Services Group.
Interested Parties: The Boston Home, United States Department of Veterans Affairs.
Related Groups at MIT: OIL/BMG (team members), RVSN, CSAIL.

Team Members


Nicholas Roy

Seth Teller

Bryan Reimer

Yoni Battat

Finale Doshi

Sachithra Hemachandra

William Li

Javier Velez

Collaborators
Jim Glass, Director, MIT Spoken Language Systems Group
Don Fredette, Adaptive Technology Coordinator, The Boston Home

Past Team Members


Michael Mason (Visiting Scientist), Queensland University of Technology, Australia
Mish Madsen
Justin Colt
Minh Phan
Alex Lesman

Email us about this project: tbh @ csail dot mit dot edu.

