
The Robust Robotics Group, led by Professors
Nick Roy and Seth Teller at the MIT Computer
Science and Artificial Intelligence Laboratory,
is developing a voice commandable autonomous
wheelchair.
This video shows a research prototype
under development since 2005.
The long-term goals of the project
are to develop intelligent assistant technology that
can work with a variety of people,
and the wheelchair is one example
of this kind of technology.
Our target population consists of people
who have limited physical mobility and limited
physical control.
By adding a voice commandable interface and intelligent
autonomy to a person's wheelchair,
the chair can then be used by people
who may not have the necessary level of physical ability
to control the wheelchair using a standard interface,
such as a joystick.
We are developing a range of capabilities
from simply augmenting existing wheelchairs
to developing a fully autonomous powered wheelchair.
We are collaborating with the Boston Home, a specialized care
residence for adults with progressive neurological
diseases.
Working with TBH, we plan to develop wheelchairs
that can be used by more TBH residents
without assistance from a caregiver,
increasing the independence and quality of life
of the residents.
Our prototype is a commodity powered wheelchair
to which we have added a number of components, such as a low-power
laser range scanner to detect obstacles and provide
fine-grained position tracking for autonomous control.
We have added a microphone to listen
to the wheelchair user and speakers
to respond using speech.
On this wheelchair, we have added a computer
for sensory processing and intelligent decision making.
When the wheelchair does not need
to provide fully autonomous navigation and control,
we can deploy a small PDA tablet instead of a full sensor
suite and computer.
The smaller tablet can easily be added to existing wheelchairs,
but still allows us to provide a voice interaction
so that the wheelchair can give information to its user,
for example, about daily activities.
The tablet also provides Wi-Fi connectivity
that allows it to estimate the wheelchair's
position at a room level and report the estimated
position to caregivers,
so that nurses can more easily stay aware
of the wheelchair user's health and needs.
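A minimal sketch of how room-level Wi-Fi localization can work, by fingerprinting: compare the access-point signal strengths the tablet currently observes to stored per-room readings and pick the closest match. The access-point names and RSSI values below are invented for illustration, not taken from the project.

# Toy per-room Wi-Fi fingerprints: access point -> typical RSSI (dBm).
fingerprints = {
    "room 101": {"ap1": -40, "ap2": -70},
    "dining":   {"ap1": -75, "ap2": -45},
}

def locate(observed):
    """Return the room whose fingerprint best matches the observed RSSIs."""
    def distance(room):
        ref = fingerprints[room]
        aps = set(ref) | set(observed)
        # Treat a missing access point as a very weak reading (-95 dBm).
        return sum((ref.get(a, -95) - observed.get(a, -95)) ** 2 for a in aps)
    return min(fingerprints, key=distance)

print(locate({"ap1": -42, "ap2": -68}))   # -> 'room 101'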
The onboard computer or tablet can process speech and respond
via speech to user queries.
In the future, we plan to use the tablet as an alternate way
to interact with wheelchair users
for populations who have difficulty with speech.
For fully autonomous navigation and control,
the wheelchair can use a laser range scanner
to detect obstacles and estimate its position in the map.
This Hokuyo laser range scanner sends out
a plane of infrared light, which it
uses to measure the distance to obstacles around it.
By comparing the positions of obstacles around the wheelchair
to the positions of obstacles in a map,
the wheelchair can estimate its position in the map
as it moves around.
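One standard way to implement this matching is a particle filter. The sketch below, with a toy map and invented sensor parameters, weights random pose hypotheses by how well a simulated scan from each pose explains the observed scan; it illustrates the general technique, not the project's own code.

import math
import random

# Toy occupancy grid: 1 = obstacle, 0 = free (10x10 map, 1 m cells).
MAP = [[1 if x in (0, 9) or y in (0, 9) else 0 for x in range(10)]
       for y in range(10)]

def raycast(x, y, theta, max_range=8.0, step=0.1):
    """Distance from (x, y) along heading theta to the nearest obstacle."""
    d = 0.0
    while d < max_range:
        d += step
        gx, gy = int(x + d * math.cos(theta)), int(y + d * math.sin(theta))
        if not (0 <= gx < 10 and 0 <= gy < 10) or MAP[gy][gx]:
            return d
    return max_range

def scan_likelihood(pose, scan, angles, sigma=0.3):
    """How well would a scan simulated from this pose explain the real one?"""
    x, y, th = pose
    w = 1.0
    for a, r in zip(angles, scan):
        expected = raycast(x, y, th + a)
        w *= math.exp(-((r - expected) ** 2) / (2 * sigma ** 2))
    return w

# Reweight random pose hypotheses by the observed scan and keep the best.
angles = [i * math.pi / 8 for i in range(16)]
true_pose = (5.0, 5.0, 0.0)
scan = [raycast(*true_pose[:2], true_pose[2] + a) for a in angles]

particles = [(random.uniform(1, 9), random.uniform(1, 9), 0.0)
             for _ in range(200)]
weights = [scan_likelihood(p, scan, angles) for p in particles]
best = particles[weights.index(max(weights))]
print("estimated pose ~", best)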
The wheelchair can use the same laser sensor
to explore new environments and build new maps.
Even if an object is not in the map,
the sensor allows the wheelchair to detect the new obstacle
and replan around it.
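A minimal sketch of that detect-and-replan loop, using a toy occupancy grid and an A* planner as an illustrative stand-in for whatever planner the chair actually uses:

import heapq

def astar(grid, start, goal):
    """A* over a 4-connected grid; returns a list of cells or None."""
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, g, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < len(grid[0]) and 0 <= ny < len(grid) \
                    and not grid[ny][nx] and (nx, ny) not in seen:
                heapq.heappush(frontier,
                               (g + 1 + h((nx, ny)), g + 1, (nx, ny),
                                path + [(nx, ny)]))
    return None

grid = [[0] * 6 for _ in range(6)]
path = astar(grid, (0, 0), (5, 5))
# The laser reveals an unmapped obstacle on the route: mark it and replan.
grid[2][2] = 1
if (2, 2) in path:
    path = astar(grid, (0, 0), (5, 5))
print(path)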
The technologies for autonomous navigation and control
are not new, although they have not been widely deployed
on autonomous wheelchairs.
Our project has a number of innovative aspects
that are designed to make autonomous navigation
and control more easily deployed and adopted
by the general public.
For example, we are building on well-known robot mapping
algorithms that can learn the floor plan of environments
and developing algorithms that allow
a wheelchair to learn new environments
and build maps from a narrative guided tour.
The wheelchair can take instructions
from a human caregiver to learn the names of places in the map.
This natural language understanding
will make it much easier for caregivers
to deploy the wheelchair in new environments
and make it easier for users to use the wheelchair.
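In its simplest hypothetical form, the tour learning binds a spoken place name to the pose where the phrase was heard. The phrase pattern and pose source below are assumptions for illustration, not the project's actual language understanding:

import re

class TourMap:
    def __init__(self):
        self.places = {}            # place name -> (x, y) in the learned map

    def hear(self, utterance, pose):
        """Attach a named place to the pose where the phrase was spoken."""
        m = re.search(r"this is (?:the |my )?(.+)", utterance.lower())
        if m:
            self.places[m.group(1).strip(". ")] = pose

    def lookup(self, name):
        return self.places.get(name.lower())

tour = TourMap()
tour.hear("This is the kitchen.", (12.4, 3.1))
tour.hear("This is my room.", (2.0, 8.5))
print(tour.lookup("kitchen"))       # -> (12.4, 3.1)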
Once the chair understands the user's desired destination,
it navigates a route through the previously learned map
to that intended destination.
"Go down the hallway past the door and along the railing,
until you get to the refrigerator
in the kitchen on your right."
We are also developing algorithms
that will allow the robot to be guided
through unfamiliar environments.
For example, the robot can follow simple spoken
directions to an unlabeled location in the map
by reasoning about how people use everyday objects
as landmarks for navigation.
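As a hedged illustration of that landmark reasoning (the landmark list, the parsed instruction, and the scoring objective are all invented for this sketch), one could score candidate goal locations by how well they satisfy the objects mentioned in the instruction:

import math

# Detected everyday objects with their map coordinates (illustrative).
landmarks = {"refrigerator": (14.0, 3.0), "door": (6.0, 3.0)}
candidates = [(x, y) for x in range(0, 16, 2) for y in range(0, 8, 2)]

def score(goal, mentioned):
    """Closer to every mentioned landmark = better (toy objective)."""
    return -sum(math.dist(goal, landmarks[m]) for m in mentioned)

mentioned = ["refrigerator"]        # parsed from the spoken instruction
goal = max(candidates, key=lambda c: score(c, mentioned))
print("navigate toward", goal)      # the cell nearest the refrigerator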
For a user with difficulty communicating,
the chair can learn the user's behavior and mobility patterns
over time in order to improve ease of use.
For example, when the user has a daily routine,
the chair can anticipate the user's desired destinations
and plan routes there proactively,
reducing the user's burden of fully specifying each destination.
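A speculative sketch of that routine learning, with an invented trip log and threshold: count where the user goes at each hour of the day, and suggest the most frequent destination only once the pattern is well supported.

from collections import Counter, defaultdict

history = defaultdict(Counter)      # hour of day -> Counter of destinations

def record_trip(hour, destination):
    history[hour][destination] += 1

def suggest(hour, min_count=3):
    """Offer a destination only when the routine is well established."""
    if history[hour]:
        dest, n = history[hour].most_common(1)[0]
        if n >= min_count:
            return dest
    return None

for _ in range(5):
    record_trip(12, "dining room")  # a daily lunchtime trip
record_trip(12, "lounge")
print(suggest(12))                  # -> 'dining room'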
As we add more wheelchairs with knowledge of where they are,
we can give residents of the Boston Home
the ability to find each other.
The Intelligent Wheelchair Project.