Bryan Reimer, research scientist, MIT AgeLab,
Associate Director of the New England University Transportation Center:
So we're interested in trying to understand
how people work with the different levels of automation
in the car and how stress and arousal change as people
are parking with what is really their first experience
with some autonomous feature in the car.
On the dashboard here we have eye tracking, facial recordings
for emotional coding, cameras here recording the forward
scene, and cameras behind us recording what is going on
in the vehicle itself.
In the car here, we have a fairly robust PC
with a lot of FireWire ports to receive the camera video.
Physiology recording is done through this box right
here: heart rate, skin conductance. All
of this is time synchronized together
to provide an understanding of how
the driver behaves in real-life driving
situations on the roads around Boston.
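Time synchronizing streams that arrive at different rates (30 Hz video, a few Hz of physiology) usually comes down to aligning samples on a shared clock. A minimal sketch of that idea, assuming all streams carry timestamps from one clock; this is an illustration, not the AgeLab's actual pipeline, and the sample rates and values below are invented:

```python
import bisect

def nearest_sample(timestamps, t):
    """Return the index of the sample whose timestamp is closest to t.

    Assumes timestamps is sorted ascending (a recorded stream).
    """
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Pick whichever neighbor is closer in time.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Hypothetical streams on a shared clock (seconds):
# 10 s of video frames at 30 Hz, heart-rate samples at 4 Hz.
frame_times = [k / 30.0 for k in range(300)]
hr_times = [k / 4.0 for k in range(40)]
hr_values = [60 + k % 5 for k in range(40)]  # placeholder readings

# For each video frame, look up the nearest-in-time heart-rate sample,
# giving one physiology value per frame for joint analysis.
aligned = [hr_values[nearest_sample(hr_times, t)] for t in frame_times]
print(len(aligned))  # 300: one heart-rate value per frame
```

Nearest-neighbor lookup is the simplest alignment policy; a real rig would also have to correct for clock drift between recording devices.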
As we're approaching here, I'm going to push the button,
tell the system to search for a parking spot.
[TURN SIGNAL CLICKING]
It'll tell me when it's found a spot.
It tells me to pull forward to an appropriate spot,
and then it tells me to put the car in reverse
and take my hands off.
The active park assist system takes over the steering,
and I'm still responsible for the gas pedal and the brake
as it steers me into the parking spot here.
It appears I'm approaching the car behind me.
It's telling me to pull forward now.
You hear the front sensors chime; I back up a little more.
It's telling me to pull forward again.
And it's telling me the park is finished up here.
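The maneuver described above (search for a spot, pull forward, reverse with hands off the wheel, shuffle until aligned) can be sketched as a small state machine. This is a hypothetical illustration of the interaction flow, not any manufacturer's actual park-assist logic; the state and event names are invented:

```python
from enum import Enum, auto

class State(Enum):
    SEARCHING = auto()     # driver pressed the button; system scans for a spot
    PULL_FORWARD = auto()  # spot found; driver pulls alongside it
    REVERSING = auto()     # system steers; driver keeps gas and brake
    ADJUSTING = auto()     # shuffle forward/back, cued by proximity chimes
    DONE = auto()          # system announces the park is finished

# Hypothetical (state, event) -> next-state table for the maneuver.
TRANSITIONS = {
    (State.SEARCHING, "spot_found"): State.PULL_FORWARD,
    (State.PULL_FORWARD, "reverse_engaged"): State.REVERSING,
    (State.REVERSING, "proximity_chime"): State.ADJUSTING,
    (State.ADJUSTING, "still_misaligned"): State.ADJUSTING,
    (State.ADJUSTING, "aligned"): State.DONE,
}

def step(state, event):
    """Advance the maneuver; unrecognized events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

# Walk the sequence narrated in the transcript.
s = State.SEARCHING
for event in ["spot_found", "reverse_engaged", "proximity_chime",
              "still_misaligned", "aligned"]:
    s = step(s, event)
print(s)  # State.DONE
```

Note that throughout the sequence only the steering is automated; the driver retains gas and brake, which matches the division of responsibility described in the transcript.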