In this video, we demonstrate that it's
possible to analyze cardiac pulse from regular videos
by extracting the imperceptible motions of the head caused
by blood flow.
Recent work has enabled the extraction of pulse
from videos based on color changes in the skin
due to blood circulation.
If you've seen someone blush, you
know that pumping blood to the face
can produce a color change.
In contrast, our approach leverages
a perhaps more surprising effect.
The inflow of blood doesn't just change the skin's color.
It also causes the head to move.
This movement is too small to be visible with the naked eye,
but we can use video amplification to reveal it.
Believe it or not, we all move like bobbleheads
with different motions at our heart rate,
but at a much smaller amplitude than this.
Now, you might wonder what causes the head
to move like this.
In each cardiac cycle, the heart's left ventricle
contracts and ejects blood at high speed into the aorta.
During the cycle, roughly 12 grams of blood
flow from the aorta to the head via the carotid arteries
on either side of the neck.
It is this influx of blood that generates a force on the head.
Due to Newton's third law, the force of the blood on the head
equals the force of the head acting on the blood,
causing a reactionary, cyclical head movement.
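For a rough sense of scale, here is a back-of-envelope
estimate of our own: the only figure from the video is
the 12 grams of blood per beat, and the carotid flow
speed and timing below are assumed typical values. If
each beat accelerates about 12 grams of blood to roughly
0.5 m/s over about a tenth of a second, the reaction
force on the head is on the order of

```latex
F \approx \frac{m\,v}{\Delta t}
  \approx \frac{0.012\,\mathrm{kg} \times 0.5\,\mathrm{m/s}}{0.1\,\mathrm{s}}
  \approx 0.06\,\mathrm{N}
```

Acting on a head of a few kilograms, a force this small
displaces it by only a fraction of a millimeter per beat,
consistent with the motion being invisible to the naked eye.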
To demonstrate this process, we built a toy model
using a transparent mannequin head, where rubber tubes stand
for simplified arteries.
Instead of pumping blood, we will pump compressed air
provided by this air tank.
And I can release the air using this valve.
Now, watch what happens as I open
and close the valve once a second,
similar to a normal heart rate.
Ready?
Here.
This motion is fairly similar to the amplified motion
of real heads that we've seen before.
We exploit this effect to develop
a technique that can analyze pulse
in regular videos of a person's head.
Our method takes an input video of a stationary person
and returns a one-dimensional signal corresponding
to the head motions.
From this signal, we can extract an average pulse rate,
as well as beat locations for deeper clinical analysis.
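To make the steps that follow concrete, here is a hedged
end-to-end sketch in Python of how the stages might
compose. Every function name here is hypothetical and is
sketched in the blocks below; this is not the authors'
actual code.

```python
# Hypothetical composition of the pipeline stages sketched below.
def pulse_from_video(video_path, fps):
    traj = track_head_points(video_path)              # feature tracking
    filtered = bandpass_y_trajectories(traj, fps)     # temporal filter
    signal, bpm = choose_pulse_signal(filtered, fps)  # PCA + selection
    times, ibis = beat_locations(signal, fps, bpm)    # peak detection
    return bpm, times, ibis
```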
We begin by locating the face using a face detector
and selecting feature points within the area.
The feature points are tracked from frame
to frame of the video using the Lucas-Kanade tracking
algorithm.
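As a concrete illustration of this tracking stage, here is
a minimal Python sketch using OpenCV. The video does not
name an implementation, so the Haar-cascade detector, the
tracker parameters, and the helper name track_head_points
are all assumptions made for illustration.

```python
import cv2
import numpy as np

def track_head_points(video_path):
    """Detect a face, select feature points inside it, and track
    them frame to frame with Lucas-Kanade optical flow."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Locate the face with a stock Haar cascade detector.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    x, y, w, h = detector.detectMultiScale(gray, 1.3, 5)[0]

    # Select good features to track inside the face region.
    mask = np.zeros_like(gray)
    mask[y:y + h, x:x + w] = 255
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=100,
                                  qualityLevel=0.01, minDistance=5,
                                  mask=mask)

    trajectories = [pts.reshape(-1, 2)]
    prev = gray
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Lucas-Kanade tracking from the previous frame to this one.
        # A fuller implementation would drop points whose status is 0.
        pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None)
        trajectories.append(pts.reshape(-1, 2))
        prev = gray
    cap.release()
    # Shape (num_frames, num_points, 2): x/y positions over time.
    return np.stack(trajectories)
```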
We use the vertical, or y component,
of each of the feature point trajectories for our analysis.
Next, we temporally filter the signals
through a passband encompassing the normal pulse
range, while excluding extraneous motions
like respiration.
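A sketch of this filtering step follows, assuming a
Butterworth bandpass; the exact filter and passband edges
(here 0.75 to 2 Hz, i.e. 45 to 120 beats per minute) are
our assumptions, not stated in the video.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_y_trajectories(trajectories, fps, low_hz=0.75, high_hz=2.0):
    """Keep the vertical (y) component of each trajectory and
    bandpass it, rejecting slower motions such as respiration."""
    y = trajectories[:, :, 1]          # (frames, points)
    y = y - y.mean(axis=0)             # remove each point's offset
    b, a = butter(5, [low_hz, high_hz], btype="bandpass", fs=fps)
    return filtfilt(b, a, y, axis=0)   # zero-phase filtering
```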
We decompose the multi-dimensional motion
of the head described by the trajectories
into submotions using Principal Component Analysis, or PCA.
PCA returns the main directions along which the head moves.
We project the motion of the head onto each component
and choose the signal with the clearest dominant frequency.
We use the dominant frequency to obtain an average pulse rate.
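Here is a sketch of the decomposition and selection. The
video does not define "clearest dominant frequency," so we
score each projection by how concentrated its power
spectrum is at a single frequency; that criterion is an
assumption.

```python
import numpy as np
from sklearn.decomposition import PCA

def choose_pulse_signal(filtered_y, fps, n_components=5):
    """Decompose head motion into submotions with PCA, project
    onto each principal direction, and keep the most periodic
    projection along with its pulse rate in beats per minute."""
    pca = PCA(n_components=n_components)
    sources = pca.fit_transform(filtered_y)   # (frames, components)

    freqs = np.fft.rfftfreq(len(sources), d=1.0 / fps)
    best_score, best = -np.inf, None
    for s in sources.T:
        power = np.abs(np.fft.rfft(s)) ** 2
        k = int(np.argmax(power))
        score = power[k] / power.sum()        # spectral concentration
        if score > best_score:
            best_score, best = score, (s, freqs[k])

    signal, dominant_hz = best
    return signal, 60.0 * dominant_hz
```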
Finally, we perform peak detection on the chosen signal
to obtain beat locations for further analysis,
such as heart rate variability.
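A minimal sketch of the beat extraction, using scipy's
peak detector; the minimum peak spacing (half the average
beat period) is a heuristic of ours, not from the video.

```python
import numpy as np
from scipy.signal import find_peaks

def beat_locations(signal, fps, pulse_bpm):
    """Return beat times in seconds and inter-beat intervals,
    the raw material for heart rate variability analysis."""
    min_gap = max(1, int(0.5 * fps * 60.0 / pulse_bpm))
    peaks, _ = find_peaks(signal, distance=min_gap)
    times = peaks / fps
    return times, np.diff(times)   # beat times, inter-beat intervals
```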
We tested our method on different people, varying
in skin tone and gender, and obtained pulse rates
that closely matched those from an ECG device.
In addition, our method produced beat-length
distributions similar to the ECG's, an exciting result
that shows we can capture more subtle information
about the heart than just an average rate.
Finally, our method is robust to different views of the head.
We obtained a pulse from a sleeping newborn,
from the back of a subject's head,
and even from a person wearing a mask.