The Evolution of Sensor Fusion
By Kaylie Duffy, Associate Editor

Sensor fusion is the process by which data from several different sensors are “fused” to improve application or system performance far beyond what any one sensor could determine alone. Every year, new advances in sensor technology and processing techniques, along with improved hardware, make the real-time fusion of data a reality.

Any application that uses inertial sensors, such as
accelerometers and gyroscopes, magnetic sensors, or
environmental sensors such as pressure sensors, is a
potential fit for sensor fusion.
“Applications like smartphones, connected cars,
wearables, and Internet of Things (IoT) devices are logical
choices for sensor fusion,” explains Karen Lightman,
MEMS & Sensors Industry Group executive director.
“That’s because sensor fusion, by its definition, intelligently
allocates sensor data to the application processor.”
An Evolving Term
Sensor fusion began as a term for inertial navigation,
where several sensors (accelerometer, magnetometer, and
gyroscope) had to be combined to determine position and
attitude, since no single sensor could provide the
necessary information.
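A classic instance of this kind of inertial fusion is the complementary filter, which blends a gyroscope's smooth-but-drifting integration with an accelerometer's noisy-but-absolute gravity reference. The sketch below is illustrative, not from the article; the function name, blend constant, and sample values are assumptions:

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one pitch estimate.

    The gyroscope integrates smoothly but drifts over time; the
    accelerometer's gravity reference is noisy but drift-free. Blending
    the two gives a stable attitude estimate that neither sensor could
    provide alone.
    """
    gyro_pitch = pitch + gyro_rate * dt         # short-term: integrate angular rate
    accel_pitch = math.atan2(accel_x, accel_z)  # long-term: gravity direction
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Hypothetical scenario: device held at a constant 0.1 rad tilt while the
# gyroscope reports a small spurious drift rate.
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=0.001,
                                 accel_x=math.sin(0.1), accel_z=math.cos(0.1),
                                 dt=0.01)
```

The estimate converges toward the true tilt despite the gyro drift, which is exactly why attitude determination needs both sensors.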
The concept then spread to smartphones, when Apple, under
Steve Jobs, first added a gyroscope to the iPhone.
“Suddenly sensor fusion was possible there too,” explains
Kevin Shaw, Algorithmic Intuition founder and CTO.
“However, with the decidedly non-military-grade sensors
in consumer smartphones, it was clearly not possible to
calculate position, so they were used for rotation (technically
called ‘attitude determination’) instead, and soon the term
sensor fusion was used.”
For several years, sensor fusion simply meant calibrating
the sensors and computing rotation/attitude, but in the
past couple of years the term has expanded to include
fusing additional sensors into a given result.
“The old attitude determination code isn’t that interesting
anymore,” says Shaw, “and more sensors are being
combined for new things.”
Now, for example, GPS is fused with barometers (for
elevation) and with accelerometers, gyroscopes, and
magnetometers for indoor positioning and pedestrian
navigation. These may in turn be fused with other sensors
to determine whether a person is indoors or in a car.
“Basically, now more sensors are being used to generate
higher confidence results than could be accomplished with
a single sensor,” explains Shaw.
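Shaw's "higher confidence" point can be made concrete with inverse-variance weighting, a standard way to combine independent noisy estimates. The function name and the GPS/barometer numbers below are illustrative assumptions, not from the article:

```python
def fuse_estimates(estimates):
    """Combine independent noisy estimates by inverse-variance weighting.

    Each input is a (value, variance) pair. The fused variance is smaller
    than any single sensor's variance, which is the 'higher confidence'
    payoff of fusing GPS, barometric, and inertial data.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical readings: GPS altitude is coarse, barometric altitude finer.
gps_alt = (152.0, 25.0)   # metres, variance in m^2
baro_alt = (148.0, 4.0)
alt, var = fuse_estimates([gps_alt, baro_alt])
```

The fused estimate sits closer to the lower-noise barometer, and its variance is below either sensor's alone.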
There is also a trend towards contextual awareness. “It’s
no longer good enough to know if someone has walked 500
steps,” says Shaw. “Now they want to know how long they
were in the car, how many flights of stairs they walked, how
many minutes they spent on the train, whether they played
tennis or golf (or tango), and how many minutes they spent…”
Each piece of information requires a new algorithmic
approach and additional sensors.
Because today’s sensor fusion algorithms are localized,
they tend to be under the control of one company. Even so,
there are significant challenges with defining and measuring
Quality of Results (QoR).
“Part of the problem is that we are trying to measure
subjective items. For instance, what is the difference between
walking and running?” explains Tim Saxe, QuickLogic CTO.
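Saxe's walking-versus-running question can be illustrated with a toy threshold classifier on accelerometer magnitudes. The threshold value and sample windows below are hypothetical, and the ambiguity they expose is precisely the QoR problem he describes:

```python
def classify_gait(accel_magnitudes, threshold=13.0):
    """Label a window of accelerometer magnitude samples (m/s^2).

    Running produces much larger peak accelerations than walking, so a
    simple peak threshold can separate the two. But the boundary itself
    (13 m/s^2 here) is a judgment call: a slow jog and a brisk walk can
    land on either side of any fixed threshold.
    """
    return "running" if max(accel_magnitudes) > threshold else "walking"

walk_window = [9.8, 10.5, 11.2, 10.1, 9.9]  # gentle oscillation around gravity
run_window = [9.8, 16.0, 4.2, 18.5, 9.0]    # large impacts each stride
```

Measuring the quality of such a classifier is hard because there is no objective ground truth for the borderline cases.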
Trying to fuse data from multiple sensors over wide areas
(for example, correlating water flow into a house with
washing-machine activity) will require solving mundane
issues, such as the units used, protocol mismatches, and
data quality.
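The units issue can be sketched as a small normalization layer that converts each source's readings to SI before fusion. The conversion table, function names, and sample readings below are illustrative assumptions:

```python
# Hypothetical adapter layer: before wide-area readings can be fused,
# each source's units must be normalized to a common convention (SI here).
UNIT_TO_SI = {
    "L/min": lambda v: v / 60000.0,     # water flow -> m^3/s
    "gal/min": lambda v: v * 6.309e-5,  # water flow -> m^3/s
    "W": lambda v: v,                   # power draw, already SI
    "kW": lambda v: v * 1000.0,         # power draw -> W
}

def normalize(reading):
    """Convert a (value, unit) reading into SI units, rejecting unknowns."""
    value, unit = reading
    try:
        return UNIT_TO_SI[unit](value)
    except KeyError:
        raise ValueError(f"no conversion registered for unit {unit!r}")

flow_si = normalize((12.0, "L/min"))  # house water flow
power_si = normalize((2.1, "kW"))     # washing-machine power draw
```

Rejecting unknown units explicitly, rather than guessing, is one small answer to the data-quality concern: bad metadata surfaces as an error instead of silently corrupting the fused result.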
Historically, these sorts of issues have been solved by