Each year in the U.S. there are six million car crashes, resulting in nearly $160 billion in costs. These accidents are the number one cause of death for people between the ages of four and 34, and 93% of them are due simply to human error.
“You either need to make a better driver, or take the driver and human error out of the equation altogether,” says Brooke Williams, advanced driver assistance systems (ADAS) business manager at Texas Instruments (TI). “We have the technology that we believe can help reduce the death and crash rates significantly.”
The TDA2x System-on-Chip (SoC) is an ADAS processor that is paving the way toward the autonomous car, taking human error out of the calculation. With a full board solution released in October 2013, TI is moving one step closer to its vision of enabling autonomous automobiles.
“We can cover nearly all of the components that are needed in a lot of these automotive
systems,” says Williams. TI has been quietly making major investments over the past eight years with the vision of
enabling autonomous vehicles. “Everything we are doing today is to support that vision,” he adds.
The three applications required to enable an autonomous car are a front camera, surround view, and sensor fusion.
1. The front camera runs entry-level algorithms and allows for high beam assist, lane departure warnings, traffic sign recognition, and, as the algorithms become more advanced, pedestrian detection and forward collision warning.
2. Rooted in park assist, surround view is a lower-speed application with four cameras (front, rear, and two side) that gives the driver a bird’s-eye view once the images are fused together.
3. A relatively new application, sensor fusion takes preprocessed data
from camera and radar sensors, and fuses them together to enable more
intelligent decisions. This technology comes into play when cars take
control, applying brakes and making steering maneuvers. The application
also supplies the redundancy that automotive manufacturers are looking
for in autonomous cars.
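The fusion step described above can be sketched in miniature. One common approach (not necessarily what TI's silicon implements) is to weight each sensor's estimate by its confidence, so a low-noise radar range dominates a noisier camera-based range. The sensor values and variances below are purely illustrative:

```python
# Hypothetical sensor-fusion sketch: combine a camera-based and a
# radar-based distance estimate by inverse-variance weighting.
# All values and sensor characteristics are illustrative only.

def fuse_estimates(measurements):
    """Fuse (value, variance) pairs into a single (value, variance) estimate.

    Each measurement is weighted by the inverse of its variance, so more
    confident sensors contribute more to the fused result.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return fused_value, 1.0 / total  # fused variance is smaller than any input

# Camera estimates 24.0 m with high uncertainty (variance 4.0);
# radar estimates 25.0 m with low uncertainty (variance 0.25).
fused, variance = fuse_estimates([(24.0, 4.0), (25.0, 0.25)])
```

The fused estimate lands close to the radar's reading, since radar measures range far more precisely than a camera, while the camera still nudges the result. This kind of redundancy is also what lets the system sanity-check one sensor against another before applying brakes or steering.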
“These cameras are fairly complex in nature, so we’ve integrated a lot of I/O interfaces and signal processing cores on a single SoC.”
Advanced Driver Assistance Paves the Way for Autonomous Car
by Melissa Fassbender, Associate Editor
THE BACKSEAT DRIVER: Vision AccelerationPac
The new Vision AccelerationPac, released in conjunction with the new family of devices, is designed to run low- to mid-level vision processing functions. Features include:
• One or more embedded vision engines (EVEs).
• An optimized vector coprocessor.
• A 32-bit programmable RISC core.
This addition allows more advanced driver assistance system algorithms to run simultaneously, offloading the DSP and ARM cores for enhanced performance at lower power.
Williams says the accelerator delivers more than eight times the compute performance of other accelerators: “We believe this will be the industry’s most powerful and flexible accelerator on the market. It’s a perfect complement to our existing DSPs, and works very well on ADAS applications.”
The accelerator further separates itself by offering the flexibility of a programmable accelerator.
“Programmability is critical in ADAS because it allows our customers to differentiate, writing their own features and algorithms to follow industry trends and innovate,” says Williams. “Users will be able to do new things that other fixed-function accelerators can’t enable.”
Front camera, surround view, and sensor fusion
applications enable the autonomous car.