OEM & Lieferant (OEM & Supplier), Issue 2/2019, VEK Publishing

Engineering Partner

Deep learning in driver assistance systems

Accident risk caused by driver distraction: a vision-based CNN system automatically detects mobile phone usage, eating and drinking

By Annika Mahl, Marketing Officer/Public Relations, ARRK Engineering

According to a report by the World Health Organization (WHO), each year about 1.35 million people die in traffic accidents and another 20 to 50 million are injured. One of the main causes is driver inattention. Consequently, many automotive manufacturers offer driver assistance systems that detect tiredness. But it is not just microsleep at the wheel that causes accidents: talking or texting on smartphones and eating or drinking while driving also pose a high risk. Until now, driver assistance systems have been unable to identify these activities. ARRK Engineering has therefore run a series of tests aimed at automatically recognizing and categorizing mobile phone use and eating/drinking. Images were captured with infrared cameras and used to train several Convolutional Neural Network (CNN) systems. This created the basis for a driver assistance system.

For years, the automotive industry has installed systems that warn of driver fatigue. These driver assistants analyze, for example, the driver’s viewing direction and automatically detect deviations from normal driving behavior. “Existing warning systems can only correctly identify specific hazard situations,” reports Benjamin Wagner, Senior Consultant for Driver Assistance Systems at ARRK Engineering. “But during some activities like eating, drinking and phoning, the driver’s viewing direction remains aligned with the road ahead.” For that reason, ARRK Engineering ran a series of tests to identify a range of driver postures so that systems can automatically detect the use of mobile phones and eating or drinking. For the system to correctly identify all types of visual, manual and cognitive distraction, ARRK tested various CNN models with deep learning and trained them with the collected data (a simplified illustration of such a training pipeline is sketched at the end of this article).

Creation of the first image dataset for teaching the systems

In the test setup, two cameras with active infrared lighting were positioned to the left and right of the driver on the A-pillar of a test vehicle. Both cameras operated at a frequency of 30 Hz and delivered 8-bit grayscale images at 1280 x 1024 pixel resolution. “The cameras were also equipped with an IR long-pass filter to block out most of the visible-spectrum light [...]”

Image caption: It is not just fatigue at the wheel that causes accidents. Talking on smartphones or eating while driving also poses a high risk. (Image: © Dusan Petkovic/shutterstock.com)

About ARRK Engineering
ARRK Engineering is part of the international ARRK Group and specializes in all services relating to product development. With expertise in Electronics & Software, CAE & Simulation, Materials, Acoustics, Composites, Car Body, Powertrain, Chassis, Interior & Exterior, Optical Systems, Passive Safety and Thermal Management, the company develops integrated and independent products for its customers, supporting them with many years’ experience as a strategic development partner. Together with its sister companies, ARRK Engineering works to implement product developments, from virtual development through prototypes to small-series production. The ARRK Engineering Division operates worldwide from sites in Germany, Romania, the UK, Japan and China; it is headquartered at P+Z Engineering GmbH in Germany. ARRK Engineering employs over 1,200 staff members.
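Illustrative sketch: the following minimal Python/PyTorch example shows how labelled infrared frames of the kind described above could be loaded and used to train a small CNN classifier for driver activities. The folder layout, class names, network architecture and all parameters are assumptions made purely for demonstration; they do not represent ARRK Engineering's actual models or data.

# Minimal, assumed pipeline: classify driver activities (e.g. phone use,
# eating/drinking, normal driving) from 8-bit grayscale IR frames.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Downscale the 1280 x 1024 camera frames and keep a single grayscale channel.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),
    transforms.Resize((128, 160)),
    transforms.ToTensor(),
])

# Assumed folder layout: data/train/<class_name>/*.png (hypothetical path).
train_set = datasets.ImageFolder("data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# A small CNN: stacked convolution/pooling blocks followed by a classifier head.
class DriverActivityCNN(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 20, 128), nn.ReLU(),  # 128x160 input -> 16x20 after pooling
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = DriverActivityCNN(num_classes=len(train_set.classes))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training epoch over the labelled IR images.
model.train()
for images, labels in train_loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()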
