Development of a Test Method for Verifying a Vehicle Front Camera using External Stimulation

Abstract

Advanced driver assistance systems are increasingly being equipped in vehicles. These functions obtain their required data from the vehicle's environmental sensors (radar, lidar, and camera). The correct functioning of the environmental sensor system is therefore decisive for the high reliability of advanced driver assistance systems. A literature review shows that the vehicle's self-diagnosis is insufficient to guarantee the correct functionality of the environmental sensors. Similarly, these sensors are tested only to a limited extent during the Periodical Technical Inspection, which also occurs too infrequently to ensure their correct functioning.

This work presents a method for testing a vehicle's front camera. The method is based on the input-process-output model. Several traffic scenarios were created in CarMaker to provide simulated input for the front camera. The vehicle's reaction can then be observed as the output via the high beam assist system.

To apply the method, different concepts for stimulating the front camera, activating the high beam assist system, and observing the output were developed in this work. One concept was selected for practical tests based on a utility analysis conducted with experts involved in Periodical Technical Inspections.

The tests have shown that the vehicle camera can be addressed directly through external stimulation and that the vehicle reacts accordingly to the simulated situation. In 10 out of 11 test cases, a correct vehicle reaction was detected. The presented method appears promising, but false positives cannot be completely ruled out. This aspect should be investigated further in future work.


More information

Main author

Robin Langer

Co-Authors

Michael Eckert, Klaus Böhm, Daniel Paula, Hans-Georg Schweiger

Type of media

PDF

Publication type

Lecture

Publication year

2023

Publisher

EVU

Citation

-