Vehicle Interaction Lab

Automated driving and vehicle assistance systems are taking on an increasing number of driving tasks. To explore how people come to terms with these growing levels of automation, Fraunhofer IAO is conducting tests in its Vehicle Interaction Lab.

An important part of the research is dedicated to driving scenarios, with the focus shifting progressively toward automated driving. After all, the more autonomous the vehicle, the more time its driver has for other things. Among the topics the researchers examine are how distracted and sleepy drivers become, when their involvement is necessary, and how to regain their attention as quickly as possible. Other important aspects are how the driving experience changes emotionally, which elements of automation drivers like and feel comfortable with, and how the vehicle interior should be designed accordingly. For this research, technical components such as smartphones, tablets or other hardware can easily be integrated into the driving simulator at Fraunhofer IAO.
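To make the idea of regaining the driver's attention more concrete, the following Python sketch shows one simple way a takeover request could be escalated across alert modalities based on an estimated attention level. It is purely illustrative and not a description of Fraunhofer IAO's actual systems; the thresholds, class names and modalities are assumptions.

```python
from dataclasses import dataclass
from enum import Enum


class AlertLevel(Enum):
    NONE = 0      # driver appears attentive, no action needed
    VISUAL = 1    # show an icon on the instrument display
    AUDITORY = 2  # add a chime via the sound system
    HAPTIC = 3    # add seat vibration as the strongest cue


@dataclass
class DriverState:
    attention: float            # 0.0 (fully distracted) .. 1.0 (fully attentive)
    seconds_to_takeover: float  # time budget until manual control is required


def escalate_takeover_request(state: DriverState) -> AlertLevel:
    """Pick an alert modality from driver attention and remaining time.

    The thresholds below are illustrative placeholders, not validated values.
    """
    if state.attention > 0.8:
        return AlertLevel.NONE
    if state.seconds_to_takeover > 10 and state.attention > 0.5:
        return AlertLevel.VISUAL
    if state.seconds_to_takeover > 5:
        return AlertLevel.AUDITORY
    return AlertLevel.HAPTIC


print(escalate_takeover_request(DriverState(attention=0.3, seconds_to_takeover=4.0)))
# AlertLevel.HAPTIC
```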

Equipment used in the driving simulation lab includes:

  • an eye tracking system,
  • multi-camera surveillance,
  • devices for measuring physiological data and evaluating driving data,
  • an age simulation suit, and
  • a pupillograph that measures driver fatigue (a simple eye-tracking-based fatigue indicator is sketched after this list).
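As an illustration of how eye-tracking data can feed into a fatigue estimate, the sketch below computes PERCLOS, the percentage of time the eyes are more than 80 % closed over a given window, which is a widely used drowsiness indicator. The threshold, window length and sample values are assumptions for illustration, not parameters of the lab's equipment.

```python
from typing import Sequence


def perclos(eye_openness: Sequence[float], closed_threshold: float = 0.2) -> float:
    """PERCLOS: fraction of samples in which the eyes are nearly closed.

    eye_openness     -- per-frame eyelid opening, 1.0 = fully open, 0.0 = fully closed
    closed_threshold -- openness below this value counts as "closed" (assumed 0.2,
                        i.e. more than 80 % closure, a common convention)
    """
    if not eye_openness:
        raise ValueError("need at least one sample")
    closed = sum(1 for value in eye_openness if value < closed_threshold)
    return closed / len(eye_openness)


# Example: 60 seconds of eye-tracking samples at 1 Hz (values are made up).
samples = [0.9] * 50 + [0.1] * 10
print(f"PERCLOS over window: {perclos(samples):.0%}")  # 17%
```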

Depending on the project requirements and focus of the research, different driving simulations and real vehicles are combined with virtual prototypes, human-machine interface concepts and automated driving functions to create true-to-life driving scenarios in all phases of development. Using realistic and functional prototypes from the outset helps the researchers make the right decisions and identify the optimum design options at an early stage.

In the immersive driving simulator, the driver sits behind the wheel of a real vehicle with an angle of vision of 180°. In addition, three front projections simulate the view in the inner and outer rearview mirrors, giving the subject a perceived field of vision close to 360°. To further increase the sense of reality, a sound system spatially reproduces all the acoustic signals from the vehicle and the surrounding environment. The simulator is also equipped with a movement system that generates seat and chassis vibrations and simulates jerking motions caused by braking.

To facilitate research into the human-machine interface, the simulator is fitted with a modular, expandable dashboard featuring a reconfigurable instrument display and a multi-touch screen in the central console. The simulations themselves are run using SILAB software.
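The sketch below is not based on SILAB's actual interface; it merely illustrates, with hypothetical names, how a reconfigurable instrument display could be described as data, so that different HMI variants can be swapped in between test drives.

```python
from dataclasses import dataclass, field


@dataclass
class Widget:
    kind: str                    # e.g. "speedometer", "automation_status", "navigation"
    position: tuple[int, int]    # grid cell (column, row) on the instrument display
    size: tuple[int, int] = (1, 1)


@dataclass
class ClusterLayout:
    name: str
    widgets: list[Widget] = field(default_factory=list)


# Two hypothetical HMI variants that a study could compare back to back.
manual_layout = ClusterLayout(
    name="manual_driving",
    widgets=[
        Widget("speedometer", position=(0, 0), size=(2, 2)),
        Widget("navigation", position=(2, 0)),
    ],
)

automated_layout = ClusterLayout(
    name="automated_driving",
    widgets=[
        Widget("automation_status", position=(0, 0), size=(2, 1)),
        Widget("takeover_timer", position=(0, 1)),
        Widget("media_player", position=(2, 0), size=(1, 2)),
    ],
)

for layout in (manual_layout, automated_layout):
    print(layout.name, [w.kind for w in layout.widgets])
```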
