Embedded Hardware and Software Design of Omnidirectional Autonomous Robotic Platform Suitable for Advanced Driver Assistance Systems Testing with Focus on Modularity and Safety

This paper deals with the problem of using Autonomous Robotic Platforms (ARP) for ADAS (Advanced Driver Assistance Systems) testing in the automotive industry. Several testing approaches are already in development, and lately ARPs have begun to be used more and more widely. The ARP discussed in this paper explores hardware and software design possibilities in the field of embedded systems. The paper first introduces the problem in general, then describes the proposed prototype concept and its principles from the embedded HW and SW point of view. It covers the key features that can be used to innovate these platforms (e.g., modularity, omnidirectional movement, common and non-traditional sensors used for localization, synchronization of multiple platforms and cars together, or safety mechanisms). Finally, possible future development of the project is discussed as well.

Eye-Gesture Analysis for Driver Hazard Awareness

Because road traffic accidents are a major cause of death worldwide, attempts have been made to create Advanced Driver Assistance Systems (ADAS) able to detect vehicle, driver and environmental conditions that are cues for potential accidents. This paper presents continued work on a novel Non-intrusive Intelligent Driver Assistance and Safety System (Ni-DASS) for assessing driver attention and hazard awareness. It uses two onboard CCD cameras – one observing the road and the other observing the driver's face. The windscreen is divided into cells, and analysis of the driver's eye-gaze patterns allows Ni-DASS to determine, using eye-gesture templates, which windscreen cell the driver is focusing on. Intersecting the driver's field of view through the observed windscreen cell with the subsection of the camera's field of view containing a potential hazard allows Ni-DASS to estimate the probability that the driver has actually observed the hazard. Results have shown that the proposed technique measures driver observation accurately enough to be useful in ADAS.
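The abstract's core idea – intersecting the gaze region implied by the observed windscreen cell with the road-camera subsection containing a hazard – can be illustrated with a simple geometric sketch. The abstract does not give the actual projection model, so the code below is only a minimal illustration under an assumed setup: both the windscreen cell and the hazard region are treated as axis-aligned rectangles in a common road-camera coordinate frame, and the fraction of the hazard region covered by the gaze region is used as a crude observation-probability estimate. The function names and the rectangle convention are assumptions, not the paper's actual method.

```python
def rect_overlap_fraction(gaze_rect, hazard_rect):
    """Fraction of hazard_rect covered by gaze_rect.

    Rectangles are (x0, y0, x1, y1) in a shared camera coordinate frame,
    with x0 < x1 and y0 < y1.
    """
    x0 = max(gaze_rect[0], hazard_rect[0])
    y0 = max(gaze_rect[1], hazard_rect[1])
    x1 = min(gaze_rect[2], hazard_rect[2])
    y1 = min(gaze_rect[3], hazard_rect[3])
    if x1 <= x0 or y1 <= y0:
        return 0.0  # no intersection: hazard lies outside the gaze region
    inter_area = (x1 - x0) * (y1 - y0)
    hazard_area = (hazard_rect[2] - hazard_rect[0]) * (hazard_rect[3] - hazard_rect[1])
    return inter_area / hazard_area


def estimate_observation_probability(gaze_rect, hazard_rect):
    """Crude proxy: probability the driver saw the hazard, taken as the
    covered fraction of the hazard region (an illustrative assumption)."""
    return rect_overlap_fraction(gaze_rect, hazard_rect)
```

For example, a hazard fully inside the gaze region yields 1.0, while a disjoint hazard yields 0.0; a real system would replace this geometric proxy with a calibrated gaze model.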

Eye Gesture Analysis with Head Movement for Advanced Driver Assistance Systems

Road traffic accidents are a major cause of death worldwide. In an attempt to reduce accidents, some research efforts have focused on creating Advanced Driver Assistance Systems (ADAS) able to detect vehicle, driver and environmental conditions and to use this information to identify cues for potential accidents. This paper presents continued work on a novel Non-intrusive Intelligent Driver Assistance and Safety System (Ni-DASS) for assessing driver point of regard within vehicles. It uses an on-board CCD camera to observe the driver's face. A template matching approach is used to compare the driver's eye-gaze pattern with a set of eye-gesture templates of the driver looking at different focal points within the vehicle. The windscreen is divided into cells, and comparison of the driver's eye-gaze pattern with templates of the driver's eyes looking at each cell is used to determine the driver's point of regard on the windscreen. Results indicate that the proposed technique could be useful in situations where low-resolution estimates of driver point of regard are adequate – for instance, to allow ADAS to alert the driver if he/she has failed to observe a hazard.
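The template-matching step described above – comparing the driver's current eye appearance against stored templates of the eyes fixating each windscreen cell, and picking the best match – can be sketched as follows. The abstract does not specify the similarity measure, so this sketch assumes zero-mean normalized cross-correlation over equal-size eye patches; the function names, patch representation, and cell labeling are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation between two equal-size patches.

    Returns a score in [-1, 1]; 1.0 means a perfect match (up to
    brightness/contrast offsets).
    """
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0  # a constant patch carries no correlation information
    return float(np.dot(a.ravel(), b.ravel()) / denom)

def classify_gaze_cell(eye_patch, cell_templates):
    """Return the windscreen cell whose template best matches eye_patch.

    cell_templates maps a cell label (e.g. a (row, col) tuple) to a
    stored eye-region template of the same shape as eye_patch.
    """
    scores = {cell: zncc(eye_patch, tpl) for cell, tpl in cell_templates.items()}
    best = max(scores, key=scores.get)
    return best, scores
```

The winning cell is a low-resolution point-of-regard estimate, matching the abstract's observation that coarse windscreen-cell granularity is sufficient for hazard-awareness alerts.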