Igor Ivanecky, Elmer Atagu
supervised by Dr. Dimitri Androutsos
Driver fatigue is a serious issue affecting road safety. Studies by Transport Canada have shown that commercial drivers face a "far greater risk of being involved in a fatigue-related crash" than non-commercial drivers. Moreover, a recent article in The Globe and Mail indicates that sleep deprivation is becoming an epidemic among the general North American population. Currently, monitoring for fatigue is left to the potentially fatigued driver, which is both problematic and ineffective; there is thus a growing need for an objective way to monitor driver alertness. The aim of this project was to use a dashboard-mounted camera with infra-red LEDs to monitor the opening and closing of the driver's eyes. To reduce the cost of the implementation, a smartphone (a device many North Americans already own) was used to handle all processing.
A D-Link Wi-Fi infra-red camera was interfaced with an iPhone so that the driver alertness monitoring system works in any lighting condition, including total darkness. To deliver video footage from the camera to the iPhone, the phone first joins the WLAN to which the camera is connected. The iPhone application then opens a socket connection to the server hosted on the camera. Once the connection is established, the incoming video data stream is parsed to locate the image stream within it; the images are then captured frame by frame and passed to the rest of the program.
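The frame-extraction step above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: it assumes the camera serves a motion-JPEG style stream in which each frame is an ordinary JPEG image delimited by the standard start-of-image (0xFFD8) and end-of-image (0xFFD9) markers, and it simply scans the buffered socket data for complete frames.

```python
# Hypothetical sketch: pulling individual JPEG frames out of a buffered
# MJPEG-style byte stream read from a Wi-Fi camera's socket connection.
# The stream format is an assumption; a real D-Link camera may differ.

SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def extract_frames(buffer: bytes) -> list[bytes]:
    """Return every complete JPEG frame found in the buffered stream data."""
    frames = []
    start = buffer.find(SOI)
    while start != -1:
        end = buffer.find(EOI, start + 2)
        if end == -1:
            break  # frame not yet complete; wait for more socket data
        frames.append(buffer[start:end + 2])
        start = buffer.find(SOI, end + 2)
    return frames
```

In practice the application would append each `socket.recv()` chunk to a rolling buffer, call a routine like this, and discard the consumed bytes, so that partial frames straddling two reads are handled on the next pass.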
The extracted images were then used to compute PERCLOS, the percentage of eyelid closure over the pupil over time. PERCLOS is a widely used physiological marker of human alertness because it corresponds to slow eyelid closures ("droops") rather than blinks. To monitor PERCLOS, digital image processing was carried out by means of the OpenCV library, whose functionality was ported to the iPhone. The driver's eye region is identified by algorithms that exploit natural blinking behaviour in the video sequence. Once identified, the eye region can then be continually tracked and monitored for closure using cross-correlation analysis. When the system senses drowsiness, it responds by alerting the driver with loud sounds and vibrations.
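The PERCLOS measure described above can be sketched with a short computation. This is an illustrative outline, not the project's code: the per-frame closure ratios, the 80% closure threshold (the common "P80" definition of PERCLOS), and the alarm level are assumptions chosen for the example.

```python
# Hypothetical PERCLOS sketch: given per-frame eyelid-closure ratios
# (0.0 = fully open, 1.0 = fully closed) over a sliding time window,
# compute the fraction of frames in which the eye is >80% closed.

def perclos(closure_ratios: list[float], threshold: float = 0.8) -> float:
    """Fraction of frames whose eyelid closure exceeds the threshold."""
    if not closure_ratios:
        return 0.0
    closed = sum(1 for r in closure_ratios if r > threshold)
    return closed / len(closure_ratios)

def is_drowsy(closure_ratios: list[float], alarm_level: float = 0.15) -> bool:
    """Flag drowsiness when PERCLOS exceeds a tuned alarm level."""
    return perclos(closure_ratios) > alarm_level
```

Because PERCLOS is computed over a window of seconds to minutes, brief blinks contribute only a few closed frames and do not trigger the alarm, whereas sustained droops drive the fraction upward.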
All of the image retrieval and processing runs in real time at 15 fps on the iPhone 4. The entire system is self-contained and does not require an internet connection.
Project targeted applications: Commercial Transportation, Worker Safety