Proceedings of the 6th World Congress on Intelligent Control and Automation, June 21 - 23, 2006, Dalian, China
Driver Fatigue Detection Based on Eye Tracking
Ling Gan, Bing Cui and Weixing Wang
Department of Computer Science and Technology Chongqing University of Posts and Telecommunications
Chongqing 400065, China
{ganling & wangwx}@cqupt.edu.cn
Abstract - Driver fatigue is a key factor in traffic accidents. To address this problem, an eye-tracking method is used to alert tired drivers. Face images of the driver are first captured with a digital camera; the images are then processed and segmented, and the horizontal projections of the detected objects in the binary images are analyzed. Eye locations are obtained from the horizontal projection histogram: the eyes lie at a high peak of the projection chart, and the horizontal boundaries of the eyes are located at the valley points. To determine the valley points, an improved slope algorithm is presented in this paper: a first-order difference with a step of five is used to calculate the curve slope. From the resulting boundaries, the eye region is obtained and the eye pixels within it are counted. The driver's mental state can then be detected by analyzing the eye pixel counts.
Index Terms - image processing, horizontal projection, curve slope, fatigue driving.
I. INTRODUCTION
Driver fatigue is a key factor in serious traffic accidents worldwide. According to statistics from the National Highway Traffic Safety Administration, about 100,000 accidents per year in the United States are caused by drowsy drivers, resulting in more than 1,500 fatalities and 71,000 injuries [1]. Statistics show that 7% of crashes and 40% of fatal accidents are due to driver fatigue. The situation is similar in Europe: the German insurance agents' association estimates that 25% of fatalities are due to driver fatigue. For this reason, many countries pay increasing attention to developing driver fatigue detection and alert systems.
II. SURVEY OF DRIVER FATIGUE PREVENTION RESEARCH
There are many ways to detect driver fatigue, such as eyeball movement, body temperature, brain waves, and heartbeat. Professor Jones of the New Zealand institute of Parkinson's disease and brain research invented a monitoring device that tracks brain waves and eye movement during driving. This instrument can test whether drivers are tired, and it alarms drivers automatically if they are dozing off. Wang Rongben observed that mouth shape changes with the driver's mental state, so mouth shape is used to detect driver fatigue [2]. Zheng Pei and Song Zhenghe analyzed eye-closing time: the ratio of eye-closing time to a given interval reflects the driver's fatigue, and the larger the ratio, the more tired the driver [3]. Joel C. McCall and Mohan M. Trivedi designed a system for driver attention monitoring; a wide variety of sensors in the system gather information about the driver and the surroundings, and they presented an algorithm to combine this information into a reliable estimate of driver attention [4]. The eyelid distance is used by Wenhui Dong to judge whether the driver is drowsy: the eyelid distance is large when the eye is open and nearly zero when it is closed [5]. Chaos theory has also been applied to explain changes in steering-wheel motion [6].
Many companies have also developed systems to address the problem.
An Australian company called AISMCARP has developed a monitoring system mounted on the fascia. The system judges whether the driver is paying attention to the road by sight-tracking technology, and it alerts the driver when he or she is dozing off. The system continuously scans the driver's face with two cameras. From the obtained information, the position of the eyes and the status of the eyeballs can be determined. The system then analyzes the line of sight by comparing the eyeball shapes with a computer model. A driver who is dozing off blinks in a particular way, so the driver's mental state can be recognized. A driver-alarm system has also been developed by the Volvo vehicle manufacturer to prevent dangerous driving. That system monitors the running process of the vehicle itself, which reduces false alarms caused by differences between individual drivers.
III. ALGORITHM OF DETECTING FATIGUE DRIVING
We present an algorithm for detecting fatigued driving based on analyzing eye status. First, face images of a driver are captured and segmented. Then the eye position is determined from the horizontal projection charts of the segmented images, so that the area of the eyes can be calculated. Finally, the driver's mental state is analyzed from this area.
A. Capture and Segmentation of Driver Face Sequential Images
When drivers are dozing off, their eyes close slowly. Thus, we can take a photo every t seconds, so that n sequential images reflect the dozing process.
The face itself is insignificant for analyzing the driver's mental state even though it is the largest part of an image. Thus, the eyes, hair, nose, and ears should be segmented from the image.
B. Horizontal Projection of Binary Images
Scan each line of these binary images and count the pixels whose gray value is 0; this yields the horizontal projection charts of the images. As shown in Fig.1 and Fig.2, the projection of the eyes is located at a large wave crest of the projection chart.
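As a sketch of this step (in Python with NumPy, rather than the VC 6.0 implementation used later in the paper), binarization and row-wise projection can be written as follows; the threshold value is illustrative:

```python
import numpy as np

def horizontal_projection(gray, threshold=128):
    """Binarize a grayscale image and count dark (0) pixels per row.

    Pixels below `threshold` become 0 (foreground: eyes, hair, etc.);
    the returned vector is the horizontal projection of the binary image.
    """
    binary = np.where(gray < threshold, 0, 255)  # dark features -> 0
    return np.sum(binary == 0, axis=1)           # one count per image row
```

The wave crests of this vector correspond to rows dense with dark pixels, such as the eye rows in Fig.2.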
1-4244-0332-4/06/$20.00 ©2006 IEEE
Authorized licensed use limited to: Technische Informationsbibliothek (TIB). Downloaded on December 11,2024 at 09:56:01 UTC from IEEE Xplore. Restrictions apply.
Fig.1 Original binary image
Fig.2 Horizontal projection chart
C. Computation of Eye Position Based on Binary Images Horizontal Projection
There are many methods to determine the eye position. Qiang Ji, Zhiwei Zhu, and Peilin Lan used IR illumination to obtain the driver's image; they found that the pupils in the even images look significantly brighter than in the odd images, so subtracting the odd image from the even image makes the pupils stand out [1]. Wen-Bing Horng and Chih-Yuan Chen used skin-color characteristics and dynamic templates for eye detection [7]. Since eyes have a certain shape, Wang Rong-ben and Guo Ke-you used a geometry-restriction method to determine the eye coordinates [8]. Haisong Gu et al. first used sensors to robustly detect the pupils and head motion, then applied Gabor wavelets for fast feature detection; reliability propagation based on spatial relationships handled occlusion and refined the tracking results [9]. Wang Rongben et al. proposed using Gabor wavelets to extract the texture features of the eyes and a neural-network classifier to analyze fatigue behavior [10].
These methods can obtain a more accurate eye position, but they need expensive instruments or take more time because of their complicated algorithms. A simple and time-saving method is presented as follows.
As shown in the former section, the projection of the eyes is located at a large wave crest of the projection chart, and the horizontal boundaries of the eyes are located at the troughs. So the positions of the eyes' horizontal boundaries can be calculated from the slope of the projection curve. If the slope at a point of the horizontal projection is larger than a threshold, the point can be taken as a trough candidate. However, accepting a candidate as a trough based on the ordinary slope alone causes some points whose slopes exceed the threshold but whose wave crests are very small to be falsely regarded as troughs. To avoid this problem, we present a new way to compute the slope at a point. Let x be the vertical coordinate of a pixel and f(x) its horizontal projection value; its slope is computed as f(x+5) - f(x). Compute the slope at every projection point. The first point whose slope exceeds a threshold (threshold1) is regarded as the left trough point; it indicates the bottom boundary of the eyes in the binary image. The projection points are then analyzed one by one until the slope first falls below a second threshold (threshold2); this point is regarded as the wave crest point and indicates the middle of the eyes. Continuing, the first subsequent point whose slope exceeds a third threshold (threshold3) is regarded as the right trough point, which indicates the top boundary of the eyes.
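The three-threshold slope scan can be sketched as below; the function name is ours, and the default thresholds are taken from Section IV-D (threshold1 = 30, threshold2 = -30, threshold3 = 10):

```python
import numpy as np

def eye_row_boundaries(proj, t1=30.0, t2=-30.0, t3=10.0, step=5):
    """Locate the eyes' horizontal boundaries from a projection curve.

    The slope at row x is the five-step forward difference f(x+step) - f(x).
    Returns (bottom, crest, top) row indices, or None if the pattern
    (steep rise, steep fall, rise again) is not found.
    """
    slope = proj[step:] - proj[:-step]      # f(x+5) - f(x) for each row
    bottom = crest = None
    for x, s in enumerate(slope):
        if bottom is None:
            if s > t1:                      # first steep rise: left trough
                bottom = x
        elif crest is None:
            if s < t2:                      # first steep fall: wave crest
                crest = x
        elif s > t3:                        # next rise: right trough
            return bottom, crest, x
    return None
```

Because the five-step difference spans a window rather than a single pixel, small ripples in the projection curve do not produce slopes large enough to pass the thresholds, which is the point of the improved slope computation.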
To determine the vertical boundaries, scan every column within the horizontal band of the eyes. After scanning, the hair region and the eye region can be distinguished. Then choose a horizontal coordinate from the gap between the two regions as the vertical boundary. In this way the eye boundaries are fully determined.
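One possible realization of this column scan is sketched below; the run-finding helper is our own construction, not from the paper. It returns the runs of consecutive dark columns inside the eye row band, from whose gaps the vertical boundaries can be chosen:

```python
import numpy as np

def dark_column_runs(binary, top, bottom):
    """Scan each column inside the eye row band [top:bottom] and return
    (start, end) pairs of consecutive columns containing dark (0) pixels.

    The hair and the eyes each produce a run; a vertical boundary can be
    taken from the gap between two adjacent runs.
    """
    band = binary[top:bottom, :]
    has_dark = np.any(band == 0, axis=0)    # True where the column has a 0
    runs, start = [], None
    for i, d in enumerate(has_dark):
        if d and start is None:
            start = i                       # a dark run begins
        elif not d and start is not None:
            runs.append((start, i - 1))     # the run just ended
            start = None
    if start is not None:
        runs.append((start, len(has_dark) - 1))
    return runs
```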
D. Closing Information Obtained from the Eye Region of Binary Image
As shown in Fig.3 and Fig.4, when the eyes are open, the eye region contains the eyeballs and eyelashes, so the number n of pixels whose gray value is 0 is large. When the eyes are closed, the eye region contains only the eyelashes, so n is small. Thus, the closing status of the eyes can be determined from n.
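Counting the dark pixels in the located eye region is then straightforward. A sketch, assuming the boundary indices of TABLE 1 are used directly as array slices (an assumption about the coordinate convention):

```python
import numpy as np

def count_eye_pixels(binary, top, bottom, left, right):
    """Count dark (0) pixels inside the eye region of a binary image.

    Boundary names follow TABLE 1; the region is taken as the slice
    binary[top:bottom, left:right].
    """
    region = binary[top:bottom, left:right]
    return int(np.sum(region == 0))
```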
Fig.3 Binary image of open eyes
Fig.4 Binary image of closed eyes
E. Spirit Status Analysis of Drivers Based on Eye Closing Information
If n decreases smoothly over a period of time, the driver is dozing off. If n drops sharply, the driver may simply be blinking.
IV. APPLICATION
A. Obtain Sequential Images of the Driver's Face. In This Experiment, 5 Images Are Taken in 3 Seconds. The Images Are Shown as Follows:
Fig.5 Original image 1 Fig.6 Original image 2 Fig.7 Original image 3
Fig.8 Original image 4 Fig.9 Original image 5
B. Segment Eyes, Eyebrows, Hair, etc. from Original Images by Threshold Algorithm
Fig.10 Segmented image of original image 1
Fig.11 Segmented image of original image 2
Fig.12 Segmented image of original image 3
Fig.13 Segmented image of original image 4
Fig.14 Segmented image of original image 5
C. Project Binary Images. Horizontal Projection Charts Are Shown as Follows:
Fig.15 Horizontal projection chart of image 10
Fig.16 Horizontal projection chart of image 11
Fig.17 Horizontal projection chart of image 12
Fig.18 Horizontal projection chart of image 13
Fig.19 Horizontal projection chart of image 14
D. Compute Boundaries of Eyes Based on Horizontal Projection. After Several Experiments, We Set threshold1 = 30, threshold2 = -30, and threshold3 = 10.
Fig.20 Marked eye region of image 10
Fig.21 Marked eye region of image 11
Fig.22 Marked eye region of image 12
Fig.23 Marked eye region of image 13
Fig.24 Marked eye region of image 14
E. Count the Number of Pixels Whose Gray Values Are 0 in the Eye Region.
The algorithm is implemented in VC 6.0 on a Windows XP platform. The pixel number n and the boundary coordinates of the eye regions are shown in TABLE 1.
TABLE 1  EYE PARAMETERS OBTAINED BY IMAGE PROCESSING

Image  Eye pixel number  Top boundary  Bottom boundary  Left boundary  Right boundary
       (pixel)           (pixel)       (pixel)          (pixel)        (pixel)
  1         2761              307           357              184            673
  2         2296              286           345              216            714
  3         1920              269           325              215            716
  4         1656              309           379              184            683
  5         1060              309           371              190            683
After several experiments, we found that if 200 <= n(i) - n(i+1) <= 1000 for all 1 <= i <= 4, the driver is dozing off. In this experiment, n(1) - n(2) = 465, n(2) - n(3) = 376, n(3) - n(4) = 264, and n(4) - n(5) = 596. So the driver is dozing off and needs to be alerted.
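This dozing rule can be sketched directly; the input data are the eye-pixel counts from TABLE 1:

```python
def is_dozing(counts, lo=200, hi=1000):
    """The paper's rule: the driver is dozing if every successive drop
    in the eye-pixel count, n(i) - n(i+1), lies within [lo, hi]."""
    drops = [a - b for a, b in zip(counts, counts[1:])]
    return all(lo <= d <= hi for d in drops)

print(is_dozing([2761, 2296, 1920, 1656, 1060]))  # TABLE 1 counts -> True
```

A sudden blink would produce a drop outside the band (or a subsequent rise), so this simple interval test separates gradual eye closing from blinking, as described in Section III-E.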
V. CONCLUSION
In this paper, an improved method of computing the slope of the projection curve is presented to determine the horizontal boundaries of the eyes. The eye pixel counts are then analyzed to reflect the driver's mental state.
The eyes are located at a high peak of the horizontal projection chart, and their horizontal boundaries lie at the valley points. To find the valleys, a first-order difference with a step of five is used to calculate the curve slope. This method not only determines the valley points but also avoids mistaking points on small waves for troughs. As a result, the eye region is obtained and the eye pixels within it are counted; over a given period, the driver's mental state can be detected by analyzing these counts.
We conducted several experiments. The results show that the algorithm performs well in both speed and accuracy.
Overall, the results of this study are promising, but more work is needed in the near future, because automatic detection of eye blinking and of the driver's state is neither easy nor simple.
REFERENCES
[1] Qiang Ji, Zhiwei Zhu and Peilin Lan, “Real-time nonintrusive monitoring and prediction of driver fatigue,” IEEE Transactions on Vehicular Technology, vol. 53, no. 4, pp. 657-662, July 2004.
[2] Wang Rongben, Guo Lie and Tong Bingliang, “Monitoring mouth movement for driver fatigue or distraction with one camera,” IEEE Intelligent Transportation Systems Conference, Washington, D.C., USA, October 3-6, pp. 314-319, 2004.
[3] Zheng Pei, Song Zhenghe and Zhou Yiming, “PERCLOS-based recognition algorithms of motor driver fatigue,” Journal of China Agricultural University, pp. 104-109, July 2004.
[4] Joel C. McCall and Mohan M. Trivedi, “Visual context capture and analysis for driver attention monitoring,” 7th IEEE Conference on Intelligent Transportation Systems, October 2004.
[5] Wenhui Dong and Xiaojuan Wu, “Driver fatigue detection based on the distance of eyelid,” IEEE Int. Workshop on VLSI Design & Video Tech., Suzhou, China, May 28-30, pp. 365-368, 2005.
[6] Yoshihiro Takei and Yoshimi Furukawa, “Estimate of driver's fatigue through steering motion,” IEEE International Conference on Systems, Man and Cybernetics, vol. 2, pp. 1765-1770, October 10-12, 2005.
[7] Wen-Bing Horng, Chih-Yuan Chen and Yi Chang, “Driver fatigue detection based on eye tracking and dynamic template matching,” Proceedings of the 2004 IEEE International Conference on Networking, Sensing & Control, Taipei, Taiwan, vol. 1, pp. 7-12, March 21-23, 2004.
[8] Wang Rong-ben, Guo Ke-you and Chu Jiang-wei, “Study on the eye location method in driver fatigue state surveillance,” Journal of Highway and Transportation Research and Development, vol. 20, no. 5, pp. 111-114, October 2003.
[9] Haisong Gu, Qiang Ji and Zhiwei Zhu, “Active facial tracking for fatigue detection,” The Sixth IEEE Workshop on Applications of Computer Vision, pp. 137-142, December 2002.
[10] Wang Rong-ben, Guo Ke-you and Shi Shu-ming, “A monitoring method of driver fatigue behavior based on machine vision,” IEEE Intelligent Vehicles Symposium, pp. 110-113, June 2003.