 This paper proposes a method for detecting a driver's attention state based on an improved version of YOLOv5 (You Only Look Once, version 5). The algorithm uses facial features to determine whether a driver is fatigued or engaging in distracting behavior. Fatigued drivers exhibit prolonged eye closure and yawning, so the algorithm measures the aspect ratios of the eyes and mouth and checks whether they fall within certain thresholds. Distracted drivers exhibit abnormal behaviors such as smoking, drinking water, or using a mobile phone, so the algorithm uses three data sets to train and test the model. The results show that mAP (mean average precision) increases by 2.4%, indicating that the proposed method is effective at detecting both fatigue and distracted behavior. This article was authored by Zhong Zhuo Wang, Keminyao, and Fuogua.
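
The aspect-ratio check described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the landmark ordering follows the common six-point eye/mouth contour convention, and the threshold values here are hypothetical placeholders, since the paper's actual thresholds are not given in this summary.

```python
import math

def aspect_ratio(pts):
    """Aspect ratio of an eye (or mouth) from six contour landmarks
    p1..p6, where p1/p4 are the horizontal corners and (p2, p6),
    (p3, p5) are the two vertical pairs:
        ratio = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|)
    A small ratio means the eye/mouth is nearly closed."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = pts
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Hypothetical thresholds for illustration only.
EYE_CLOSED_THRESH = 0.2   # eye ratio below this => eye treated as closed
MOUTH_YAWN_THRESH = 0.6   # mouth ratio above this => possible yawn

# Toy landmark sets: a wide-open eye and a nearly closed one.
open_eye   = [(0, 0), (2, 2),   (4, 2),   (6, 0), (4, -2),   (2, -2)]
closed_eye = [(0, 0), (2, 0.4), (4, 0.4), (6, 0), (4, -0.4), (2, -0.4)]

print(aspect_ratio(open_eye))    # ~0.667 -> above threshold, eye open
print(aspect_ratio(closed_eye))  # ~0.133 -> below threshold, eye closed
```

In a full pipeline, ratios like these would be tracked over consecutive frames, with fatigue flagged only when the eyes stay below the closure threshold (or the mouth stays above the yawn threshold) for a sustained interval, rather than on a single frame.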