The conversations we share with our colleagues, friends, and family can be thought of as taking place through two channels. There is the verbal channel, through which we talk and listen, and the non-verbal channel, through which we make and interpret the hand movements that supplement our speech: the channel through which we talk with our hands.

One question that has long interested scientists is what happens when the verbal channel is impaired, as it is in patients with aphasia, a language disorder most often caused by stroke. Do patients rely more heavily on the non-verbal channel to compensate for the loss, or do they try to improve their verbal comprehension in other ways?

To answer these questions, researchers used a remote eye-tracking device from SensoMotoric Instruments (SMI) and behavioral analysis software from Noldus. By tracking the eye movements of aphasic patients and healthy participants while they watched videos of conversations, the researchers found that aphasic patients reacted to hand movements, or co-speech gestures, much as healthy participants did. In other words, they did not compensate for their impaired verbal channel by focusing more on a speaker's hands or gestures. They did, however, spend less time watching the speaker's face. This may have helped them comprehend the auditory speech signal by keeping it from getting mixed up with visual signals, in this case articulatory face movements.

The researchers first wanted to confirm that co-speech gestures influence where healthy observers direct their attention as they watch others conversing. They therefore used a remote eye tracker from SMI to capture the eye movements of healthy subjects while they watched videos of two people conversing. By matching the eye-movement data with events in the conversation, such as gestures and changes in speaker identity, the researchers showed that co-speech gestures do in fact draw attention toward the gesturing actor.
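The matching of eye-movement data with conversation events can be illustrated with a small sketch. This is not the researchers' actual pipeline or SMI's export format; the interval layout and area-of-interest labels are assumptions made for illustration. The idea is simply to measure how much fixation time on a given area of interest (AOI) overlaps with gesture event windows.

```python
# Hedged sketch: hypothetical data layout, not SMI's actual export schema.
# Each fixation is (start_ms, end_ms, area_of_interest); each gesture is (start_ms, end_ms).

def overlap_ms(a_start, a_end, b_start, b_end):
    """Length of the temporal overlap between two intervals, in ms (0 if disjoint)."""
    return max(0, min(a_end, b_end) - max(a_start, b_start))

def dwell_during_gestures(fixations, gestures, aoi):
    """Total time (ms) spent fixating `aoi` while a co-speech gesture is ongoing."""
    total = 0
    for f_start, f_end, f_aoi in fixations:
        if f_aoi != aoi:
            continue
        for g_start, g_end in gestures:
            total += overlap_ms(f_start, f_end, g_start, g_end)
    return total

# Illustrative fixation sequence and a single gesture window.
fixations = [
    (0, 400, "speaker_face"),
    (400, 900, "speaker_hands"),
    (900, 1300, "listener_face"),
]
gestures = [(350, 1000)]

print(dwell_during_gestures(fixations, gestures, "speaker_hands"))  # 500
```

Aggregating such overlap times per AOI and per participant is one plausible way to test whether gestures pull gaze toward the gesturing actor's hands.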
Gestures caused subjects to look more at the speaker's hands and less at the listener's face. Because aphasia impairs the comprehension of spoken language, the researchers hypothesized that aphasic patients might process co-speech gestures differently as a way of compensating for this deficit. When they analyzed the eye movements of stroke patients with aphasia watching the same conversation videos, the researchers observed that the aphasic patients spent less time looking at the speaker's face than the healthy subjects did. Co-speech gestures, however, also led the aphasic patients to look more at the speaker's hands and less at the listener's face. These results suggest that aphasic patients do not process gestures differently to compensate for deficits in language comprehension. Instead, they may avoid looking at a speaker's face to reduce the burden of integrating visual and auditory signals. Future studies may explore other ways in which patients with aphasia cope with deficits in verbal comprehension.
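The group comparison reported above boils down to a simple measure: the proportion of total fixation time spent on the speaker's face. A minimal sketch, using made-up numbers rather than the study's data, shows how such a dwell proportion could be computed and compared between groups.

```python
# Hedged sketch: the fixation intervals below are illustrative only, not the study's data.
# Each fixation is (start_ms, end_ms, area_of_interest).

def dwell_proportion(fixations, aoi):
    """Fraction of total fixation time spent on a given area of interest."""
    total = sum(end - start for start, end, _ in fixations)
    on_aoi = sum(end - start for start, end, a in fixations if a == aoi)
    return on_aoi / total if total else 0.0

healthy = [(0, 600, "speaker_face"), (600, 800, "speaker_hands"), (800, 1000, "listener_face")]
aphasic = [(0, 300, "speaker_face"), (300, 700, "speaker_hands"), (700, 1000, "listener_face")]

# The qualitative pattern described in the article: less face dwell in the aphasic group.
print(dwell_proportion(healthy, "speaker_face"))  # 0.6
print(dwell_proportion(aphasic, "speaker_face"))  # 0.3
```

In a real analysis, these per-participant proportions would feed into a statistical test comparing the two groups.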