What can you use eye tracking for?


Posted by Andrew Spink | Published 20 October 2020


Eye tracking is a technique which records what you are looking at. At its simplest, it records if someone looked at a given object or not. With more complex analysis, it can give all sorts of information about a subject's mental state and the tasks they are carrying out.

When you are reading a page of text, you don't see all the words in focus at once, just the one you are reading at that moment. As your eye skips from word to word, your brain builds up a picture of the total scene, and as soon as you pay attention to a particular area, it springs into sharp focus. That area of focus corresponds to a small spot on the retina of your eye called the fovea (covering about 2° of vision), and eye-tracking equipment uses that to determine what you are looking at.
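To give a feel for the numbers: that 2° of foveal vision can be converted into an on-screen region once you know the viewing distance and the monitor's size and resolution. The sketch below is a minimal illustration with assumed example values (60 cm viewing distance, a 53 cm wide monitor at 1920 px); it is not tied to any particular eye tracker.

```python
import math

def visual_angle_to_pixels(angle_deg, viewing_distance_cm, screen_width_cm, screen_width_px):
    """Convert a visual angle (degrees) into its size in screen pixels."""
    size_cm = 2 * viewing_distance_cm * math.tan(math.radians(angle_deg) / 2)
    return size_cm * (screen_width_px / screen_width_cm)

# Example: a ~2 degree fovea viewed from 60 cm on a 53 cm wide, 1920 px monitor
print(round(visual_angle_to_pixels(2, 60, 53, 1920)))  # roughly 76 px
```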

When the eye stops and rests on a word, that is called a fixation; the jump from one word to the next is a saccade. During a saccade, no visual information is transmitted to the brain. Fixations and saccades don't just occur during reading; they happen all the time.
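Eye tracking software separates fixations from saccades automatically; one classic approach is a simple velocity threshold (often called I-VT). The sketch below is a minimal, illustrative version, assuming gaze coordinates have already been converted to degrees of visual angle; the 30°/s threshold is just a commonly used example value, not a recommendation.

```python
import numpy as np

def classify_ivt(x_deg, y_deg, timestamps_s, velocity_threshold=30.0):
    """Label each gaze sample as 'fixation' or 'saccade' with a velocity threshold (deg/s).

    Assumes timestamps are strictly increasing and coordinates are in degrees.
    """
    dx = np.diff(x_deg)
    dy = np.diff(y_deg)
    dt = np.diff(timestamps_s)
    velocity = np.hypot(dx, dy) / dt                       # deg/s between consecutive samples
    labels = np.where(velocity < velocity_threshold, "fixation", "saccade")
    return np.append(labels, labels[-1])                   # pad so output matches input length
```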

Two sorts of eye trackers

There are essentially two sorts of eye trackers: head-mounted and remote. Remote eye trackers are often mounted under a computer monitor so that the system can work out where you are looking on the screen (when combined with screen capture), but they can also take the form of a number of cameras placed in the scene. Multiple eye tracking cameras are normally used when the researcher is interested in 3D situations, such as someone sitting in a car simulator.

These days, head-mounted eye trackers often take the form of eye tracking glasses. In that case, the scene that the subject is looking at is no longer fixed in relation to the camera but moves around with the head. Eye tracking glasses therefore record a video of the scene in front of the glasses and combine that with the wearer's point of gaze.

In order to know what it is that the subjects are looking at, researchers normally have to label the video manually, which is a lot of work given that there is a new image 25 or 30 times a second. However, new techniques are being developed which use computer vision and machine learning to automate that.
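As a sketch of what such automation can look like: if you have a per-frame list of gaze coordinates from the glasses and some object detector for the scene video, labelling reduces to checking which detected bounding box each gaze point falls inside. The `detect_objects` callable below is a hypothetical placeholder, not a Noldus or OpenCV function; only the video reading uses OpenCV.

```python
import cv2  # OpenCV, used only to read the scene video

def label_gaze_on_video(video_path, gaze_points, detect_objects):
    """Label which detected object (if any) the gaze falls on in each frame.

    gaze_points: list of (x_px, y_px), one per frame, from the eye tracking glasses.
    detect_objects: any callable returning [(label, x, y, w, h), ...] for a frame;
                    it stands in here for a trained detector (hypothetical).
    """
    cap = cv2.VideoCapture(video_path)
    labels = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok or frame_idx >= len(gaze_points):
            break
        gx, gy = gaze_points[frame_idx]
        hit = None
        for name, x, y, w, h in detect_objects(frame):
            if x <= gx <= x + w and y <= gy <= y + h:
                hit = name
                break
        labels.append(hit)          # None means the gaze fell outside all detected objects
        frame_idx += 1
    cap.release()
    return labels
```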

There is also a third technique, electrooculography (EOG), which uses electrodes attached to the skin next to the eye. As it is more invasive, it is used less often these days. Its advantage is that it can measure eye movements even when the eyelids are closed, for example to detect the REM phase of sleep.


What can you use eye tracking for?

Reading

Eye tracking has been used a lot to investigate reading. If someone lingers on a word they are reading, they are most likely having to think about it more. That can be used to puzzle out the answers to all sorts of questions. For example, in studies of bilingualism, researchers want to know whether the vocabularies of the two languages are stored separately in the brain. Bilinguals normally read their less dominant second language slightly more slowly. However, for cognates (words that are similar in both languages), the difference is smaller, which is evidence for a 'cross-lingual overlap' [1].

Dyslexia

Dyslexia can be hard to diagnose using traditional methods, with accuracies typically of 70-80%, increasing to 90% with repeated tests. The traditional tests are also better at excluding non-dyslexic children than at correctly identifying dyslexic readers. With eye tracking, accuracy can be increased to 95% after a single quick test [2]. The exact features used are complex, but in general children with dyslexia fixate longer on individual words and their saccades are shorter, which indicates that greater mental effort is needed to decode the individual words.
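The two features mentioned above are straightforward to compute once fixations and saccades have been classified (for example with the velocity-threshold sketch earlier). A minimal illustration, assuming the events are already available as simple Python lists:

```python
import numpy as np

def reading_features(fixations, saccade_amplitudes_deg):
    """Summarise mean fixation duration and mean saccade amplitude.

    fixations: list of (start_s, end_s) tuples, one per fixation
    saccade_amplitudes_deg: list of saccade sizes in degrees of visual angle
    """
    durations = [end - start for start, end in fixations]
    return {
        "mean_fixation_duration_s": float(np.mean(durations)),
        "mean_saccade_amplitude_deg": float(np.mean(saccade_amplitudes_deg)),
    }

print(reading_features([(0.00, 0.25), (0.30, 0.70)], [2.1, 1.4, 1.8]))
```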

Human factors research

Eye tracking is also used a lot in human factors research: the study of how people interact with cars, aircraft and other machines. Cognitive psychologists working in that area are often interested in the mental workload of the operators of vehicles and other machinery. If the task is too demanding, mistakes can be made, possibly leading to accidents. But if it is too easy, the work is boring, and that can also lead to errors. In those studies, researchers don't look only at fixations and saccades, but also at other data that the eye trackers acquire.

For example, there is a lot of evidence that the higher the mental workload, the higher the blink rate [3] and the greater the speed with which the pupil diameter changes [3]. Dwell time is closely related to fixation duration, in that it measures how long someone looks at a particular relevant area (not just one point), such as an instrument dial. As you would expect, the higher the mental workload, the longer someone needs to look at an instrument to extract the necessary information from it. The pattern of looking at different instruments (and at other things, like out of the window) also changes, becoming less random with increasing workload; this is measured as visual entropy [4].
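Visual entropy quantifies how predictable the sequence of looks is. One simple formulation (an illustration of the basic idea, not the exact weighting used in [4]) is the Shannon entropy of the transitions between areas of interest:

```python
import numpy as np

def visual_entropy(aoi_sequence):
    """Shannon entropy (bits) of the transitions between areas of interest.

    aoi_sequence: ordered list of AOI labels, one per dwell or fixation.
    Lower values mean a more predictable (less random) scan pattern.
    """
    transitions = list(zip(aoi_sequence[:-1], aoi_sequence[1:]))
    counts = {}
    for t in transitions:
        counts[t] = counts.get(t, 0) + 1
    probs = np.array(list(counts.values()), dtype=float)
    probs /= probs.sum()
    return float(-(probs * np.log2(probs)).sum())

# Hypothetical instrument labels, purely for illustration
print(visual_entropy(["speed", "altitude", "speed", "horizon", "speed", "altitude"]))
```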

Visual tunnelling is also a well-known phenomenon: under conditions of high mental workload, attention narrows. For example, if you are driving and make a phone call, even hands-free, you give less attention to what is happening on the road [5].

Another parameter that some eye trackers can measure is how far the eyelid is closed. Unsurprisingly, the percentage of time that the eyelid is at least 80% closed can be used as an indicator of fatigue [3]. This sort of application has potential for use in actual cars to detect driver drowsiness, as well as in research.
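This measure is commonly known as PERCLOS. If the eye tracker exports an eyelid-openness signal (an assumption; not every device does), computing it is essentially one line over the samples:

```python
import numpy as np

def perclos(eyelid_openness, closed_threshold=0.2):
    """Fraction of samples in which the eyelid is at least 80% closed.

    eyelid_openness: values in [0, 1], where 1 means fully open (assumed signal).
    """
    eyelid_openness = np.asarray(eyelid_openness, dtype=float)
    return float(np.mean(eyelid_openness <= closed_threshold))

print(perclos([1.0, 0.9, 0.15, 0.1, 0.8]))  # 0.4 of these samples count as 'closed'
```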

Design

Where a driver or operator looks is also revealing for designers of cars, cockpits and so on. If you ask operators whether they looked at a particular indicator dial, you get an unreliable answer, because many monitoring actions are carried out automatically, without our being consciously aware of them (and asking also interrupts the task); eye tracking gives more reliable data [6]. The actual scan path of an operator can be compared with what the designer had in mind, which will (hopefully) be the same as how the operator was trained [7].
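One way to make that comparison concrete is to treat both the observed and the intended scan path as sequences of areas of interest and compute an edit distance between them. The sketch below uses plain Python and made-up instrument names purely for illustration:

```python
def scanpath_distance(observed, expected):
    """Levenshtein (edit) distance between two sequences of AOI labels.

    0 means the operator scanned the instruments exactly in the intended order;
    higher values mean larger deviations from the intended scan path.
    """
    m, n = len(observed), len(expected)
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i
    for j in range(n + 1):
        dist[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if observed[i - 1] == expected[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,         # deletion
                             dist[i][j - 1] + 1,         # insertion
                             dist[i - 1][j - 1] + cost)  # substitution
    return dist[m][n]

print(scanpath_distance(["speed", "altitude", "horizon", "speed"],
                        ["speed", "horizon", "altitude", "speed"]))  # 2
```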

It is not only in designing complex cockpits and similar setups that eye tracking is used. It is also used in designing all sorts of other products, from websites to advertising and computer games, and in understanding how consumers use and experience those products (neuromarketing). For example, a recent study [8] used eye tracking, alongside other techniques, to help understand why Facebook can be so engaging.

Designers typically use heat maps visualising how long subjects have looked at various elements, in order to work out how much attention each element gets. That will frequently be done by comparing two different designs, for example placing banners differently on a web page. Designing the layout of supermarket shelves can be improved with the same technique.
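Heat maps like these are essentially smoothed 2D histograms of gaze positions. A minimal sketch, assuming gaze coordinates in screen pixels and using NumPy, SciPy and Matplotlib (the sample coordinates are made up):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

def gaze_heatmap(gaze_x, gaze_y, width_px, height_px, sigma_px=40):
    """Accumulate gaze samples into a pixel grid and blur it into a heat map."""
    grid = np.zeros((height_px, width_px))
    for x, y in zip(gaze_x, gaze_y):
        if 0 <= int(x) < width_px and 0 <= int(y) < height_px:
            grid[int(y), int(x)] += 1
    return gaussian_filter(grid, sigma=sigma_px)

# Overlay on a screenshot of design A vs. design B to compare where attention goes
heat = gaze_heatmap(gaze_x=[400, 410, 900], gaze_y=[300, 305, 600],
                    width_px=1920, height_px=1080)
plt.imshow(heat, cmap="hot", alpha=0.7)
plt.axis("off")
plt.show()
```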

Read more: Shopping behavior - what can we measure in a retail environment?

[Figure: Tobii heat map of eye tracking in a supermarket]

Communication

Another important field of study which uses eye tracking is human communication. When you are talking to someone, it is natural to look them in the eye. However, if that happens more frequently than normal, it can come across as intimidating, and if it happens less frequently, it can be an indication of autism spectrum disorder (ASD). It is not just the frequency, but also the synchrony that is indicative: if someone looks at you, it is normal to look back. Unlike the measures used in human factors studies, the way we make eye contact during communication is heavily influenced by our culture and varies across different societies [9].

Noldus has developed prototype software which makes those measurements possible even with normal cameras [10]. Eye trackers can also be used to enable communication with patients who cannot speak: the patient looks at a special on-screen keyboard and the computer then uses text-to-speech software to let them talk.
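A gaze keyboard of this kind typically works on dwell time: a key counts as 'pressed' once the gaze has rested on it for long enough without moving away. The sketch below illustrates only the selection logic (hypothetical key layout and dwell threshold); a real system also needs calibration, visual feedback and error correction.

```python
def dwell_select(gaze_samples, keys, dwell_time_s=1.0):
    """Return the key selected once gaze has dwelt on it long enough.

    gaze_samples: list of (timestamp_s, x, y) tuples in screen pixels
    keys: dict mapping key label -> (x, y, w, h) screen rectangle
    """
    current_key, dwell_start = None, None
    for t, x, y in gaze_samples:
        hit = next((k for k, (kx, ky, kw, kh) in keys.items()
                    if kx <= x <= kx + kw and ky <= y <= ky + kh), None)
        if hit != current_key:
            current_key, dwell_start = hit, t      # gaze moved to a different key: restart timer
        elif hit is not None and t - dwell_start >= dwell_time_s:
            return hit                             # key 'typed' after an uninterrupted dwell
    return None
```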


Read more: 3 Examples of eye tracking lab set-ups

Other species

Eye tracking isn't just used on humans. The systems have been adapted for use on a whole variety of animal species, including mice, fish, cats and dogs. For instance, cats show higher precision in their eye-stabilization movements when looking at a natural scene than at 'drifting gratings', whereas humans show similar precision for both [11]; this is probably related to cats' hunting abilities.

What eye tracking doesn't do

It is also important to remember that eye tracking, like all measurement techniques, has its limitations. It measures where the eye focusses, but says nothing about peripheral vision. 

In a lot of tasks, like driving, we make use of a glimpse of something moving at the edge of our field of vision; we don't need to focus on an object to know to avoid driving into it. For example, one study showed that when pilots were presented with airspeed information only in their peripheral vision, they still made use of it [12]. That also has implications for other kinds of study, such as design: just because the person using your website didn't focus directly on the logo doesn't mean they didn't notice it.

Some eye tracking measures are also influenced by other factors. Most obviously, pupil diameter (and to a lesser extent blink rate) changes with light levels. That is fine in a situation where the lighting is under control, like a driving simulator, but out in a real car the lighting changes all the time. Stress levels can also interfere with the parameters used to measure mental workload, making it difficult to distinguish the two.

New techniques in eye tracking

Eye tracking is a technique which has been around for over 100 years, but it is still developing. The hardware is getting better (more compact, cheaper, easier to use). Eye tracking data can also be combined with other data, such as the subject's emotions, to give deeper insights (see: https://www.noldus.com/cube), or with EEG to see whether a subject is merely looking at something or actually processing the visual information cognitively [13].
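Combining such streams usually comes down to aligning them on a shared clock. A minimal sketch with pandas, using made-up gaze and EEG-workload samples, matches each gaze sample to the nearest EEG sample within 20 ms:

```python
import pandas as pd

# Hypothetical exports: gaze samples and an EEG-derived workload index, timestamps in seconds
gaze = pd.DataFrame({"timestamp": [0.00, 0.02, 0.04],
                     "x": [512, 518, 640], "y": [384, 380, 200]})
eeg = pd.DataFrame({"timestamp": [0.01, 0.03],
                    "workload_index": [0.42, 0.55]})

# Align each gaze sample with the nearest EEG sample (within a 20 ms tolerance)
combined = pd.merge_asof(gaze.sort_values("timestamp"),
                         eeg.sort_values("timestamp"),
                         on="timestamp", direction="nearest", tolerance=0.02)
print(combined)
```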

Noldus is participating in several research projects to innovate in this area. Soon it will be possible to carry out basic eye tracking with a normal webcam (see: https://www.noldus.com/projects/eyereader). The measurement of mental states like fatigue and mental workload will also improve in the future (see: https://www.noldus.com/projects/mind-map). Noldus also offers its customers expert advice on the best eye tracking solution for their needs.


References

1. Van Assche, E., Drieghe, D., Duyck, W., Welvaert, M. and Hartsuiker, R.J., 2011. The influence of semantic constraints on bilingual word recognition during sentence reading. Journal of Memory and Language, 64(1), pp.88-107.

2. Nilsson Benfatto, M., Öqvist Seimyr, G., Ygge, J., Pansell, T., Rydberg, A. and Jacobson, C. (2016). Screening for Dyslexia Using Eye Tracking during Reading. PLoS ONE 11(12): e0165508. doi:10.1371/journal.pone.0165508

3. Marquart, G., Cabrall, C. and de Winter, J. (2015). Review of Eye-related Measures of Drivers' Mental Workload. Procedia Manufacturing, 3. doi:10.1016/j.promfg.2015.07.783

4. Tole, J.R., Stephens, A.T., Vivaudou, M., Harris Jr, R.L. and Ephrath, A.R. (1982). Entropy, instrument scan and pilot workload. NASA. 

5. Strayer, D.L., Drews, F.A. and Johnston, W.A. (2003). Cell phone-induced failures of visual attention during simulated driving. Journal of Experimental Psychology: Applied, 9(1), pp.23-32.

6. de Winter, J.C.F., Eisma, Y.B., Cabrall, C.D.D., Hancock, P.A. and Stanton, N.A. (2019). Situation awareness based on eye movements in relation to the task environment. Cognition, Technology & Work. doi:10.1007/s10111-018-0527-6

7. Ellis, K.K.E. (2009). Eye tracking metrics for workload estimation in flight deck operations. MSc thesis, University of Iowa.

8. Mauri, M., Cipresso, P., Balgera, A., Villamira, M. and Riva, G. (2011). Cyberpsychology, Behavior, and Social Networking, December 2011, pp.723-731. http://doi.org/10.1089/cyber.2010.0377

9. Blais, C., Jack, R.E., Scheepers, C., Fiset, D. and Caldara, R. (2008). Culture Shapes How We Look at Faces. PLoS ONE 3(8): e3022. doi:10.1371/journal.pone.0003022

10. Palmero, C., van Dam, E.A., Escalera, S., Kelia, M., Lichtert, G.F., Noldus, L.P., Spink, A.J. and van Wieringen, A. (2018). Automatic mutual gaze detection in face-to-face dyadic interaction videos. In: Proceedings of Measuring Behavior 2018, pp.158-163. measuringbehavior.org.

11. Moeller, G.U., Kayser, C., Knecht, F. et al. (2004). Interactions between eye movement systems in cats and humans. Experimental Brain Research, 157, pp.215-224. doi:10.1007/s00221-004-1835-z

12. Schaudt, W.A., Caufield, K.J. and Dyre, B.P. (2002). Effects of a Virtual Air Speed Error Indicator on Guidance Accuracy and Eye Movement Control during Simulated Flight. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 46(17), pp.1594-1598. doi:10.1177/154193120204601714

13. Brouwer, A.-M., Stuldreher, I., Huertas Penen, S., Lingelbach, K. and Vukelić, M. (2020). Combining eye tracking and physiology for detection of emotion and workload. In: Spink, A., Barski, J., Brouwer, A.-M., Riedel, G. and Sil, A. (eds.), Volume 1 of the Proceedings of the 12th International Conference on Measuring Behavior, to be held in Krakow, Poland, October 15-18, 2021, pp.2-11. measuringbehavior.org. doi:10.6084/m9.figshare.13013717
