Gaze tracking - what is it and how does it work?

Gaze tracking monitors eye movement, helping you identify which parts of your design attract the most user attention. It's incredibly useful in research and business, as it provides insight into what users see and how they interact with the content they are shown.


What is gaze tracking?

Gaze tracking is a method to monitor eye activity, including eye movements, point of gaze, pupil dilation, and blinking. It relies on eye tracking devices, which use sensors and algorithms to follow the eyes' movement and determine where a person is looking. The gathered data can be analyzed to understand how a person responds to visual information, such as a website or a product advertisement. This analysis includes identifying what they focus on, overlook, or revisit.

Eye gaze tracking helps professionals like marketers, advertisers, customer experience designers, and product developers understand how customers react to things like store layouts and ads. These insights can be used to make experiences more engaging for customers.

This technology is useful in many areas, such as studying human behavior, testing user interfaces, and helping people with disabilities. It's also used in marketing research to see how people look at ads and other visuals.

There are several ways to track eye gaze. Infrared eye tracking uses infrared light reflected from the eye to detect its movement, while video-based eye tracking uses a camera. These can be paired with other sensors and algorithms to get a detailed picture of where a person is looking.

Key Eye Gaze Tracking Metrics and Terms

Fixations and Gaze Points

Fixations and gaze points are among the most commonly used terms in eye tracking.

A fixation refers to a period of time when a person's gaze is concentrated on a particular point in their environment. On the other hand, a gaze point is a specific location in the environment where a person is looking. Eye gaze tracking systems generally measure and analyze fixations and gaze points to understand where and for how long a person is looking.

Gaze points show what the eyes are looking at in a given instant. A 60 Hz eye tracker, for example, records 60 gaze points every second.

A group of gaze points that are close together in time and space is called a fixation, showing when the eyes are focused on something. Fixations are a good indicator of where a person's attention is, and research in this field is growing.
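To make this concrete, below is a minimal sketch of how raw gaze points could be grouped into fixations with a dispersion-based approach (often called I-DT). The sample format, pixel threshold, and minimum duration are illustrative assumptions, not the exact method any particular eye tracker uses.

```python
# A minimal sketch, assuming gaze samples arrive as (timestamp_ms, x_px, y_px)
# tuples from a 60 Hz recording. Thresholds are illustrative, not vendor values.

def detect_fixations(samples, max_dispersion_px=35, min_duration_ms=100):
    """Group consecutive gaze points into fixations (dispersion-based)."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while the gaze stays within the dispersion limit.
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion_px:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if j > i and duration >= min_duration_ms:
            window = samples[i:j + 1]
            fixations.append({
                "start_ms": samples[i][0],
                "duration_ms": duration,
                "x": sum(s[1] for s in window) / len(window),  # fixation centre
                "y": sum(s[2] for s in window) / len(window),
            })
            i = j + 1  # continue after the detected fixation
        else:
            i += 1
    return fixations
```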

Eye movements between fixations are called saccades. When reading, our eyes don't move smoothly; we fixate roughly every 7-9 letters, depending on the font. The "visual span" is how much text we can take in around the word we're currently fixating on. People who read a lot can cover more text with fewer fixations.

Eye movements are different when we watch a moving object, such as a distant car. We follow the car smoothly, without saccadic movement (known as smooth pursuit). But if the car moves too fast or unpredictably, catch-up saccades may happen.
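If it helps to see the idea in code, a common way to separate saccades from fixation samples is a simple velocity threshold (often called I-VT): samples moving faster than the threshold are labelled as saccades. The cut-off value below is an illustrative assumption.

```python
# A minimal sketch, assuming the same (timestamp_ms, x_px, y_px) samples as above.
# The 0.5 px/ms threshold is an arbitrary illustrative value.

def classify_saccades(samples, velocity_threshold_px_per_ms=0.5):
    labels = ["fixation"]  # the first sample has no preceding velocity
    for prev, cur in zip(samples, samples[1:]):
        dt = cur[0] - prev[0]
        dist = ((cur[1] - prev[1]) ** 2 + (cur[2] - prev[2]) ** 2) ** 0.5
        velocity = dist / dt if dt > 0 else 0.0
        labels.append("saccade" if velocity > velocity_threshold_px_per_ms else "fixation")
    return labels
```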

A higher number of fixations or gaze points on a certain part of an image shows that the area received more attention. Figuring out why this happens can be hard, but it helps us understand which parts of a scene draw and keep attention.

Heatmaps

A heatmap is a visual representation of eye gaze data that shows where people look in an image or video. The map usually uses different colors, with the most-viewed areas in the warmest colors and the least-viewed in the coolest.

Using a heatmap is a simple way to instantly visualize which elements attract more attention. Heatmaps can be compared across individual respondents or groups, which is useful for understanding how different people might see something in different ways.
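As a rough illustration of how such a map can be produced, the sketch below accumulates gaze points on a pixel grid and blurs the result with a Gaussian kernel. The screen size, smoothing radius, and use of SciPy are assumptions for the example, not how any specific tool renders its heatmaps.

```python
# A minimal sketch: count gaze points per pixel, then blur so nearby points
# merge into warm "hot spots". Screen size and sigma are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_heatmap(gaze_points, width=1920, height=1080, sigma_px=40):
    grid = np.zeros((height, width), dtype=float)
    for x, y in gaze_points:
        if 0 <= int(x) < width and 0 <= int(y) < height:
            grid[int(y), int(x)] += 1.0
    smoothed = gaussian_filter(grid, sigma=sigma_px)
    # Normalize to 0..1 so the values can be mapped to a warm-to-cool color scale.
    return smoothed / smoothed.max() if smoothed.max() > 0 else smoothed
```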

Areas of Interest (AOI)

An Area of Interest (AOI) refers to a specific region within an image or video that grabs the viewer's attention. AOIs can be defined manually or automatically by an eye gaze tracking system and are typically used to analyze how much time a viewer dedicates to different areas within an image or video. While an AOI is not a metric on its own, it defines the region over which other metrics are calculated.

For instance, if a picture of a person is shown, separate AOIs can be drawn around the body and the face. This allows for distinct metrics for each region, such as the time elapsed from stimulus onset until participants looked at the area, the time respondents spent in the area, the number of fixations, and the number of times people looked away and came back.

Such metrics are useful when comparing the performance of two or more areas within the same video, picture, website, or program interface.
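To illustrate, the sketch below derives a few common per-AOI metrics (time to first fixation, dwell time, fixation count, and number of entries) from a list of detected fixations and a rectangular AOI. The data layout and field names are assumptions carried over from the fixation-detection sketch above, not a specific vendor API.

```python
# A minimal sketch, assuming fixations are dicts with start_ms, duration_ms, x, y
# (as in the detection sketch above) and the AOI is an axis-aligned rectangle.

def aoi_metrics(fixations, aoi):
    left, top, right, bottom = aoi  # pixel coordinates of the AOI rectangle
    inside = lambda f: left <= f["x"] <= right and top <= f["y"] <= bottom

    ttff_ms = None        # time to first fixation on the AOI
    dwell_ms = 0          # total time spent fixating inside the AOI
    fixation_count = 0
    entries = 0           # how many times the gaze entered the AOI
    was_inside = False

    for f in fixations:
        if inside(f):
            if ttff_ms is None:
                ttff_ms = f["start_ms"]
            if not was_inside:
                entries += 1
            dwell_ms += f["duration_ms"]
            fixation_count += 1
            was_inside = True
        else:
            was_inside = False

    return {"ttff_ms": ttff_ms, "dwell_ms": dwell_ms,
            "fixation_count": fixation_count, "entries": entries}
```

Revisits can then be read off as entries minus one (the first entry is not a return), and the same values underlie the Time to First Fixation and dwell time metrics discussed below.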

Fixation Sequences

A fixation sequence is the path your eyes take when looking at a picture or video. It shows where you looked and in what order, helping us understand which parts of the image people care about most.

Fixation sequences are a roadmap of a person's attention. They usually start in the middle of the image because people often look there first. The rest of the sequence shows what the person found interesting.

Bright colors, unique shapes, and other standout elements can attract a person's attention. This is often used in eye tracking research to see what people find interesting.

The last thing you look at can often show what you're going to choose, especially in decisions about money. However, this depends on how you're used to looking at things; for example, most people read from the top left to the bottom right.

If the elements of an image are moved around a lot, the first thing you look at may not be a good guess of what you'll do next. But knowing how people look at things can be used to guide their attention in certain ways.
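As an illustration of how a fixation sequence can be analyzed, the sketch below reduces it to a scanpath of AOI labels, collapsing consecutive fixations that stay in the same area. The AOI names and rectangles are hypothetical.

```python
# A minimal sketch reusing the fixation dicts and rectangular AOIs from above.

def scanpath(fixations, aois):
    """aois maps a name to (left, top, right, bottom) in pixels."""
    path = []
    for f in fixations:
        label = "outside"
        for name, (left, top, right, bottom) in aois.items():
            if left <= f["x"] <= right and top <= f["y"] <= bottom:
                label = name
                break
        if not path or path[-1] != label:  # collapse repeated labels
            path.append(label)
    return path

# Hypothetical usage: scanpath(fixations, {"logo": (0, 0, 200, 120),
# "face": (400, 100, 700, 400)}) might return ["logo", "face", "outside", "face"].
```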

Time to First Fixation

Time to First Fixation (TTFF) is how long it takes for a person, or a group of people, to focus on a specific area after a stimulus is first shown.

TTFF can show two types of searches: bottom-up, which is led by the thing itself (like a bright company logo), and top-down, which is led by the person's own attention (like when they choose to focus on certain parts of a website or picture). TTFF is a basic but helpful measurement in eye tracking, as it can help us understand what parts of a picture people look at first.

Time spent (Dwell Time)

Dwell time, or the time spent, shows how long people look at a specific Area of Interest (AOI). Sometimes, if people spend a lot of time looking at one part of an image, it might mean they are really interested in it and aren't getting distracted by other things.

If people spend a lot of time looking at one area, it can mean they find that area really interesting. If they don't spend much time there, other parts of the screen or room may be drawing their attention instead. However, gaze data alone doesn't tell us how people feel about what they're seeing. To understand that, we might need to look at their facial expressions or use an electroencephalogram (EEG) to examine their brain activity.

Ratio

The ratio metric tells you how many of your respondents actually directed their gaze towards a specific Area of Interest (AOI). In market research, this is crucial for optimizing an advertisement to attract more viewers to certain areas, such as the logo or key product information.

This metric identifies which areas of an image attract the most or least attention, including those that received no attention at all. Gathering data about the gaze ratio across different groups can show which parts of the image appealed more to various participants.
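A rough sketch of how the ratio could be computed is shown below, assuming you hold one fixation list per respondent; the data structure is an assumption for illustration.

```python
# A minimal sketch: the share of respondents who fixated on the AOI at least once.

def aoi_ratio(fixations_by_respondent, aoi):
    left, top, right, bottom = aoi
    looked = sum(
        1
        for fixations in fixations_by_respondent.values()
        if any(left <= f["x"] <= right and top <= f["y"] <= bottom for f in fixations)
    )
    total = len(fixations_by_respondent)
    return looked / total if total else 0.0
```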

Types of Eye Trackers

Eye trackers employ various technologies to measure and analyze eye movements. Below is a brief overview of three common types of eye trackers:

Remote Eye Trackers

Remote eye trackers are typically stationary, situated on a desk or stand, and they monitor eye movements relative to the surrounding environment. These devices often use sensors and cameras to measure eye movements with great accuracy, frequently outperforming head-mounted trackers.

Webcam Eye Trackers

Webcam eye trackers took the field to a whole new level by using the webcams built into computers and smartphones as data collectors. Running even a complex study became possible in just a few clicks. Although they may be slightly less accurate than dedicated hardware, their ease of use and cost-effectiveness make them more accessible for researchers and businesses.

Head-Mounted Eye Trackers

Head-mounted eye trackers, worn in front of the eyes, track eye movements in relation to the head. These devices are generally more portable and user-friendly than remote eye trackers. However, their accuracy may be compromised due to head movements and the closeness of the sensors to the eyes.

Electrooculography (EOG) Eye Trackers

EOG eye trackers measure the electric potentials produced by eye muscles using electrodes. They are relatively affordable and user-friendly, but their accuracy may be lower compared to other eye tracking types. They are generally used in applications where high accuracy is not critically required.

Summary

Gaze tracking involves monitoring eye activity, such as eye movements, point of gaze, pupil dilation, and blinking, to understand how a person responds to visual information. The technology employs devices like infrared and video-based eye trackers. Key metrics in gaze tracking include fixations, gaze points, heatmaps, Areas of Interest (AOIs), fixation sequences, Time to First Fixation (TTFF), dwell time, and ratio. These metrics provide insight into where, how long, and why a person is looking at a particular point. Gaze tracking is used in various fields including marketing, advertising, customer experience design, product development, and aiding people with disabilities. Especially when significant investments are at stake, it is worth asking: "How could this be useful for my business or project?"

"It’s been a pleasure working with RealEye. Their customer service is prompt, valuable, and always friendly. The quick turnarounds on custom development requests are the most impressive. The RealEye team delivers great tailored solutions. Thank you for being a wonderful partner!"

Sam Albert
Chief Digital Officer

"I'm really impressed with what Adam has created with RealEye. It's astounding how easy and fast it is to track and report on eye movement for a page or design."

David Darmanin
CEO, hotjar.com

"Webcam-based eye-tracking has vast potential within market research and RealEye made a great effort customizing their solutions to our needs. We succeeded in having live online interviews with eye-tracking included and we look forward to build on this pilot study to take further advantage of this solution in future research."

Stefan Papadakis
Insight Consultant, IPSOS
Trusted by freelancers, small to big companies, students, and universities.

Frequently Asked Questions

How accurate are the eye-tracking results?

RealEye studies are proven to be accurate to around 110 px. This allows you to analyze user interactions on a website with precision approaching the size of a single button. We predict the gaze point at a frequency of up to 60 Hz. For an in-depth analysis of webcam eye-tracking accuracy, check the following articles:

Who can participate in my study?

Either your own or RealEye participants. You can invite your users or panel and share the study with them using a participation link. All they need is a laptop/PC with a webcam. We also have a network of panelists from all over the world, mainly from the UK and the US. Randomly picked users can be assigned to your task; they are called RealEye participants. We will not show them your stimuli before the test starts, so their interaction will be natural.

Note: RealEye participants can't take part in 'Live Website' studies or studies longer than 10 minutes. Read more about the limitations here.

Can I pay per study?

Payments are monthly only, so there's no option to pay per study. Keep in mind, however, that you can cancel your license at any time (even in the same month) - you'll keep account access until the end of the billing period (30 days from the last payment).

Can I integrate RealEye with other tools?

You can easily integrate the RealEye tool with external tools (e.g. surveys), and you can also compare the results with data from other tools using our CSV file (e.g. by timestamp).