Insights From a 200 Website Eye-Tracking Study

On January 27, 2014
How do you capture user attention on your website? This is a crucial question, because visitors decide almost instantly whether to keep looking or to click away.
The German company Whitematter Labs GmbH has developed EyeQuant, a patent-pending neurotechnology that helps companies optimize user attention. EyeQuant uses eye-tracking data to teach computers to see the web the way humans do, finding the statistical patterns that power its attention models, which can then be used to test websites instantly. Recently they shared the insights from a 200-website eye-tracking study.
We are always interested in what drives (and doesn’t drive) attention, especially in website design. To further expand their model’s predictive capacity, EyeQuant’s researchers analyzed data from one of their recent eye-tracking studies, in which 46 subjects purchased products on 200 AdWords eCommerce pages. (The predictions shown here come from a new EyeQuant model currently in early testing, which achieves over 75% predictive accuracy; their standard model achieves over 90%.) They recorded 261,150 fixations in total, with users looking at each webpage for 15 seconds (+/- 6 seconds) on average. The study was conducted in the Neurobiopsychology Lab at the University of Osnabrueck, Germany.
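To give a feel for how raw fixation data turns into the heatmaps shown throughout this post, here is a minimal sketch in Python. This is an illustration only, not EyeQuant’s actual method; the function name, the sample coordinates, and the screen size are all made up for the example.

```python
import numpy as np

def fixation_heatmap(fixations, width, height, sigma=30):
    """Accumulate (x, y, duration) fixations into a smoothed attention map.

    Each fixation deposits its duration at its screen coordinate; a
    Gaussian blur then spreads that weight out to approximate the area
    the fovea actually covers.
    """
    heat = np.zeros((height, width))
    for x, y, duration in fixations:
        heat[int(y), int(x)] += duration

    # Separable Gaussian blur, done with plain NumPy to avoid a SciPy
    # dependency: convolve each column, then each row, with a 1-D kernel.
    radius = 3 * sigma
    ax = np.arange(-radius, radius + 1)
    kernel = np.exp(-ax**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    for axis in (0, 1):
        heat = np.apply_along_axis(
            lambda line: np.convolve(line, kernel, mode="same"), axis, heat)

    return heat / heat.max()  # normalize to [0, 1] for rendering

# Hypothetical fixations: (x, y, duration in milliseconds)
demo = [(320, 200, 250), (330, 210, 400), (900, 500, 150)]
heatmap = fixation_heatmap(demo, width=1280, height=720)
```

Rendering `heatmap` with any colormap (hot spots where values approach 1.0) yields the familiar red-and-yellow overlays seen in the figures below; the real study simply does this with hundreds of thousands of fixations instead of three.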
1. Most of us believe the following assumption about human attention: “Faces always and instantly draw attention.” Sometimes this is correct, but the study shows there are cases when it isn’t. We like faces, we look at them sometimes, and we even have a dedicated brain area involved in processing faces. However, we look at them much less often than most of us would believe.
Here’s an example, a Levi’s landing page. Left: eye-tracking heatmap of users visiting the page – users are almost completely ignoring the faces. EyeQuant’s prediction on the right puts a bit more emphasis on the logo than the empirical data does, but the big winner here is clearly the headline copy, not the faces.
Another example: a hotel search website, featuring an incredibly happy couple with clearly visible faces. Yet users only seem to care about the search box and the call to action in the center. EyeQuant’s new model provides a very similar result but gets a bit distracted by the wooden texture.
More examples, ranging from an eCommerce shop to a web 1.0 wall of text:
[Image: eye-tracking heatmaps, faces]
Results show that faces aren’t the powerful attention-grabbers we usually think they are.
What about guiding user attention through faces? This is another popular assumption which seems to make a lot of sense: we’re social beings, so users’ gaze should follow the gaze of the people pictured on a website. Again, that’s true, except when it isn’t.
Here’s an example of a Hilton Hotel landing page. Users go straight for the search form and check the offers below, but pay little attention to the woman or the headline she’s staring at.
[Image: EyeQuant prediction vs. eye-tracking validation]
2. “Large text is a great way to attract user attention” is another popular idea about how attention works online. However, EyeQuant data shows that it usually doesn’t work; in many cases big fonts even seem to have a negative effect on attracting attention. Below is an example of an English Proof Read landing page. Big typography doesn’t work nearly as well as you’d think it would. The winners on this one are the three descriptive areas below it.
[Image: English Proof Read landing page heatmap]
Another example, from a Canadian Railways website, can be seen below. Users were tasked with purchasing a rail ticket deal. They promptly ignored the advertised one, which uses big fonts. Note that this result is also another example of how faces don’t always guide attention.
[Image: EyeQuant prediction and eye-tracking heatmap]
Researchers concluded that big typography is visually loud, but by no means a safe way to grab user attention, so we need to look into other approaches as well.
3. Economically, nothing beats ‘FREE’. But does this also mean that the word pops out to users immediately when they’re visiting a page? EyeQuant data says otherwise. In the examples below, note how EyeQuant’s automatic prediction (on the left) does pick up a little bit on the copy that contains “free”, whereas users in the empirical study on the right completely ignored it. Both study and prediction place almost all the attention on the product description and the model.
[Image: EyeQuant prediction vs. eye-tracking heatmap]
[Image: EyeQuant validation, eye-tracking heatmap]
Their conclusion: Free is a powerful semantic tool, but we shouldn’t rely on it as our main attention grabber!
These results cast doubt on three important myths in today’s web design, and make one thing clear: testing always beats guessing.
Finally, here’s EyeQuant’s prediction for this webpage and for one of the blog posts:

One Response to “Insights From a 200 Website Eye-Tracking Study”

  • Applying a generic conclusion from the displayed Levi’s heatmap to e-commerce shops is strongly not recommended. A landing page has the purpose of leading the user to the next action – and Levi’s is doing a great job in this example, because it fits their corporate identity.

    Not every model, face, picture, or scene has the same effect, but pictures definitely have effects – positive or negative. It is clear that testing with samples of real potential customers always beats guessing, regardless of whether the guessing is done by a designer or by software.
