Monday, January 4, 2016

Eye tracking in usability tests

A few weeks ago, I read an opensource.com article about a Python-based open source eye tracking tool.

Eye tracking can be an important tool in usability testing. When we conduct a usability test, we usually ask participants to speak aloud whatever they are thinking during the test. For example, if the tester is looking for a Print button, we encourage the tester to say "I'm looking for a Print button." Using the "speak aloud" method allows the moderator or observer to take notes on what happened while the tester was trying to complete each scenario task.

This works well as long as testers are willing to talk out loud and give a "stream of consciousness" narration. Some testers do this better than others; some prefer not to do it at all. But without that input, we don't know why a tester had problems completing a task. Was the tester looking for a menu instead of an icon on the toolbar? Where on the screen was the tester looking for the solution? If we know the answers to these questions, we can better understand how users approach the software. In turn, the designers and developers can modify the interface to make the software easier to use.

That's why I wish eye tracking were more easily available. And with PyGaze, it looks like this may finally be within reach of open source usability testing! PyGaze is an open source Python toolbox that, among other things, provides eye tracking. You can learn more about PyGaze, including samples of heat maps, fixation maps, and scan paths, on the page that describes PyGaze Analyzer.
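To give a sense of what this might look like in practice, here's a minimal sketch of a PyGaze recording script, using the Display, Screen, and EyeTracker classes from the PyGaze documentation. The scenario text and task marker are my own placeholders, and the tracker type and screen settings come from PyGaze's constants.py file, so you would adapt all of this to your own setup:

    # a minimal PyGaze recording sketch; the tracker type and display
    # settings are read from your constants.py, so adjust for your hardware
    from pygaze.display import Display
    from pygaze.screen import Screen
    from pygaze.eyetracker import EyeTracker

    disp = Display()              # open the experiment window
    scr = Screen()                # a drawable canvas for stimuli
    tracker = EyeTracker(disp)    # connect to the configured eye tracker

    tracker.calibrate()           # run the tracker's calibration routine

    # show a (hypothetical) scenario task to the participant
    scr.draw_text(text="Find the Print button", fontsize=24)
    disp.fill(scr)
    disp.show()

    tracker.start_recording()
    tracker.log("task 1: find the Print button")   # mark the task in the log
    # ... the participant works on the scenario task here ...
    tracker.stop_recording()

    tracker.close()
    disp.close()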

Here's a sample image from the PyGaze website, showing an eye tracking session for a website. "Figure 7 shows that our volunteer first looked at the pictures on the documentation site of OpenSesame (an open-source graphical experiment builder that’s becoming popular in the social sciences), and then started to read the text."
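The heat maps, fixation maps, and scan paths mentioned above come from the companion PyGazeAnalyser module. As a rough sketch, generating a heat map over a screenshot of the tested interface might look like the following; the gaze data file name and its three-column layout are hypothetical, so adapt the loading step to whatever format your tracker logs:

    # a sketch using PyGazeAnalyser's detectors and gazeplotter modules;
    # "gazedata.txt" and its column layout are made-up examples
    import numpy
    from pygazeanalyser.detectors import fixation_detection
    from pygazeanalyser.gazeplotter import draw_heatmap

    # x, y: gaze coordinates in pixels; t: timestamps in milliseconds
    x, y, t = numpy.loadtxt("gazedata.txt", unpack=True)

    # group raw samples into fixations: (start, end, duration, endx, endy)
    sfix, efix = fixation_detection(x, y, t, maxdist=25, mindur=50)

    # overlay a fixation-weighted heat map on a screenshot of the interface
    draw_heatmap(efix, dispsize=(1920, 1080),
                 imagefile="screenshot.png",
                 savefilename="heatmap.png")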



As we do more usability testing with GNOME via Outreachy, I hope our future interns can get PyGaze working so we can examine eye tracking along with our other usability data.
(Image credit: Alper Çuğun)

2 comments:

  1. Using a webcam at a far distance from the eye (e.g. a laptop webcam) doesn't give good results.
    You need special hardware. You can find cheap, open source hardware like the Pupil:
    https://pupil-labs.com/
    But the Pupil is still under development. Maybe a plugin needs to be written for your use case.

  2. I think it's great to have many options for eye tracking! This will really help open source usability testing.
