
Figure 1. A study participant playing GetH2O on a mobile phone. The red dot indicates the point of gaze.
When we play test games at Valsplat, we often use eye tracking to enrich our observations. Eye tracking leaves less room for assumptions and helps us better understand player behavior. As a result, we—and the game developers we work with—can make more informed decisions, and therefore better games.
Apart from providing a general aid for observation, eye tracking can reveal unique things about a player. It can offer qualitative insights that would be difficult to obtain otherwise. This article is about those insights.
Visual Cues within the Game
Eye tracking can help us determine whether players see and understand the visual cues within a game. For example, in a playtest for the PlayStation 3 first-person shooter Killzone 3, players walked through a jungle level with some patches of dense forest. Eye tracking revealed that players were scanning the screen but could not find their way out of the denser parts: it was simply too dark to see the exit.
As a solution, the developers added light cues in the jungle, subtly hinting at the correct path without making it too easy. In the subsequent test, we found that players could find their way out after some searching.
In the mobile version of the peace-building game GetH2O (see Figure 1), eye tracking allowed us to see that players didn’t always understand the visual language in the game. Some objects were not recognized. For example, when the game reported that a river was polluted, players would scan the screen looking for a polluted river. Even though they looked at the brown areas indicating the polluted river, they kept on searching. This observation revealed that players didn’t recognize the representation of the polluted river.
Heads-up Displays
Eye tracking can also identify issues with a game's heads-up display (HUD). During play, the HUD shows game-related information such as score or ammunition. We playtested SXPD, a high-speed pursuit racing game for the iPad (see Figure 2). The initial test pinpointed two major HUD issues; the game was then revised and retested.

Figure 2. In our first study, players of SXPD overlooked buttons and enemies. These problems were solved in this improved version of the game.
The first problem was that the fire and brake buttons, essential elements of the game, were overlooked. When the game starts, you immediately begin racing at very high speed. Eye tracking showed that players were fully focused on the center of the screen, paying attention only to racing and avoiding obstacles (steering was done using the iPad’s gyroscope), and missed the buttons on the left and right sides of the screen.
For the second test, the developers made a subtle change: the engine starts racing immediately, but the vehicle is not yet controllable, and a countdown shows how many seconds remain until the race starts. This change worked quite well. During the countdown, players noticed the fire buttons on the left and right sides of the screen. The brake buttons, however, were still overlooked.
The second issue in the first test was that players didn’t know where their enemies were. When enemies were racing behind the player, red arrows indicated their location; the farther behind an enemy was, the smaller its arrow. Eye tracking showed that players were so focused on racing that they didn’t look at the arrows. Even when enemies were onscreen and in plain sight, players didn’t always see them; the game was in black and white, and enemies didn’t stand out.
For the second test, the HUD was stripped to its bare essentials. The arrows indicating offscreen enemies were enlarged. Onscreen enemies now had red circles around them. Eye tracking showed that players now noticed enemies, both onscreen and offscreen.
The Game Menu
Eye tracking can also be very useful for identifying issues with the game menu. When we tested Gamepoint, a large casual gaming portal (see Figure 3), players were overwhelmed by the visual design of the menu. They would scan the entire screen, looking for cues about where to go, but the most important buttons and menu items didn’t draw their attention. As a result of the study, the menu will be cleaned up and the most important buttons and flows will be emphasized.

Figure 3. The players of Gamepoint found it difficult to locate the primary action buttons.
Instructions
When testing a web game for high school kids, we observed that they rarely read pre-game instructions. Players quickly scanned the text, and as soon as they saw the “Next” button, they clicked it. In the GetH2O mobile phone game, we saw adults doing the same thing. Players were looking for action; they wanted to play, not read. Based on the eye tracking results, the developers plan to shorten the instructions to the essentials and make the information more visually appealing.
By contrast, in the SXPD game, the pre-game story is told in a graphic novel style. Eye tracking showed that players read the story and looked at the drawings. So when text is visually appealing, it can encourage the player to read, at least in pre-game segments.
Conclusion
When used during playtesting, eye tracking can reveal what draws a player’s attention. This additional layer of information can clarify player behavior both in general and in specific interface areas such as visual cues, HUDs, game menus, and instructions. And even though eye tracking is not the Holy Grail, it’s still a very useful tool to help developers understand their players and, in the end, make better games.
Retrieved from https://oldmagazine.uxpa.org/better-games-insights-from-eye-tracking/