General AI News
Jul 31, 2018 ● Robin Andrews
This AI Can Determine A Person's Personality Just By Tracking Their Eye Movements

Four of the Big Five personality traits could be determined just from a limited data set

Reminding us all that body language plays a major role in communication, an international research team has conjured up an artificial intelligence (AI) that is capable of something rather remarkable: It can approximate the type of personality you have based on how your eyes move.

Writing in Frontiers in Human Neuroscience, the team reveals that four of the so-called Big Five personality traits – neuroticism, extraversion, agreeableness, and conscientiousness – as well as “perceptual curiosity” (giving attention to interesting, novel stimuli) could be determined just from a limited data set. Only the last of the Big Five, openness, couldn’t be adequately determined.

“Besides allowing us to perceive our surroundings, eye movements are also a window into our mind and a rich source of information on who we are, how we feel, and what we do,” the University of Stuttgart-led team explain in their paper. Their new research not only supports this claim, but highlights subtle and previously neglected eye movements that give away aspects of someone’s personality.

The study used just 50 students and staff at Flinders University, the vast majority of whom were female, which means it’s too early to say how effective this AI’s predictive abilities truly are. That said, it remains an engrossing proof-of-concept study, so how did it work?

The participants had to wear a state-of-the-art, head-mounted, video-based eye tracker, which I’m sure looked absolutely fabulous. It recorded the subjects’ eye movements as they spent roughly 10 minutes on an everyday errand: buying an item of their choice from a campus shop.

At the same time, the subjects were quizzed on their personality traits using standard-issue psychological questionnaires. These would later be used to see how well the AI accomplished its unprecedented task.

The AI, developed by the team based on reams of pre-existing data, took into account plenty of theories regarding eye behavior. In crude terms, the manner in which we look at things gives hints as to the thought processes that are driving those ocular actions.

Such actions included specific types of eye movements, such as saccades – rapid eye movements designed to build up an integrated picture of our surroundings. Other behaviors, such as the length of time someone fixes their gaze on an object, or how variable their pupil diameter is during those fixations, were also taken into account.
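The article doesn’t reproduce the researchers’ processing pipeline, but the kinds of features it describes can be sketched. Below is a minimal, illustrative Python sketch of velocity-threshold (I-VT) event detection over raw gaze samples, producing a saccade rate, a mean fixation duration, and pupil-diameter variability within fixations. The function name, the 60 Hz sampling rate, and the 30 deg/s threshold are assumptions chosen for illustration, not values taken from the study.

```python
import numpy as np

SAMPLE_RATE_HZ = 60.0        # assumed tracker sampling rate (not from the study)
VELOCITY_THRESHOLD = 30.0    # deg/s; a common I-VT cut-off, also an assumption

def extract_gaze_features(gaze_deg, pupil_mm):
    """gaze_deg: (N, 2) gaze positions in degrees. pupil_mm: (N,) pupil diameters."""
    # Angular velocity between consecutive samples (deg/s)
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * SAMPLE_RATE_HZ
    is_saccade = velocity > VELOCITY_THRESHOLD

    # Count saccade events as transitions from fixation into saccade
    sac = is_saccade.astype(int)
    n_saccades = int(sac[0]) + int(np.sum(np.diff(sac) == 1))

    # Group consecutive below-threshold samples into fixation events
    fixation_durations, pupil_variability = [], []
    start = None
    for i, moving in enumerate(np.append(is_saccade, True)):  # sentinel closes last run
        if not moving and start is None:
            start = i
        elif moving and start is not None:
            fixation_durations.append((i - start) / SAMPLE_RATE_HZ)
            pupil_variability.append(float(np.std(pupil_mm[start:i])))
            start = None

    duration_s = len(velocity) / SAMPLE_RATE_HZ
    return {
        "saccade_rate_per_s": n_saccades / duration_s,
        "mean_fixation_duration_s": float(np.mean(fixation_durations or [0.0])),
        "mean_pupil_std_in_fixation": float(np.mean(pupil_variability or [0.0])),
    }
```

Per-person feature vectors like this are what a model would then map onto the questionnaire-derived trait labels.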

Using just those 10 minutes of data per person, the team found that the AI succeeded in gaining a broad understanding of the person behind the data. Noting that its predictions were “clearly above chance level,” the team also add that it’s not yet ready for practical applications.
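The article doesn’t detail the model, but a claim of “clearly above chance level” is typically established by comparing cross-validated accuracy against the scores obtained after shuffling the labels. Here is a minimal sketch using scikit-learn’s permutation test; the random-forest classifier and the synthetic 50-participant feature matrix are illustrative assumptions, not the study’s actual setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import permutation_test_score

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))  # 50 participants x 10 gaze features (synthetic)
y = (X[:, 0] + rng.normal(scale=0.5, size=50) > 0).astype(int)  # binary trait label

# Real cross-validated score vs. the score distribution under shuffled labels
score, perm_scores, p_value = permutation_test_score(
    RandomForestClassifier(n_estimators=200, random_state=0),
    X, y, cv=5, n_permutations=200, random_state=0,
)
print(f"accuracy={score:.2f}, chance~{perm_scores.mean():.2f}, p={p_value:.3f}")
```

The p-value reports how often label-shuffled runs match or beat the real score; a small value is what justifies calling a result “above chance” rather than a fluke of a small sample.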

One such application suggested by the team involves robots that socially interact with humans. Equipped with an AI like this, such robots could be trained to read these eye movements and pick up on subtle emotional cues, thereby allowing a more natural interaction.

The team also acknowledge that machine learning methods like theirs normally require millions of samples, and they hope that their next step will involve a more representative sample of the general population to better prove their AI’s worth. They’re also unsure how factors like mood, or even the person’s awareness of the eye-tracking equipment, affect the results.

Then there’s the curious nature of the AI’s inability to detect openness. At present, the researchers aren’t sure whether this is down to flaws in the experimental design or whether eye-tracking technology simply cannot detect it.

This study nevertheless represents a stepping stone for both technological and psychological sciences. Beyond getting AI to understand human personalities without a word being uttered, it’s also a curious insight into a currently unanswered question: How much of our minds do we give others access to just through our eye movements?


This article originally appeared in IFL Science
