When Facebook rebranded itself as Meta in October 2021, it was widely reported that the company would focus on virtual reality (VR), positioning itself at the forefront of the metaverse.

But Meta has not given up on the world of bricks and mortar yet, as reflected by the company’s massive investment in augmented reality (AR) glasses.

My research considers smart real estate and human-computer interactions in smart environments.

Meta is only one among many companies betting that the future of physical space will involve merging with digital space, resulting in an augmentation of our reality. Apple, Google, Snap, Microsoft and a string of other tech companies are working on AR wearables: AR glasses, smart contact lenses and AR headsets.

Insight into the subconscious

As part of its Reality Labs, Meta spearheads Project Aria, which drives the pilot development of AR glasses under the umbrella of a research experiment undertaken with academic partners. The company promises that AR glasses will let users switch on a lamp simply by looking at it, or quickly find misplaced keys.

However, there is one dimension of AR wearables that developers of such devices tend to downplay or ignore altogether: eye tracking, and what information about the way we interact with the world through our gazes and eye movements is captured and analyzed.

Psychologists have long established that eye movements are unfiltered signals, offering insight into humans' subconscious cognition.