How does a computer know where you’re looking?

By Gugu Lourie | 2016-09-02

Imagine driving a car, using a heads-up display projected on the windshield to navigate through an unfamiliar city. This is augmented reality (AR): the information not only guides you along a route, but also alerts you to salient information in your surroundings, such as cyclists or pedestrians. The correct placement of virtual content is not only crucial; it may be a matter of life and death.

By Ann McNamara

Information can't obscure other material, and it should be displayed long enough for you to understand it, but not much longer than that. Computer systems have to make these determinations in real time, without letting any of the information become distracting or obtrusive. We certainly don't want a warning about a cyclist about to cross in front of the car to obscure the cyclist herself!

As a researcher in AR, I spend a lot of time trying to figure out how to get the right information onto a user’s screen, in just the right place, at just the right moment. I’ve learned that showing too much information can confuse the user, but not showing enough can render an application useless. We have to find the sweet spot in between.

A crucial element of this, it turns out, is knowing where users are looking. Only then can we deliver the information they want in a location where they can process it. Our research involves measuring where the user is looking in the real scene, as a way to help decide where to place virtual content. With AR poised to infiltrate many areas of our lives – from driving to work to recreation – we’ll need to solve this problem before we can rely on AR to provide support for serious or critical actions.

Determining where to put information

It makes sense to have information appear where the user is looking. When navigating, a user could look at a building, street or other real object to reveal the associated virtual information; the system would know to hide all other displays to avoid cluttering the visible scene.

But how do we know what someone is looking at? It turns out that the nuances of human vision allow us to examine a person's eyes and calculate where they are looking. By pairing those data with cameras showing the person's field of view, we can determine what the person is seeing and what he or she is looking at.
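
To make that concrete, here is a minimal sketch, in Python with OpenCV, of the first step an eye tracker performs: locating the pupil in an eye-camera frame. It assumes a hypothetical dark-pupil infrared setup, and the threshold and blur values are illustrative only; production trackers also detect corneal reflections and filter far more robustly.

```python
# Minimal sketch: locating the pupil in an eye-camera frame.
# Assumes a dark-pupil infrared setup; the threshold value and blur
# kernel are illustrative and would need tuning for real footage.
import cv2
import numpy as np

def find_pupil_center(eye_frame: np.ndarray):
    """Return the (x, y) pixel of the pupil center, or None if not found."""
    gray = cv2.cvtColor(eye_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Under IR illumination the pupil appears as the darkest large blob.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)  # largest dark region
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # contour centroid
```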

Eye-tracking systems first emerged in the 1900s. Originally they were mostly used to study reading patterns; some could be very intrusive for the reader. More recently, real-time eye-tracking has emerged and become more affordable, easier to operate and smaller.

Eye-tracking spectacles can be relatively compact. Anatolich1, CC BY-SA

Eye trackers can be attached to a screen or integrated into wearable glasses or head-mounted displays. Eyes are tracked using a combination of cameras, projected light and computer vision algorithms that calculate the position of the eye and the gaze point on a monitor.
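
Once the pupil has been located, a calibration step maps its position in the eye camera to a gaze point on the display. A common approach is to fit a low-order polynomial while the user fixates a few known on-screen targets; the sketch below assumes that setup (the feature set and least-squares fit are a common illustrative choice, not any particular tracker's method).

```python
# Minimal sketch of gaze calibration: fit a second-order polynomial that
# maps pupil coordinates (eye camera) to gaze coordinates (screen), using
# a handful of calibration fixations on known targets.
import numpy as np

def _features(point):
    x, y = point
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_calibration(pupil_pts, screen_pts):
    """Least-squares fit from pupil (x, y) samples to screen (x, y) targets."""
    A = np.array([_features(p) for p in pupil_pts])   # (n_points, 6)
    B = np.asarray(screen_pts, dtype=float)           # (n_points, 2)
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)    # (6, 2) coefficients
    return coeffs

def gaze_point(pupil_xy, coeffs):
    """Map a new pupil position to an estimated on-screen gaze point."""
    return _features(pupil_xy) @ coeffs
```

With, say, nine calibration targets, the fitted coefficients can then turn every new pupil position into an estimated screen coordinate in real time.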

We generally look at two measures when examining eye tracking data. The first is called a fixation, and is used to describe when we pause our gaze, often on an interesting location in a scene because it has caught our attention. The second is a saccade, one of the rapid eye movements used to position the gaze. Basically, our eyes quickly dart from place to place taking in pieces of information about parts of a scene. Our brains then put the information from these fixations together to form a visual image in our minds.

Short periods of fixation are followed by quick movements, called saccades.
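
One widely used way to separate the two is a velocity threshold (often called I-VT): samples whose point-to-point speed exceeds a threshold are labelled saccades, and the rest belong to fixations. Below is a minimal sketch under that assumption; the pixel-based threshold is illustrative, since real implementations usually work in degrees of visual angle.

```python
# Minimal sketch of velocity-threshold (I-VT) classification: samples whose
# point-to-point speed exceeds the threshold are saccades, the rest belong
# to fixations. The pixel-based default threshold is illustrative only.
import numpy as np

def classify_ivt(t, x, y, velocity_threshold=1000.0):
    """t in seconds, x/y in pixels; returns one label per sample."""
    t, x, y = (np.asarray(a, dtype=float) for a in (t, x, y))
    dt = np.maximum(np.diff(t), 1e-9)              # guard against zero gaps
    speed = np.hypot(np.diff(x), np.diff(y)) / dt  # pixels per second
    labels = np.where(speed > velocity_threshold, "saccade", "fixation")
    # Each speed describes the movement *into* a sample; reuse the first label.
    return np.concatenate([labels[:1], labels])
```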

Combining eye tracking with AR

Often AR content is anchored to a real-world object or location. For example, a virtual label containing a street name should be displayed on that street. Ideally, we would like each AR label to appear close to the real object it is associated with. But we also need to be careful not to let multiple AR labels overlap and become unreadable. There are many approaches to managing label placement. We're exploring one option: calculating where the person is looking in the real scene and displaying AR labels only in that spot.
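
A minimal sketch of that option follows: given candidate labels anchored at screen positions and the current gaze estimate, show only the labels whose anchors fall within a window around the gaze point. The 120-pixel radius is an assumption for illustration, not a recommended value.

```python
# Minimal sketch of the gaze-contingent option described above: of all
# candidate AR labels (each anchored to a screen position), draw only
# those whose anchor lies within a window around the current gaze point.
import math

Label = tuple[str, tuple[float, float]]  # (text, (anchor_x, anchor_y))

def visible_labels(labels: list[Label], gaze_xy, radius: float = 120.0):
    """Return the labels close enough to the gaze point to be shown."""
    gx, gy = gaze_xy
    near = [(text, (x, y)) for text, (x, y) in labels
            if math.hypot(x - gx, y - gy) <= radius]
    # Sort farthest-first so the nearest label is drawn last, on top.
    near.sort(key=lambda lbl: math.hypot(lbl[1][0] - gx, lbl[1][1] - gy),
              reverse=True)
    return near
```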

Augmented reality can provide additional information to shoppers. Augmented reality image via shutterstock.com

Say, for example, a user is interacting with a mobile application that helps him shop for low-calorie cereal in the grocery store. In the AR application, each cereal has calorie information associated with it. Rather than physically picking up each cereal box and reading the nutritional content, the user can hold up his mobile device and point it at a particular cereal box to reveal the relevant information.

But think about how crowded a store’s cereal aisle is with various packages. Without some way to manage the display of AR labels, the calorie information labels for all the cereal boxes would be displayed. It would be impossible to identify the calorie content for the cereal he is interested in.

By tracking his eyes, we can determine which individual cereal box the user is looking at. Then we display the calorie information for that particular cereal. When he shifts his gaze to another box, we display the figures for the next one he considers. His screen stays uncluttered, the information he wants is readily available, and when he needs additional information, we can display that too.
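
Here is a sketch of how such a selection loop might work: hit-test the gaze point against each product's bounding box, and commit to a new box only after the gaze has dwelt on it briefly, so the overlay does not flicker during saccades. The rectangle format and the 0.3-second dwell time are assumptions for illustration.

```python
# Minimal sketch of gaze-based product selection: hit-test the gaze point
# against each box's screen rectangle, and only commit to a new box after
# the gaze has dwelt on it for DWELL_SECONDS, so the calorie overlay does
# not flicker as the eyes saccade across the shelf.

DWELL_SECONDS = 0.3  # illustrative assumption

class GazeSelector:
    def __init__(self):
        self._candidate = None   # box currently under the gaze
        self._since = 0.0        # when the gaze first landed on it
        self.selected = None     # box whose info is being displayed

    def update(self, gaze_xy, boxes, now):
        """gaze_xy: (x, y); boxes: {name: (x0, y0, x1, y1)}; now: seconds."""
        gx, gy = gaze_xy
        hit = next((name for name, (x0, y0, x1, y1) in boxes.items()
                    if x0 <= gx <= x1 and y0 <= gy <= y1), None)
        if hit != self._candidate:
            self._candidate, self._since = hit, now  # gaze moved: restart timer
        elif hit is not None and now - self._since >= DWELL_SECONDS:
            self.selected = hit                      # stable gaze: switch info
        return self.selected
```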

This type of development makes it an exciting time for AR research. Our ability to integrate real-world scenes with computer graphics on mobile displays is improving. This fuels the prospect of creating stunning new applications that expand our ability to interact with, learn from and be entertained by the world around us.

  • Ann McNamara, Associate Professor of Visualization, Texas A&M University
  • This article was originally published on The Conversation. Read the original article.
