
Illustration: Christine Daniloff/MIT
Crowd-sourced data yields system that determines where mobile-device users are looking.
For the past 40 years, eye-tracking technology — which can determine where in a visual scene people are directing their gaze — has been widely used in psychological experiments and marketing research, but it’s required pricey hardware that has kept it from finding consumer applications.
Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory and the University of Georgia hope to change that, with software that can turn any smartphone into an eye-tracking device. They describe their new system in a paper they’re presenting on June 28 at the Computer Vision and Pattern Recognition conference.
In addition to making existing applications of eye-tracking technology more accessible, the system could enable new computer interfaces or help detect signs of incipient neurological disease or mental illness.
“The field is kind of stuck in this chicken-and-egg loop,” says Aditya Khosla, an MIT graduate student in electrical engineering and computer science and co-first author on the paper. “Since few people have the external devices, there’s no big incentive to develop applications for them. Since there are no applications, there’s no incentive for people to buy the devices. We thought we should break this circle and try to make an eye tracker that works on a single mobile device, using just your front-facing camera.”
Khosla and his colleagues — co-first author Kyle Krafka of the University of Georgia, MIT professors of electrical engineering and computer science Wojciech Matusik and Antonio Torralba, and three others — built their eye tracker using machine learning, a technique in which computers learn to perform tasks by looking for patterns in large sets of training examples.
Strength in numbers
Khosla and his colleagues’ advantage over previous research was the amount of data they had to work with. Currently, Khosla says, their training set includes examples of gaze patterns from 1,500 mobile-device users. Previously, the largest data sets used to train experimental eye-tracking systems had topped out at about 50 users.
To assemble data sets, “most other groups tend to call people into the lab,” Khosla says. “It’s really hard to scale that up. Calling 50 people in itself is already a fairly tedious process. But we realized we could do this through crowdsourcing.”
In the paper, the researchers report an initial round of experiments, using training data drawn from 800 mobile-device users. On that basis, they were able to get the system’s margin of error down to 1.5 centimeters, a twofold improvement over previous experimental systems.
Since the paper was submitted, however, they’ve acquired data on another 700 people, and the additional training data has reduced the margin of error to about a centimeter.
To get a sense of how larger training sets might improve performance, the researchers trained and retrained their system using different-sized subsets of their data. Those experiments suggest that about 10,000 training examples should be enough to lower the margin of error to a half-centimeter, which Khosla estimates will be good enough to make the system commercially viable.
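As a rough illustration of that kind of extrapolation, the short Python sketch below fits a power-law learning curve to the two error figures reported above (1.5 centimeters at 800 users, about 1 centimeter at 1,500) and projects it forward. The power-law form is an assumption made here for illustration, not the researchers' actual analysis:

```python
import math

# Error figures reported in the article: (number of training users, gaze error in cm).
points = [(800, 1.5), (1500, 1.0)]

# Assume the error follows a power law, err(n) = a * n**b, a common but crude
# model of learning curves; two points pin down both parameters.
(n1, e1), (n2, e2) = points
b = math.log(e2 / e1) / math.log(n2 / n1)
a = e1 / n1 ** b

for n in (800, 1500, 10_000):
    print(f"{n:>6} users -> projected error {a * n ** b:.2f} cm")
```

Under that assumption, 10,000 training examples would bring the error below half a centimeter, consistent with Khosla's estimate.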
To collect their training examples, the researchers developed a simple application for devices that use Apple’s iOS operating system. The application flashes a small dot somewhere on the device’s screen, attracting the user’s attention, then briefly replaces it with either an “R” or an “L,” instructing the user to tap either the right or left side of the screen. Correctly executing the tap ensures that the user has actually shifted his or her gaze to the intended location. During this process, the device camera continuously captures images of the user’s face.
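The logic of a single trial is simple enough to sketch in a few lines. The Python below is a platform-neutral illustration of the validation idea only; the real recorder was an iOS app, and the function names, screen size, and frame count here are hypothetical:

```python
import random

SCREEN_W, SCREEN_H = 750, 1334  # assumed iPhone-class screen size, in points

def run_trial(get_tap_side, capture_frame):
    """One calibration trial: show a dot, prompt 'R' or 'L', and keep the
    captured frames only if the user taps the matching side of the screen,
    confirming that he or she was actually looking at the dot."""
    dot = (random.uniform(0, SCREEN_W), random.uniform(0, SCREEN_H))
    prompt = random.choice("RL")
    frames = [capture_frame() for _ in range(15)]  # camera runs continuously
    if get_tap_side() == prompt:                   # returns 'R' or 'L'
        return {"gaze_target": dot, "frames": frames}
    return None  # tap on the wrong side: discard the sample
```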
The researchers recruited application users through Amazon’s Mechanical Turk crowdsourcing site and paid them a small fee for each successfully executed tap. The data set contains, on average, 1,600 images for each user.
Tightening the net
The researchers’ machine-learning system was a neural network, a software abstraction that can be thought of as a huge network of very simple information processors arranged into discrete layers. Training adjusts the settings of the individual processors so that when a data item — in this case, a still image of a mobile-device user’s face — is fed to the bottom layer and passed through the subsequent layers, the output of the top layer is the solution to a computational problem — in this case, an estimate of the direction of the user’s gaze.
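A minimal sketch of such a network, written here in PyTorch, maps a face image fed to the bottom layer up through successive layers to a two-number output, the estimated on-screen gaze point. This is a toy illustration of the layered idea, not the architecture from the paper:

```python
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    """Toy gaze regressor: a face image in, an (x, y) screen estimate out."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(            # bottom layers: convolutions
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(                # top layers: regression head
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 2),                    # estimated gaze point (x, y)
        )

    def forward(self, face):                      # face: (batch, 3, 224, 224)
        return self.head(self.features(face))

gaze = GazeNet()(torch.randn(1, 3, 224, 224))     # -> tensor of shape (1, 2)
```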
Neural networks are large, however, so the MIT and Georgia researchers used a technique called “dark knowledge” to shrink theirs. Dark knowledge involves taking the outputs of a fully trained network, which are generally approximate solutions, and using those as well as the real solutions to train a much smaller network. The technique reduced the size of the researchers’ network by roughly 80 percent, enabling it to run much more efficiently on a smartphone. With the reduced network, the eye tracker can operate at about 15 frames per second, which is fast enough to record even brief glances.
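In code, the distillation step amounts to training the small network against a blend of the big network's outputs and the true labels. The sketch below, continuing in PyTorch, assumes a mean-squared-error objective and an even 50/50 blend; both choices are illustrative, not taken from the paper:

```python
import torch
import torch.nn as nn

def distillation_loss(student_out, teacher_out, target, alpha=0.5):
    """Blend of two objectives: match the trained teacher's (approximate)
    answers, and match the ground-truth gaze positions."""
    mse = nn.functional.mse_loss
    return alpha * mse(student_out, teacher_out) + \
           (1 - alpha) * mse(student_out, target)

def train_step(student, teacher, optimizer, faces, targets):
    with torch.no_grad():
        teacher_out = teacher(faces)   # teacher is fixed during distillation
    optimizer.zero_grad()
    loss = distillation_loss(student(faces), teacher_out, targets)
    loss.backward()
    optimizer.step()
    return loss.item()
```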
“In lots of cases — if you want to do a user study, in computer vision, in marketing, in developing new user interfaces — eye tracking is something people have been very interested in, but it hasn’t really been accessible,” says Noah Snavely, an associate professor of computer science at Cornell University. “You need expensive equipment, or it has to be calibrated very well in order to work. So something that will work on a device everyone has, that seems very compelling. And from what I’ve seen, the accuracy they get seems like it’s in the ballpark that you can do something interesting.”
“Part of the excitement is that they’ve also created this way of collecting data, and also the data set itself,” Snavely adds. “They did all the legwork that will make other people interested in this problem. And the fact that the community will start working on this will lead to fast improvements.”
Learn more: Eye-tracking system uses ordinary cellphone camera