
via University of Waterloo
New program recognizes users’ hands beside or near the keyboard and triggers operations based on different hand positions
Researchers are developing a new technology that uses hand gestures to carry out commands on computers.
The prototype, called “Typealike,” works with a regular laptop webcam and a simple affixed mirror. The program recognizes the user’s hands beside or near the keyboard and triggers operations based on different hand positions.
A user could, for example, place their right hand with the thumb pointing up beside the keyboard, and the program would recognize this as a signal to increase the volume. Different gestures and different combinations of gestures can be programmed to carry out a wide range of operations.
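To make the gesture-to-command idea concrete, the sketch below shows one way a dispatch table from recognized hand poses to programmable operations might look. The gesture labels, the handle_gesture function, and the print-based actions are hypothetical stand-ins; the article does not describe Typealike’s actual command set or API.

```python
# Hypothetical sketch of a gesture-to-command dispatch table.
# Gesture labels and actions are illustrative, not Typealike's actual mapping.

from typing import Callable, Dict

def volume_up() -> None:
    print("Increasing volume")       # stand-in for a real OS volume call

def volume_down() -> None:
    print("Decreasing volume")

def take_screenshot() -> None:
    print("Taking screenshot")

# Each recognized hand pose maps to a user-programmable operation.
GESTURE_COMMANDS: Dict[str, Callable[[], None]] = {
    "right_hand_thumb_up": volume_up,
    "right_hand_thumb_down": volume_down,
    "left_hand_flat_palm": take_screenshot,
}

def handle_gesture(label: str) -> None:
    """Run the command bound to a recognized gesture, ignoring unknown labels."""
    action = GESTURE_COMMANDS.get(label)
    if action is not None:
        action()

if __name__ == "__main__":
    handle_gesture("right_hand_thumb_up")   # prints "Increasing volume"
```

In a design like this, combinations of gestures could simply be additional keys in the table, which is one way “a wide range of operations” could be programmed by the user.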
The innovation in the field of human-computer interaction aims to make the user experience faster and smoother, with less need for keyboard shortcuts or for a mouse and trackpad.
“It started with a simple idea about new ways to use a webcam,” said Nalin Chhibber, a recent master’s graduate from the University of Waterloo’s Cheriton School of Computer Science. “The webcam is pointed at your face, but the most interaction happening on a computer is around your hands. So, we thought, what could we do if the webcam could pick up hand gestures?”
That initial insight led to the development of a small mechanical attachment that redirects the webcam downwards towards the hands. The team then created a software program capable of recognizing distinct hand gestures under varying conditions and for different users. The team used machine learning techniques to train the Typealike program.
“It’s a neural network, so you need to show the algorithm examples of what you’re trying to detect,” said Fabrice Matulic, senior researcher at Preferred Networks Inc. and a former postdoctoral researcher at Waterloo. “Some people will make gestures a little bit differently, and hands vary in size, so you have to collect a lot of data from different people with different lighting conditions.”
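As a rough illustration of what training on varied examples involves, here is a minimal sketch of how a small gesture classifier could be trained on labeled hand images, with brightness and geometric augmentation standing in for different lighting conditions and hand sizes. The network architecture, the “gestures” folder layout, and the augmentation choices are assumptions for the example, not the actual Typealike model or pipeline.

```python
# Minimal sketch of training a gesture classifier on labeled hand images.
# Architecture, dataset layout, and augmentations are illustrative assumptions,
# not the actual Typealike model or training pipeline.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

NUM_GESTURES = 10  # hypothetical number of gesture classes

# Jitter brightness/contrast and apply small shifts to mimic different
# lighting conditions and hand placements across users.
train_transforms = transforms.Compose([
    transforms.Resize((96, 96)),
    transforms.ColorJitter(brightness=0.4, contrast=0.4),
    transforms.RandomAffine(degrees=10, translate=(0.1, 0.1), scale=(0.9, 1.1)),
    transforms.ToTensor(),
])

# Expects images arranged as gestures/<class_name>/<image>.png (hypothetical path).
dataset = datasets.ImageFolder("gestures", transform=train_transforms)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# A small convolutional network; a real system may use a deeper architecture.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 24 * 24, NUM_GESTURES),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

The augmentation step is the part that speaks most directly to Matulic’s point: rather than relying solely on collecting more volunteers, varied training examples can also be synthesized by perturbing the images that were collected.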
The team recorded a database of hand gestures with dozens of research volunteers. The volunteers also completed tests and surveys that helped the team understand how to make the program as functional and versatile as possible.
“We’re always setting out to make things people can easily use,” said Daniel Vogel, an associate professor of computer science at Waterloo. “People look at something like Typealike, or other new tech in the field of human-computer interaction, and they say it just makes sense. That’s what we want. We want to make technology that’s intuitive and straightforward, but sometimes to do that takes a lot of complex research and sophisticated software.”
The researchers say there are further applications for the Typealike program in virtual reality, where it could eliminate the need for hand-held controllers.
Original Article: System recognizes hand gestures to expand computer input on a keyboard