Microsoft’s Kinect SDK Beta
A potential game-changer is coming to business in the form of gesture recognition, courtesy of Microsoft’s Kinect SDK, due for release in early 2012. A beta version of Kinect for Windows is already available for download from the Microsoft Research site, according to Chloe Albanesius’ PC Magazine article, giving developers and strategists an early look at its capabilities. Once the SDK is commercially released, businesses could develop and profit from their own applications.
The Kinect platform works without the need for a touch screen, joystick, or other manual input device. Instead, it uses gesture recognition, the user’s natural body movement, as input and responds accordingly. This opens up a new dimension for customer interaction, or even for robotic control in industry. Gesture recognition is already popping up in consumer items like Pantech’s latest smartphone offering, which lets calls taken while on the road be answered with a gesture instead of a touch. Apple’s gestures patent application may point to future iPad generations with gesture-based video-editing capabilities.
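To make the idea concrete, here is a minimal sketch of how a gesture recognizer might turn raw body tracking into input. This is hypothetical illustration code, not the Kinect SDK’s actual API: it assumes a sensor that reports a hand’s x-coordinate each frame, and the `detect_swipe` function and its `threshold` value are invented for the example.

```python
# Hypothetical sketch: recognizing a horizontal swipe from tracked hand
# positions. A depth sensor such as Kinect reports joint coordinates each
# frame; a simple recognizer watches for sustained motion in one direction.

def detect_swipe(x_positions, threshold=0.5):
    """Return 'right', 'left', or None based on net hand travel.

    x_positions: hand x-coordinates (metres) sampled over recent frames.
    threshold: minimum net displacement to count as a swipe (assumed value).
    """
    if len(x_positions) < 2:
        return None
    travel = x_positions[-1] - x_positions[0]
    if travel > threshold:
        return "right"
    if travel < -threshold:
        return "left"
    return None

# Example: the hand moves steadily from 0.0 m to 0.8 m across five frames.
print(detect_swipe([0.0, 0.2, 0.4, 0.6, 0.8]))  # right
```

A real recognizer would also filter sensor noise and track multiple joints, but the principle is the same: continuous body motion is reduced to a discrete command, with no touch screen or mouse involved.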
Microsoft reports that over 200 businesses are involved in the pilot program, though the types of applications that can be built with the Kinect SDK are as yet unknown. The potential for gesture technology to profoundly change the way we interact with computers, however, is obvious. Equally exciting is the potential to combine technologies such as gesture and speech recognition, something that could be exploited to great benefit. Consider, for example, an industry application that allows one person to use gestures and voice to control multiple robots, such as on an assembly line. Imagine being able to demonstrate to a dozen robots, command them to “fill the container like this,” and have them execute the behavior perfectly and in tandem.
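The assembly-line scenario above can be sketched as a small fusion loop: a spoken verb supplies the intent, a demonstrated gesture supplies the trajectory, and the combined command is broadcast to every robot at once. All of the names here (`Robot`, `broadcast`, the command format) are invented for illustration and do not correspond to any real SDK.

```python
# Hypothetical sketch: fusing a recognized speech command with a demonstrated
# gesture path, then broadcasting the result to several robots in tandem.

class Robot:
    def __init__(self, name):
        self.name = name
        self.last_command = None

    def execute(self, verb, path):
        # A real controller would translate the demonstrated path into
        # joint motions; here we just record what was commanded.
        self.last_command = (verb, tuple(path))

def broadcast(robots, verb, gesture_path):
    """Send the same (speech verb, gesture trajectory) pair to every robot."""
    for robot in robots:
        robot.execute(verb, gesture_path)

# One operator says "fill" while tracing a scooping motion; all three
# robots on the line receive the identical command.
line = [Robot(f"arm-{i}") for i in range(3)]
broadcast(line, "fill", [(0.1, 0.2), (0.3, 0.5), (0.2, 0.1)])
print([r.last_command[0] for r in line])  # ['fill', 'fill', 'fill']
```

The point of the sketch is the division of labor: speech recognition is good at naming the task, gesture capture is good at specifying the motion, and neither alone would make “fill the container like this” unambiguous.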
Perhaps the most compelling reason to take a careful look at gesture technology is its potential to completely overhaul the basic user interface. Think about how graphical user interfaces have evolved: from traversing menus via mouse clicks to toggles and swipes, the user experience has become more complex and, in many cases, less intuitive. Gestures could compress menu choices and simplify the customer experience. Customers of a clothing store, for example, could enter a virtual showroom, walk through it, pick clothing items off a rack, and even virtually try things on in a natural way that minimizes or even eliminates menus and mouse clicks.
Bookmark this page and check back regularly, as new articles on gesture recognition appear frequently.