Saturday 1 October 2016

 
    Google’s Project Soli to bring hand controls to wearables

    PROJECT SOLI: Hand-gesture technology aims to enable rich interactions with smart devices

Google is working to create next-generation hand-gesture technology that would enable rich interactions with smart devices regardless of screen size, to “drive applications across wearable, home automation, automotive, industrial and medical markets”.

    A video produced by the company’s Advanced Technology and Projects (ATAP) group shows a demonstration of how Project Soli works:

    “The hand is the ultimate input device; it’s extremely precise, it’s extremely fast and it’s very natural for us to use it,” says ATAP’s Ivan Poupyrev, founder of Project Soli.

    “We use radio frequency spectrum, radars, to track human hands. We’re using them to track micro motions, twitches of the human hand, and then use that to interact with wearables and Internet of Things and other computing devices.”
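The micro-motion tracking Poupyrev describes rests on the Doppler effect: a hand moving toward or away from the radar shifts the frequency of the reflected signal in proportion to its speed. The sketch below illustrates that relationship only; it is not Soli's actual signal chain, and the 60 GHz carrier frequency is an assumption the article does not state.

```python
# Illustrative sketch of the Doppler principle behind radar gesture
# sensing -- not Google's implementation. The 60 GHz millimetre-wave
# carrier is an assumed value, not taken from the article.

C = 3e8            # speed of light, m/s
F_CARRIER = 60e9   # assumed carrier frequency, Hz (wavelength = 5 mm)

def doppler_shift(hand_velocity_m_s: float) -> float:
    """Frequency shift (Hz) of the radar return from a hand moving
    radially at the given velocity: f_d = 2 * v * f_c / c."""
    return 2 * hand_velocity_m_s * F_CARRIER / C

# A slow finger twitch of 5 cm/s shifts the return by about 20 Hz;
# the short 5 mm wavelength is what makes such tiny motions resolvable.
print(doppler_shift(0.05))
```

Shorter wavelengths make the same hand motion produce a larger, easier-to-measure shift, which is one reason millimetre-wave radar suits fine gesture sensing.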
