The ongoing Google I/O 2015 is a two-day event, and the first day saw the announcement of some key changes in the world of Android: from Android M, to the new Photos app, to the Google Cloud Messaging service. But the true innovations were announced on the second day, when the Advanced Technology and Projects (ATAP) team took the stage.
One of their major announcements was that of Project Soli – a groundbreaking method of interacting with wearable devices.
Here’s all you need to know about it!
Project Soli aims to remove the intermediary input device (such as a keyboard, mouse or even a touchscreen) and make the human body itself the sole source of input for a wearable device. This is accomplished through gesture sensing and recognition.
Gesture control exists today in a multitude of applications and devices (such as the Leap Motion). However, the problem with gesture control today is accuracy: more often than not, gestures have to be repeated before they register.
Project Soli uses radar to detect hand motions. As the ATAP team explained it, it is a radar-based sensor capable of tracking extremely minute motions with high speed and accuracy: things like sliding the thumb along the index finger to change the volume, or tapping the tips of the thumb and index finger together to press a button.
At the heart of Project Soli is a tiny radar chip with a refresh rate of up to 10,000 frames per second, which further improves the accuracy.
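To make the idea concrete, here is a minimal sketch of how recognized micro-gestures might be dispatched to UI actions on a wearable. This is purely illustrative: the gesture names and the `GestureHandler` class are assumptions, not part of any Google SDK.

```python
# Hypothetical sketch: mapping Soli-style micro-gestures to UI actions.
# The gesture labels ("thumb_slide", "finger_tap") and this API are
# illustrative assumptions, not Google's actual developer interface.

class GestureHandler:
    def __init__(self):
        self.volume = 50          # current volume level, 0-100
        self.button_pressed = False

    def on_gesture(self, gesture, magnitude=0.0):
        """Dispatch a recognized gesture to the matching UI action."""
        if gesture == "thumb_slide":
            # Sliding the thumb along the index finger adjusts volume;
            # magnitude is the signed slide distance reported by the sensor.
            self.volume = max(0, min(100, self.volume + round(magnitude * 10)))
        elif gesture == "finger_tap":
            # Tapping thumb and index fingertips acts as a button press.
            self.button_pressed = True


handler = GestureHandler()
handler.on_gesture("thumb_slide", magnitude=0.8)  # slide up: 50 -> 58
handler.on_gesture("finger_tap")                  # virtual button press
```

In a real pipeline the hard part is upstream of this dispatcher: classifying those gestures from 10,000 radar frames per second, which is where Soli's accuracy claims lie.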
EVOLUTION OF PROJECT SOLI
According to the ATAP keynote, the team took just 10 months to shrink the hardware from the size of a gaming console down to a tiny chip, an impressive feat of engineering.
As with many of its innovative ventures, Google will release Project Soli to developers so they can come up with applications and uses. Although Google hasn’t given a specific date yet, the chip is expected to appear inside devices sometime later this year.