Google may have just made every motion controller currently on the market obsolete.

At a time when most gesture-sensing technology is unreliable and clunky, Project Soli, one of the latest cutting-edge experiments from Google’s secretive Advanced Technology and Projects group (ATAP), provides an enticing example of the kind of powerful motion controller that could actually change how we interact with everything from smartwatches and tablets to appliances and other everyday objects.

At a basic level, motion controllers are premised on the idea that a user’s hands replace traditional input devices like touchscreens, mice and keyboards. Rather than touching a physical object — like a display or button — to control a device, you use hand gestures. Hand gestures, proponents say, make user interfaces more intuitive and easier to use, and open up new ways for designers and developers to create better user experiences.

Radar to gestures

Project Soli’s gesture tracking takes an unusual approach in that it depends on radar. Radar, which detects moving objects using high-frequency radio waves, enables what Project Soli’s design lead Carsten Schwesig calls a “fundamentally different approach” to motion tracking.

“A typical model of the way you think about radar is like a police radar or baseball where you just have an object and you measure its speed,” explains Schwesig.

“But actually we are beaming out a continuous signal that gets reflected by an arm, for example…so you measure the differences between the emitted and the received signal. It’s a very complex wave signal and from that we can [apply] signal processing and machine learning techniques to detect gestures.”
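To make that concrete, here is a minimal, hypothetical sketch of the kind of pipeline Schwesig describes: simulate a continuous wave reflecting off a moving hand, mix it with the emitted tone to measure the difference between the two (the Doppler shift), and hand that feature to a crude classifier. Google hasn’t published Soli’s actual signal model or algorithms, so the sample rate, frequencies, features and gesture labels below are all invented for illustration.

```python
# Illustrative sketch only: the signal model, feature choice and "classifier"
# are assumptions meant to show the general idea (emit a continuous wave,
# compare it with the reflection, then apply signal processing and a learned
# mapping to gestures), not Soli's real pipeline.
import numpy as np

FS = 10_000          # assumed sample rate (Hz) for this toy example
F_TX = 1_000         # assumed transmit tone (Hz); a real radar runs at GHz
DURATION = 0.1       # seconds of signal per "frame"

def received_signal(hand_velocity_mps, wavelength_m=0.005):
    """Simulate a continuous wave reflected off a hand moving at a given speed.

    Motion shifts the reflected frequency (the Doppler effect); that shift is
    what the sensor ultimately measures.
    """
    t = np.arange(0, DURATION, 1 / FS)
    doppler_shift = 2 * hand_velocity_mps / wavelength_m  # standard Doppler formula, Hz
    return np.cos(2 * np.pi * (F_TX + doppler_shift) * t), t

def extract_feature(rx, t):
    """Mix the received signal with the emitted one and read off the beat
    frequency -- i.e. 'measure the differences between the emitted and the
    received signal'."""
    tx = np.cos(2 * np.pi * F_TX * t)
    mixed = rx * tx                                   # down-convert by mixing
    spectrum = np.abs(np.fft.rfft(mixed))
    freqs = np.fft.rfftfreq(len(mixed), 1 / FS)
    low = freqs < F_TX                                # keep the difference (baseband) term
    return freqs[low][np.argmax(spectrum[low])]       # dominant beat frequency, Hz

# Stand-in for the machine-learning step: label a frame by the nearest
# prototype beat frequency. A real system would use far richer features.
PROTOTYPES = {"still": 0.0, "slow swipe": 40.0, "fast flick": 200.0}

def classify(beat_hz):
    return min(PROTOTYPES, key=lambda g: abs(PROTOTYPES[g] - beat_hz))

if __name__ == "__main__":
    for velocity in (0.0, 0.1, 0.5):                  # hand speeds in m/s
        rx, t = received_signal(velocity)
        beat = extract_feature(rx, t)
        print(f"{velocity:.1f} m/s -> beat {beat:.0f} Hz -> {classify(beat)}")
```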


Of course, gesture-based controllers are not, in themselves, new. Companies like Leap Motion and, more recently, Intel (via RealSense) have been experimenting with motion controllers for some time. But these systems rely on cameras for their motion-tracking abilities, which limits the effectiveness and accuracy of the devices, says Schwesig.



Since Soli’s sensors can capture motion at up to 10,000 frames per second, it is much more accurate than camera-based systems, which track motion at much lower frame rates, Schwesig says. And unlike cameras, radar can pass through certain types of objects, making it adaptable to more form factors than a camera.

“You can do things you would never be able to do with a camera,” Schwesig tells Mashable. “The speed doesn’t mean you have to move extremely fast, it just means you can detect [with] very high accuracy.”

Where it’s going

When project lead Ivan Poupyrev demoed it onstage during I/O, he talked about Project Soli mainly in the context of smartwatches. As displays shrink, he said, interacting with devices becomes increasingly difficult. Even the most responsive smartwatch displays can be difficult to navigate in some situations. But Soli’s utility isn’t limited to wearables. In its current form, its radar tech lives in a single tiny chip that can be embedded in just about any type of device, even objects that don’t have a traditional display.


“It’s in chip form, [since] there are no moving parts involved it can be embedded inside devices, it can work through some materials, we can reimagine this in everyday objects or even with existing products,” Schwesig said.


Imagining gesture interfaces on everyday objects is particularly intriguing: ATAP used the example of an analog radio where gestures control the volume and station. But it could be applied to any number of use cases. Soli’s sensors can detect motion at a range of about two to three feet, Schwesig says, so any device you use within that range stands to benefit. Imagine dismissing a smartphone notification with the wave of a hand or pressing your fingers together to play music from a Bluetooth speaker.
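As a purely illustrative example — Soli’s real developer API hasn’t been released, so the gesture names and callback interface here are invented — this is roughly how a developer might wire recognized gestures to the kinds of everyday actions described above:

```python
# Hypothetical sketch: map gesture labels coming out of a recognizer to
# device actions (dismiss a notification, start music, turn up a radio).
# None of these names correspond to Soli's actual SDK.
from typing import Callable, Dict

class GestureController:
    """Dispatches recognized gestures (plain string labels) to actions."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], None]] = {}

    def on(self, gesture: str, action: Callable[[], None]) -> None:
        self._handlers[gesture] = action

    def handle(self, gesture: str) -> None:
        # Unrecognized gestures are simply ignored.
        handler = self._handlers.get(gesture)
        if handler:
            handler()

controller = GestureController()
controller.on("wave", lambda: print("Notification dismissed"))
controller.on("finger_press", lambda: print("Playing music on Bluetooth speaker"))
controller.on("dial_turn", lambda: print("Radio volume up"))

# Simulate a stream of gesture labels arriving from the sensor.
for g in ["wave", "dial_turn", "unknown", "finger_press"]:
    controller.handle(g)
```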

What’s next

ATAP plans to open up Soli to developers by offering a development kit that will enable them to build their own applications for the gesture controller, though the team hasn’t shared timing or exactly how it plans to make the kit available. It’s possible Project Soli’s developer rollout could take an approach similar to ATAP’s Project Ara. With Ara, people submitted applications for a chance to get their hands on the project’s development hardware and participate in developer conferences.

Even with a planned developer rollout, Project Soli is still very much an experiment. Where the radar-based gesture tech eventually ends up will likely depend on developer response and the level of interest from hardware manufacturers. One obvious example would be Android Wear watchmakers: It’s no secret that Android Wear sales have been lackluster since the first devices went on sale last year. Soli’s new gesture-based interface could potentially revitalize sales.

But the size and flexibility of the chip itself leaves many, many more possibilities open. In hands-on demos at Google I/O, ATAP focused more on demonstrating Soli’s gesture-recognition capabilities than on specific implementations. A prototype I saw used visualizations to show how the chip was able to detect and respond to various hand motions. The shape and position of the visualizations changed in response to how people moved their hands.

So for now, it seems it will be up to developers and hardware manufacturers to help determine the future of Project Soli. ATAP has already shown, in just one day, that the potential is huge. Now, the self-described “small band of pirates” that likes “epic shit” just needs to let in developers and others who will be able to help take the project out of ATAP’s lab and, hopefully, into our homes.


Source: https://mashable.com/2015/05/30/google-project-soli-analysis/#BRTNYSapcgqI