Google has received a waiver from the Federal Communications Commission to continue work on a radar-based project that lets users control devices with hand gestures, providing interaction that would typically require touching the device.
The project requires the radar to operate at higher power levels than what is currently allowed.
Project Soli -- announced at I/O 2015 by Google’s Advanced Technology and Projects (ATAP) group -- does not require a touchscreen. Instead, radar senses the motion of the user’s fingers: a radar beam emitted from a chip-sized sensor captures and processes fine movements, serving as an input method.
The radar senses motion and follows the direction of the fingers, such as a swipe up or down to search on a topic, or the thumb and forefinger brought together to replicate turning a knob and raise or lower the device’s volume.
Google’s website describes Project Soli as having sensors in a chip that can track submillimeter motions at high speeds. The technology “captures the possibilities of the human hand” for virtual use to track “micro-motions.”
These motions can help Google interpret human intent, according to Patrick Amihood, software lead on Project Soli. The radar can sense the tiniest motions.
The waiver describes the radar-based technology as a way to “capture motion in a three-dimensional space using a radar beam to enable touchless control of device functions or features, which can benefit users with mobility, speech and tactile impairments.”
Google in early 2018 asked the FCC to allow its short-range interactive motion sensing radar technology to operate in the 57- to 64-GHz frequency band at power levels consistent with European Telecommunications Standards Institute standards.