Welcome to Project Soli

Poupyrev: My name is Ivan Poupyrev, and I work for the Advanced Technology and Projects group at Google. The hand is the ultimate input device. It's extremely precise, it's extremely fast, and it's very natural for us to use. Capturing the possibilities of the human hand has been one of my passions. How could we take this incredible capability, the finesse of human action and the finesse of using our hands, and apply it to the virtual world? We use the radio frequency spectrum, which is radar, to track the human hand. Radars have been used for many different things: to track cars, big objects, satellites, and planes. We're using them to track the micro motions, the twitches, of the human hand, and then use that to interact with wearables, the Internet of Things, and other computing devices.

Lien: Our team is focused
on taking radar hardware and turning it into a gesture sensor. Radar is a technology that transmits a radio wave towards a target, and the receiver of the radar intercepts the energy reflected from that target. The reason we're able to interpret so much from this one radar signal is the full gesture recognition pipeline that we've built. The various stages of this pipeline are designed to extract specific gesture information from this one radar signal that we receive at a high frame rate.

Amihood: From these strange,
foreign range-Doppler signals, we are actually interpreting human intent.

Karagozler: Radar has some unique properties compared to cameras, for example. It has very high positional accuracy, which means that you can sense the tiniest motions.

Schwesig: We arrived
at this idea of virtual tools because we recognized that there are certain archetypes of controls, like a volume knob or a physical volume slider. Imagine a button between your thumb and your index finger. The button's not there, but pressing it is a very clear action, and there's actual physical haptic feedback as you perform it. The hand can both embody a virtual tool and, you know, act on that virtual tool at the same time. So if we can recognize that action, we have an interesting direction for interacting with technology.

Poupyrev: So when we started
this project, you know, me and my team, we looked at the project idea, and we thought, "Are we gonna make it or not? Eh, we don't know." But we had to do it, because unless you do it, you don't know.

Raja: What I think I'm most proud of about our project is that we have pushed the processing power of the electronics itself further out, to do the sensing part for us.

Poupyrev:
The radar has a property that no other technology has: it can work through materials, and you can embed it into objects. It allows us to track really precise motions. And what is most exciting about it is that you can shrink the entire radar and put it in a tiny chip. That's what makes this approach so promising. It's extremely reliable. There's nothing to break, no moving parts, no lenses. There's nothing, just a piece of sand on your board.

Schwesig: Now we are at a point
where we have the hardware, where we can sense these interactions and put them to work. We can explore how well they work and how well they might work in products.

Poupyrev: It usually blows your mind when you see the things people do. That's what I'm really looking forward to. I'm really looking forward to releasing this to the development community, and I really want them to be excited and motivated to do something cool with it.

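As a rough illustration of the pipeline Lien and Amihood describe: a radar front end delivers, frame after frame, a 2-D array of received samples (chirps × samples per chirp), and a 2-D FFT over that array produces the range-Doppler map from which gesture features can be extracted. The sketch below, in Python with NumPy, is a generic FMCW-style example, not Soli's actual implementation; the frame dimensions, windowing choice, and synthetic target are all assumptions for illustration.

```python
import numpy as np

def range_doppler_map(frame: np.ndarray) -> np.ndarray:
    """Turn one radar frame into a range-Doppler magnitude map.

    frame: complex samples, shape (n_chirps, n_samples_per_chirp),
           i.e. slow time x fast time, as an FMCW front end might deliver.
    """
    # Window along fast time to suppress FFT sidelobes (illustrative choice).
    win = np.hanning(frame.shape[1])
    # FFT along fast time -> range bins.
    range_profile = np.fft.fft(frame * win, axis=1)
    # FFT along slow time -> Doppler (velocity) bins, centered on zero Doppler.
    rd = np.fft.fftshift(np.fft.fft(range_profile, axis=0), axes=0)
    return np.abs(rd)

# Synthetic frame: one target at a fixed range with a small radial velocity.
# Energy away from the zero-Doppler row is what reveals the tiny motions;
# downstream pipeline stages would extract gesture features from such maps.
frame = np.exp(2j * np.pi * (0.1 * np.arange(64)[None, :]
                             + 0.05 * np.arange(16)[:, None]))
rd_map = range_doppler_map(frame)
print(rd_map.shape)  # (16, 64): 16 Doppler bins x 64 range bins
```

The separation into fast-time and slow-time FFTs is what lets a single signal carry both distance and motion information: range comes from the frequency within each chirp, velocity from the phase progression across chirps.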