Our smartphones are full of personal information and sensitive data, and sometimes we have to pull up info that we'd rather keep private. In a public setting, though, it's not all that difficult for someone to look over your shoulder and see everything you're doing. Now, a couple of Google researchers have figured out a way to detect when someone else is looking at your phone.
Hee Jung Ryu and Florian Schroff, two researchers at Google, developed a method to use the front-facing camera on a smartphone (in their case a 2016 Pixel) to detect not just when another person’s face is in view, but when their eyes are actually looking at the phone’s display.
In an early test version of the feature shown off to Quartz, the app recognizes when someone else is looking at the screen and immediately switches from Hangouts to a camera viewfinder that highlights the onlooker. Once they turn away, it switches back to the previously open app.
While this is obviously still in a rough state, it seems to work remarkably well. The researchers claim it works in most lighting conditions and can spot an onlooker's gaze in as little as 2 milliseconds. They say it can run that quickly because the entire process happens on the phone itself rather than on cloud-based servers, thanks in large part to Google's TensorFlow machine learning framework.
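To give a rough sense of what that on-device approach looks like, here is a minimal Kotlin sketch using the TensorFlow Lite Interpreter. The model file name, input format, output layout, and threshold are all assumptions for illustration only; Google has not published the model or code behind this demo.

```kotlin
// Hypothetical sketch of on-device inference with TensorFlow Lite.
// "gaze_model.tflite" and the single-score output are assumptions, not Google's actual model.
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer

class GazeDetector(modelFile: File) {
    private val interpreter = Interpreter(modelFile)

    // Returns true when the model scores the (preprocessed) front-camera frame
    // as containing a second pair of eyes directed at the screen.
    fun isOnlookerPresent(frame: ByteBuffer): Boolean {
        val output = Array(1) { FloatArray(1) }  // one probability score (assumed layout)
        interpreter.run(frame, output)           // inference runs entirely on the device
        return output[0][0] > 0.5f               // arbitrary example threshold
    }
}

// Example usage (hypothetical): called for each preprocessed front-camera frame.
// if (detector.isOnlookerPresent(frame)) switchToViewfinder() else restorePreviousApp()
```

Keeping the whole pipeline on the device avoids a network round trip, which is what makes millisecond-scale reaction times plausible.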
The “E-Screen Protector” project will be presented in more depth at the Neural Information Processing Systems conference in Long Beach, California, next week.