
Google Clips review roundup: Early days for AI capable of capturing some precious moments

After Google Clips was announced in October, I opined on how the company is pushing artificial intelligence into a high-stakes activity that needs to just work. With the smart camera available today, the first reviews have surfaced and offer a wide range of opinions that, taken together, suggest it’s still early days for Google Clips.

The Google Clips hardware is very straightforward, with multiple reviews noting that it’s instantly recognizable as a camera. Beyond the privacy concerns that raises, it increases the chance of people either becoming camera-shy or outright posing, as The Verge notes:

One thing that I’ve discovered is that people know right away it’s a camera and react to it just like any other camera. That might mean avoiding its view when they see it, or, like in the case of my three-year-old, walking up to it and smiling or picking it up. That has made it tough to capture candids, since, for the Clips to really work, it needs to be close to its subject. Maybe over time, your family would learn to ignore it and those candid shots could happen, but in my couple weeks of testing, my family hasn’t acclimated to its presence.

Meanwhile, in real-world usage, reviewers quickly found that the camera requires a lot more hands-on management than Google’s introduction suggested. TIME notes:

My biggest struggle had to do with framing my shots correctly. Because Clips is best for recording subjects three to eight feet away, I had some trouble finding tables and surfaces that were close enough to my pets that also allowed me to position the camera at the right angle.

Google pushes back against the idea that Clips is a “set it and forget it” device, but admitted to Engadget that it has found parents who simply leave the device in a room. That speaks to a value proposition that isn’t immediately apparent.

This needed adjustment also exposes issues with the camera’s image quality, which The Verge’s review highlights at length:

The camera’s ultra-wide-angle field of view (it captures something similar to what a 10mm lens on a full-frame DSLR sees) makes it easy to position without a screen and be assured that you’ll get something in frame, but it’s bad for pictures of people, as it distorts facial features in an unflattering way. Likewise, anything near the sides of the frame is wildly distorted. Your subjects also have to be within roughly 10 feet of the camera, lest they be tiny in the resulting image. But the Clips’ fixed-focus lens has a range of about three feet to infinity, so nothing close to the camera is ever sharp. Even then, subjects within its range never really look sharp, either.

In terms of capturing highly prized moments, the reaction was mixed. Wired was able to capture great shots with the device:

It’s also good at sensing movement in pets. I pointed it at my cat and tried to get her to play with her feather toy. I dangled the toy around her head for a full minute while she sat there disinterested (she is a cat) before finally taking the bait. When I opened my phone, the only clip it saved was the money shot where she pounced on the toy. Perfect.

So was USA Today, after some initial training time and after wading through a sea of mediocre shots:

Once turned on, the camera shot many, many clips, most with no rhyme nor reason. It just automatically captured, periodically. Most were pretty bad, some were great, and rather welcome, especially the shot of my friend’s 4-year-old daughter laughing and jumping up and down.

But as TechCrunch notes, it is not perfect and requires more human input than one would assume is needed for an AI-powered product.

Clips mostly does a good job capturing key moments. It’s not perfect, of course. And really, it requires a bit of good-old-fashioned human curation. That’s where you come in. Odds are you’re only going to end up sharing a fraction of the shots the camera ends up capturing.

The general consensus of the reviews is that Google Clips can eventually take good photos; it just takes time, training, and some changes in behavior. On that last point, a more parent-focused review from ZDNet describes how Clips snapping good photos led to more trust in the AI:

The first few times I used it, I found myself continually going into the Clips app and looking at what had been captured, then trying to adjust positioning or using the shutter button to force a photo.

It wasn’t until after a basketball game, when I was reviewing the various moments it had captured, that I realized I needed to let go and trust Clips. I now have GIFs of my daughter playing defense and going up for rebounds, all the while I sat and watched the game without reaching for my phone to take photos (or check Clips).

However, some, like The Verge, ultimately never reached the point where the usefulness outweighed the hardware limitations:

I’m sure I could get better at using the Clips camera with practice — getting a better idea of its ultra-wide field of view, finding the best angles and positions for it, and so on — but I’m not convinced that the effort involved would be rewarded with great results.

Google is fond of saying that artificial intelligence is in its “early days.” Nowhere is this better exemplified than with Clips, as Engadget aptly summarizes:

Bottom line: Even if you have Clips set up and switched on when a special moment happens, there’s no guarantee the device caught it. At this point, only God and Google know what the AI catches, and you’ll have to live with the unpredictability.

At the same time, the “early days” mantra also applies to the users of this new technology. It would be very interesting to revisit Clips after the AI has had the opportunity to improve and train, and after consumers have had a chance to get used to relying on a gadget to do the heavy lifting.

