You know what a digital camera is. It’s a lens and a sensor, with a display to see what you’re looking at, and a button to take the picture. Google Clips is a camera, but it only has some of those parts. There’s no display. There’s a shutter button, but it’s completely optional to use. Instead, it takes pictures for you, using machine learning to recognize and learn faces and look for interesting moments to record.
It took me a while to wrap my head around Google Clips. It didn’t land for me until Eva Snee, the user experience lead for the camera, told me a story about a little moving photo I was looking at. To me, it was just a couple of toddlers. To Snee, it was something very different.
“It’s one of my favorite photos,” Snee says. “This is my son, and this is my nephew. We were having a family vacation, and they snuck outside and started reading a book together. I went outside with my phone and they stopped.” So she set the Google Clips camera down in front of them and walked away. “I went back inside with the adults, and I got all these amazing photos of these little moments where they’re together. Things I wouldn’t get, I couldn’t get. I tried.”
Snee’s anecdote encapsulates the two ways to look at this little thing: it can capture endearing, heartwarming moments — but only if you’re okay with a camera that is always watching, looking for something to record.
I don’t know if parents — Google’s target market — will want it. I don’t know if Google can find a way to explain everything it is (and isn’t) to a broad enough audience to sell the thing in big numbers, especially at $249. I also don’t know what the release date will be, beyond that it will be “coming soon.”
But I do know that it’s the most fascinating camera I’ve used in a very long time.
Google Clips looks like a camera. That’s a silly thing to point out, but it’s also essential to what Google is trying to do: make this thing approachable and defuse any privacy concerns you might have.
It’s a flat little thing designed to drop into your pocket or purse. Two inches square, the camera lens juts out just a little. Looking at it, it’s sort of like an Instagram icon popped off your phone’s screen and became a physical object. It only comes in one color pairing: white on the front and teal on the back.
Here’s how it works: you turn it on by twisting the lens, then you set it down and forget about it. Clips then watches everything it sees in its 130-degree field of view, and records little seven-second moving images of stuff it finds interesting. It learns faces over time and tries to take more photos of those people and fewer photos of strangers. It can also recognize pets.
As a camera, it’s a little weird to not have a viewfinder — as weird as not needing to hit a shutter button or even hold the camera. In fact, although you can hold Clips, it’s best when you just set it down somewhere in the room while you and your family go about doing whatever it is you’re doing. It also comes with a silicone clip that wraps around the camera at any angle, so you can stand it up more easily or clip it to stuff.
You would think the thing Google wants you to clip it to is yourself — and of course you’re certainly free to do that — but Juston Payne, the product lead for Google Clips, says that’s not really what it’s meant for. “We find that wearing it, just to say it bluntly, is not a good way to get good content,” he says. “Mainly because it requires that you act strangely. Frankly, what I mean by that is like you then have to position your body.”
Later, when you feel like it, you can open an app on your phone and scroll through all the little videos it recorded (though “videos” isn’t quite the right word, because there’s no microphone). You can swipe on them to either save them to your phone or delete them, and tap a few buttons to filter out what the system thinks are less interesting clips. Tapping into a clip lets you trim the video down to the loop you want or pick out a specific still image to export. You can export clips as Motion Photos, GIFs, JPEGs, or movie files.
You can also take a more active role with the camera. There’s a shutter button on the front, under the lens, and tapping it takes a clip. The app also allows you to turn on a live preview, where you can see what the camera sees and remotely hit the shutter button.
“We used to not have a button,” Snee explains. “And then we put a button in because we learned from real users — outside of Google — that a camera needs to have that agency.”
You can’t talk about Google Clips without your very next breath being about the elephant in the room: is it creepy? I think the answer is no, for a few reasons. The first is that, as a physical object, it’s basically adorable, and it signals that it is a camera so clearly. When it’s on, there’s a blinking white LED that indicates to you that it might be taking photos.
Payne says that’s intentional. “It looks like a camera. It’s pretty obvious. It’s designed to be playful and approachable in its design. It was never a goal of ours to make something that blends in.”
But design aside, the main reason Google Clips isn’t as worrying as “Google camera that recognizes your family’s faces and records them automatically” sounds is that Google made a few carefully considered technical choices to protect its users’ privacy.
The first is that everything on Clips happens locally. Nothing is synced with Google’s cloud at all — except the photos you save into Google Photos. All the facial recognition happens on the device using its own processing power. None of it is paired up with whatever facial recognition you may have set up in Google Photos. It doesn’t pair faces with names; it just recognizes faces it sees a bunch over time. It also tries to ignore faces it doesn’t recognize. So if you’re at a park with your kids, Clips will endeavor to only take photos of your kids.
The clips the camera takes are also stored only on the camera itself. They don’t try to sync over to your phone unless you ask for them. They’re also encrypted on the camera, in case you lose it.
Google Clips pays attention to stuff that is “interesting.” It thinks interesting things are faces and pets it knows; it tries to make sure it only takes good shots of those things. “It starts with saying, is there a face? Is this a face that I know?” Payne says. “Does this face have certain attributes that are going to be good? Eyes open, smiling, those sorts of things. It’s also then saying, is this a well-lit shot? Is this blurry?”
But it goes a little further than that. “It’s a sort of concept of ‘differentness,’” Payne explains. “It will look for changes over time. It tries not to give you the same thing over and over.”
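Payne’s description suggests a two-stage filter: score each candidate shot on face quality, then suppress shots that are too similar to ones already kept. As a purely illustrative sketch — this is not Google’s code, and every name and threshold here is an assumption — that logic might look like this:

```python
# Hypothetical sketch of the selection logic Payne describes: a quality
# score built from face attributes, plus a "differentness" rule that
# skips shots taken too close in time to an already-kept shot.
# All names, weights, and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Candidate:
    timestamp: float   # seconds since the camera was turned on
    known_face: bool   # a face the camera has learned over time
    eyes_open: bool
    smiling: bool
    well_lit: bool
    sharp: bool        # i.e., not blurry

def quality(c: Candidate) -> float:
    """Score a shot; unknown faces score zero, since Clips tries to ignore strangers."""
    if not c.known_face:
        return 0.0
    return sum([c.eyes_open, c.smiling, c.well_lit, c.sharp]) / 4.0

def select_clips(candidates, threshold=0.5, min_gap=30.0):
    """Keep good-enough shots, skipping any that land within min_gap
    seconds of the last kept shot (a crude stand-in for 'differentness')."""
    kept = []
    for c in sorted(candidates, key=lambda c: c.timestamp):
        if quality(c) < threshold:
            continue
        if kept and c.timestamp - kept[-1].timestamp < min_gap:
            continue  # too similar to a moment already saved
        kept.append(c)
    return kept
```

In this toy version, a perfect shot taken five seconds after another one gets dropped, while a slightly worse shot a minute later gets kept — which matches the spirit of “it tries not to give you the same thing over and over.”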
Google Clips should be good for about three hours of active use, according to Google, though that will increase or decrease depending on whether or not there’s anything interesting happening. Yes, really: battery life depends on whether or not it’s seeing anything picture-worthy. “It gets bored,” Snee quips.
As for tech specs, they’re sort of beside the point on a device like this, but here you go: it has a 12-megapixel sensor behind a lens with a 130-degree field of view. It’s capable of taking 15 fps bursts of images (and, again, that’s the only thing it takes). It has 16GB of storage, which doesn’t seem like much, but it was able to store two full days of clips when I used it. Standby time should last on the order of “days.” And again, there is no microphone.
It transfers everything over Wi-Fi Direct, which works pretty seamlessly if you’re using an Android phone; it requires you to select the camera’s hot spot manually if you’re using an iPhone. I’ve seen images transfer over to the app quickly, and I’ve seen it happen really slowly — all on preproduction software. Hopefully the final shipping software will be more like the former.
To start, it will only pair with Pixels, iPhones, and the Samsung Galaxy S7 and S8. Google says support for more phones will come over time.
The camera produces images and clips that look pretty good, though not quite as good as what you can get from your phone’s camera. For one thing, the angle is wider, and for another, they’re less likely to be perfectly framed because you usually aren’t looking through the live preview viewfinder to line up your shot.
But the clips are nevertheless compelling, even though most of what I got were pictures of people shooting video of the camera. (We were at a shoot for this story, after all.) But everybody who has been playing around with moving pictures over the last few years is right: they’re more compelling than still photos, especially when they’re of people you care about.
Google is explicitly marketing this camera to parents. That makes a lot of sense: instead of being a machine that takes the highest-quality photos, it’s a machine that takes the photos they wouldn’t have had a chance to take.
It’s also getting marketed to the most twee of demographics: pet parents. “We’ve trained our models to be really good on cats and dogs,” Payne says. Though he points out that after visiting a petting zoo, he can also boast: “I’ve seen it succeed on goats.”
Ultimately, Clips is fascinating because it’s so difficult to categorize. It’s not a GoPro or an action cam. It’s not a security camera. It’s not as good as your phone at taking high-quality pictures. It doesn’t have a viewfinder, and the shutter button is literally an afterthought. The AI is ultimately making most decisions for you.
It’s the first standalone camera I’ve seen that was designed around a fundamental truth: we look at photos on our phones, not prints or laptop screens. And phone screens do stuff that those other screens can’t, like move and share.
That makes Google Clips something genuinely new. And even though we are being inundated with tech on a daily basis now, it’s usually derivative. Something new demands our attention.