Google Pixel: Button without a Button

In today’s digital era, it sometimes feels like hardware has taken a back seat to the software that drives our devices. Button of the Month is a monthly column that examines the physical pieces of the phones, tablets, and controllers we interact with every day.

The Pixel 2 is an almost five-year-old smartphone, but it introduced a feature we miss more with each passing year. It was called Active Edge, and it let you summon Google Assistant by squeezing your phone. In some ways, it’s an unusual idea.

But it gave you something sorely lacking on modern phones: a way to physically interact with the phone to get something done.

Looking at the sides of the Pixel 2 and 2 XL, you won’t see anything to suggest you’re holding something special. There’s a power button and volume rocker, but otherwise the sides are sparse. Give the phone’s bare edges a good squeeze, though, and a subtle vibration and animation play as Google Assistant pops up from the bottom of the screen, ready to listen. You don’t have to wake the phone, long-press any physical or virtual buttons, or tap the screen. You just squeeze and start talking.

We’ll talk about how useful this is in a second, but we don’t want to gloss over just how cool it feels. Phones are rigid objects made of metal and plastic, yet the Pixel can tell when you’re squeezing it harder than you would just by holding it. That’s made possible by a few strain gauges mounted to the inside of the phone, which sense the ever-so-slight flex in the phone’s case when you press on it. For the record, this is a deformation our human nervous systems are incapable of picking up on; we can’t tell that the phone is bending at all.
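To make that concrete, here’s a minimal sketch of how squeeze detection from strain gauges could work in principle. Everything in it, from the class name to the thresholds, is a hypothetical illustration, not Google’s actual Active Edge implementation:

```python
from collections import deque

SQUEEZE_THRESHOLD = 40.0  # deflection above the resting baseline (arbitrary units)
REQUIRED_SAMPLES = 5      # consecutive samples needed to register a squeeze

class SqueezeDetector:
    """Toy model: decide whether strain-gauge readings amount to a squeeze."""

    def __init__(self, num_gauges: int = 4):
        self.baseline = [0.0] * num_gauges
        self.history = deque(maxlen=REQUIRED_SAMPLES)

    def calibrate(self, resting_readings: list[float]) -> None:
        # Record what the gauges report while the phone is merely being held.
        self.baseline = list(resting_readings)

    def update(self, readings: list[float]) -> bool:
        # A squeeze flexes the whole case, so every gauge should deflect
        # together; a one-sided press (a pocket edge, say) shouldn't trigger.
        deflections = [r - b for r, b in zip(readings, self.baseline)]
        self.history.append(all(d > SQUEEZE_THRESHOLD for d in deflections))
        return len(self.history) == REQUIRED_SAMPLES and all(self.history)

detector = SqueezeDetector()
detector.calibrate([100.0, 98.0, 102.0, 101.0])
for _ in range(5):
    if detector.update([150.0, 149.0, 151.0, 148.0]):
        print("Squeeze! Summon the assistant.")
```

Requiring several consecutive over-threshold samples is one plausible way to filter out the brief spikes you’d get from, say, picking the phone up off a table.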

Whether you found Active Edge helpful probably came down to whether you liked using Google Assistant, as illustrated by this Reddit thread. The only time I used a voice assistant daily was when I had the Pixel 2, because it was always right at hand. What made it so convenient was that the squeeze always worked. Active Edge still did its job even if you were in an app that hid the navigation buttons or if your phone’s screen was completely off.

While that made it highly useful for looking up fun facts or doing quick calculations and conversions, I’d argue that Active Edge could’ve been far more helpful if you’d been able to remap it. I enjoyed having Assistant on hand, but if I could have turned on my flashlight with a squeeze, I’d have had instant access to one of my phone’s essential features, no matter what.
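Remapping would be a thin layer on top of the detection itself: route the squeeze to whatever action the user picked instead of a hard-wired Assistant launch. Again, a purely hypothetical sketch, with the action names invented for illustration:

```python
# Dispatch the squeeze to a user-chosen action (stubs stand in for real ones).
ACTIONS = {
    "assistant": lambda: print("Launching Google Assistant..."),
    "flashlight": lambda: print("Toggling the flashlight..."),
    "camera": lambda: print("Opening the camera..."),
}

user_choice = "flashlight"  # would come from a settings screen

def on_squeeze() -> None:
    ACTIONS[user_choice]()

on_squeeze()  # prints "Toggling the flashlight..."
```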

A version of the feature like that actually existed. HTC’s U11, which came out a few months before the Pixel 2, had a similar but more customizable feature called Edge Sense. The two companies worked together on the Pixel and Pixel 2, which explains how the squeeze ended up on Google’s devices. That same year, Google went on to acquire a large chunk of HTC’s mobile division.

Active Edge wasn’t Google’s first attempt at providing an alternative to the touchscreen or physical buttons for controlling your phone. A few years before the Pixel 2, Motorola let you open the camera by twisting your phone and turn on the flashlight with a karate-chop motion, not unlike how you could shuffle music on a 2008 iPod Nano by shaking it. The camera shortcut came about during the relatively short time that Google owned Motorola.

As time went on, though, phone makers moved away from letting you reach a few fundamental features with a physical action. Take my daily driver, an iPhone 12 Mini, for instance. To launch Siri, I have to press and hold the power button, which has become burdened with responsibilities since Apple got rid of the home button. To turn on the flashlight, something I do multiple times a day, I have to wake the screen and then tap and hold the button in the bottom left-hand corner.

The camera is slightly more convenient, accessible with a left swipe on the lock screen, but the screen still has to be on for that to work. And if I’m actually using the phone, the easiest way to reach the flashlight or camera is through Control Center, which means swiping down from the top-right corner and trying to pick out one specific icon from a grid.

In other words, if I look up from my phone and notice my cat doing something cute, it may have stopped by the time I get the camera open. It’s not that launching the camera or turning on the flashlight is difficult; it’s that it could be so much more convenient if there were a dedicated button or squeeze gesture. Apple even briefly acknowledged as much when it made a battery case for the iPhone with a button to launch the camera. A few seconds saved here and there add up over the lifetime of a phone.

To prove the point, here’s how quickly the camera launches on the iPhone versus the Samsung Galaxy S22, where a double press of the power button opens the camera:

[GIF: An iPhone’s camera launched via the Control Center shortcut next to a Samsung Galaxy S22’s camera launched with a double press of the power button. The S22 opens its camera a second or two faster than the iPhone.]

There’s less thinking involved when you can press a button to launch the camera. Neither phone handles screen recording while previewing the camera very well, but the S22 gets its camera app open before I’ve even tapped the camera icon on the iPhone.

Unfortunately, even Google’s phones aren’t immune to the disappearance of physical controls. Active Edge stopped showing up on Pixels with the 4A and 5 in 2020, for example. Samsung has also done away with the dedicated button it once included to summon its Bixby assistant.

There have been attempts to add virtual buttons that you activate by interacting with the device itself. Apple, for example, has an accessibility feature that lets you tap on the back of your phone to launch actions or even your own mini-programs in the form of Shortcuts. Google added a similar feature, Quick Tap, to its Pixels.