How Google designers create sounds for Pixel


You know the noise your Pixel makes when you take a photo? While it may sound like a camera shutter, it’s actually the sound of a pair of scissors opening and closing, says Conor O’Sullivan, who leads Pixel’s sound design team. “It’s not just a straight recording of that action, but that’s the basis of the sound that you hear today,” he says.

When designing how your Pixel sounds, Conor and the Pixel sound design team aren’t just trying to get your attention; they’re conveying a message with sound. “Sound design is essentially the art of creating intentional sound, and doing it with context,” Conor says.

Take a look (and a listen) at how the Pixel sound design team finds inspiration, develops, tests and implements the sounds your Pixel makes.

Why we need sound design

One of the reasons sound design for devices is increasingly necessary is that our eyes are doing so much work. “There’s so much information that can fit on a smartphone screen, but we don’t want to overburden any one of the senses,” Conor says.

Sound can help communicate more information. For instance, Pixel uses different sounds to distinguish a text from a call, or sending an email from receiving one. The sound your phone makes when it relays an AMBER alert probably triggers an adrenaline spike in a way that a notification about an incoming email wouldn’t. “Sound is great at conveying emotion,” Conor says. “It has a positive inflection or negative inflection, or gives you a sense of urgency.”

There are different categories of sounds that designers develop for Pixel. Gesture feedback sounds confirm a user action in response to UI elements, such as swiping to archive an email in Gmail. Semantic feedback sounds confirm a user selection and the progression of an action, like turning on Pixel Battery Saver. And attention sounds are sounds Pixel creates on “its own,” like an alarm. The team pays attention to form and function when designing each category: Form is the more “aesthetic” side, how a sound makes people feel and whether it has a playful, human sense to it. Function is what the sound is supposed to do (wake someone up, signal that a timer is complete) and how well it does that (can you hear your alarm from another room, or when your phone is in your bag?).

Sound designer Harrison Zafrin worked on the sounds for Guided Frame, the Pixel camera tool that helps blind and low-vision users take selfies. It’s a great example of balancing form and function: The app combines haptics (or tactile feedback) and sound to help people guide their hand and the camera to the right position. “We came up with this system of five different zones around the viewfinder and as you get closer to the center you hear an increasingly positive musical progression, what’s called a diatonic progression. You get this sort of celebratory sound when you’re centered,” Harrison says. “I think we did a really good job using the sound and haptics to be delightful and Googly but also communicate the right information so people can get the photo.”
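The zone-to-pitch idea Harrison describes can be sketched in a few lines of Python. This is a hypothetical illustration, not Pixel’s actual implementation: the five zones come from the article, but the function names, the C-major note choices and the celebratory arpeggio are assumptions made for the example.

```python
# Hypothetical sketch of a Guided Frame-style zone-to-sound mapping.
# Five zones from the article; specific notes are illustrative only.

# Ascending steps of a C-major (diatonic) scale, from the outermost
# zone toward the center, so pitch rises as the framing improves.
DIATONIC_STEPS = ["C4", "D4", "E4", "F4", "G4"]  # zones 5 (edge) .. 1 (center)


def tone_for_zone(zone: int) -> str:
    """Return the note to play for a subject in the given zone.

    zone 5 = subject at the edge of the viewfinder, zone 1 = centered.
    """
    if not 1 <= zone <= 5:
        raise ValueError("zone must be between 1 (center) and 5 (edge)")
    return DIATONIC_STEPS[5 - zone]


def celebratory_arpeggio() -> list[str]:
    """Short rising arpeggio played once the subject is centered."""
    return ["C5", "E5", "G5"]


# As the subject moves toward the center, the pitches climb the scale:
print([tone_for_zone(z) for z in (5, 4, 3, 2, 1)])  # ['C4', 'D4', 'E4', 'F4', 'G4']
print(celebratory_arpeggio())  # ['C5', 'E5', 'G5']
```

The design choice worth noting is that each step stays inside one key, so moving toward the center always sounds like musical progress rather than a sequence of unrelated beeps.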


