OK smart speaker! Why should I trust you?


“Hey, Google! Can you turn the temperature in the bedroom down to 65 degrees?”

Seconds later the air conditioning kicks in and I drift off.

It still floors me that I can talk to a device to control my home environment, get help with a recipe (“Hey, Google! What’s the internal temperature for a medium-rare roast?”), set the mood (“Hey, Google! Please play the focus playlist!”), get random information (“Hey, Google! What’s the name of that actor who played the bad guy in Die Hard?”) and keep me in line (“Hey, Google! Can you remind me at 3pm to call my mother?”). We buy every new smart device that comes along so that we can automate our home more and more.

But is our enthusiastic embrace of smart home devices the equivalent of inviting Big Brother into our home?

Some critics certainly argue that it is.

In 2018, there were at least two separate incidents of smart home devices recording people who hadn’t used a “wake” command, and more recently a controversy erupted over company employees reviewing home security camera feeds. These reports spurred me to check my own activity logs, which, thankfully, didn’t turn up any irregularities.

In spite of all of the hand-wringing, there are 26 million of these devices in American homes and, according to Forrester’s research, this number is expected to grow to 66 million by 2022. And even though I’m a bit wary of our smart home speaker because of these reports, I am hooked. In fact, I’ve told my husband that I wouldn’t mind getting a few more for other rooms so that I don’t have to yell commands across the house.

Don’t get me wrong; I’m aware of the risks of inviting these devices into my home. In fact, we’ve gone into our settings to ensure maximum protection, and we check our activity on a regular basis so we know what’s being recorded and stored. I know what I’m giving up by having the device, and I very much value my privacy. It’s not that I’m uninterested in protecting that privacy; it’s that I know I handed it over a long time ago.

I sold my privacy for convenience…with a caveat

In the early 2000s, I read David Brin’s The Transparent Society, in which he predicted the situation we find ourselves in today: one in which we’re expected to surrender our privacy in order to function in society.

I’m not going to argue that as long as we’re doing nothing wrong, we should be unconcerned about this situation. That’s a naive statement, to say the least. That argument only works in a rational world where governments are transparent and work for the people, and corporations have our best interests at heart. We’ve all seen regimes around the world where that can change in an instant.

But having chosen convenience doesn’t mean I shouldn’t also demand rights as a customer. The companies making mobile phones and smart speakers are not malicious in their intent. Their main goal is to make money, which is the rational goal for any for-profit organization in a capitalist economy, but they also know they need to build trust with their customers in order to keep making money.

And this is where consumers still have the power.

My data is yours, but you’ll have to keep earning my trust

In the past two years, Facebook demonstrated the perils of breaking trust. For years, it seemed that Facebook was unstoppable in its growth, with consumers and businesses alike throwing more and more of their time and money into the platform every day. Critics cautioned, “If it’s free, you’re the product,” and even that wasn’t enough motivation to stop. Everyone was okay with being the product as long as the benefits outweighed the costs.

But in 2018, this changed drastically for Facebook. The shift in public opinion was best demonstrated when Facebook tried to launch its own smart home device. The launch reviews made the lack of trust in the company plain: “great device, but I’ll pass” and “what were they thinking?” The costs now seem to eclipse the benefits. Sales numbers haven’t been reported yet, but the timing of the launch likely led to lower-than-expected sales. As the #deletefacebook movement grows, it becomes more and more crucial for the company to rebuild that trust.

So what can future device builders and service providers do to ensure that they build and keep trust so that their customers feel safe to share their data?

TRUST is the ultimate marketing tool

To take back some control over their privacy, consumers are coming up with interesting hacks of their own. Futurithmic guest and thought leader Galit Ariel shared an innovative consumer-prototyped device that helps users control their home devices.

Project Alias by Bjørn Karmann and Tore Knudsen allows you to “hack” your smart speaker.

Project Alias is an open-source smart speaker hack hosted on GitHub and built with a Raspberry Pi. It can be built at home by pretty much anyone at relatively low cost. The pattern for the physical shell is provided for 3D printing, and the components are all mapped out in fairly simple steps. Though it’s a project meant more for the tech-savvy, I imagine it won’t take long for a mass commercial product like this to emerge.
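If you’re curious how something like this works under the hood, here’s a rough conceptual sketch of the approach as I understand it, not the project’s actual code: the device jams the speaker’s microphones with noise and only replays the real wake phrase once it hears your own, locally detected trigger word. Every class and function name below is a placeholder invented for illustration.

```python
# Conceptual sketch of the Project Alias idea (not the project's actual code):
# a small device sits on the smart speaker, feeds noise into its microphones so
# it can't eavesdrop, and only "unmutes" it by replaying the real wake phrase
# after locally detecting a custom trigger word.

import threading
import time


class NoiseJammer:
    """Continuously plays interference into the smart speaker's microphones."""

    def __init__(self):
        self._paused = threading.Event()

    def run(self):
        while True:
            if not self._paused.is_set():
                pass  # placeholder: write a chunk of noise to the audio output
            time.sleep(0.1)

    def pause(self, seconds: float):
        """Stop jamming briefly so the speaker can hear the user's command."""
        self._paused.set()
        time.sleep(seconds)
        self._paused.clear()


def custom_trigger_heard() -> bool:
    """Placeholder for on-device, offline detection of a user-chosen wake word."""
    return False  # replace with a real local keyword-spotting model


def relay_real_wake_phrase():
    """Placeholder: play a recorded 'Hey Google' / 'Alexa' clip at the speaker."""


if __name__ == "__main__":
    jammer = NoiseJammer()
    threading.Thread(target=jammer.run, daemon=True).start()
    while True:
        if custom_trigger_heard():
            jammer.pause(seconds=8)    # open a short listening window
            relay_real_wake_phrase()   # wake the speaker for the actual command
        time.sleep(0.05)
```

The appealing part of this design is that everything happens locally: the smart speaker never hears anything until you explicitly decide it should.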

Other smart home device makers have made a spectacle out of their privacy promises to consumers (even when those promises aren’t entirely accurate), demonstrating that they are well aware of the pressure they’re under to build trust.

For tech companies and service providers looking to fill those next 40 million homes with smart devices: privacy, transparency and trust are going to be your number one marketing tool. Let’s examine what that looks like in action.

Step #1 – Simple, clear communication

Every customer has a different threshold of tolerance when it comes to privacy concerns. Some are okay with their data being stored so that their recommendations improve, while others want to have a full-time incognito mode enabled.

Though the EU Cookie Law is seen by some as an imperfect implementation, the idea behind it is pretty sound: you need to be transparent about the data you’re collecting from each visitor; visitors should have the option to opt in or out of that collection; and your site should still operate at a basic level without collecting that data.
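To make those three requirements concrete, here’s a minimal sketch of how a site could honor them, using Flask purely as an example framework; the routes, cookie name and wording are invented for illustration and aren’t taken from the regulation or any real site.

```python
# A minimal sketch of the cookie-law idea, using Flask purely for illustration:
# the page works without tracking, a tracking cookie is only set after an
# explicit opt-in, and the visitor is told exactly what is collected.
# Route paths and the cookie name here are invented for the example.

from flask import Flask, make_response, request

app = Flask(__name__)


@app.route("/")
def home():
    consented = request.cookies.get("analytics_consent") == "yes"
    body = "<p>Welcome! This page works fine without any tracking.</p>"
    if not consented:
        body += (
            "<p>We'd like to set one analytics cookie to count visits; "
            "nothing else is collected. "
            '<a href="/consent?choice=yes">Opt in</a> or '
            '<a href="/consent?choice=no">no thanks</a>.</p>'
        )
    return body


@app.route("/consent")
def consent():
    choice = "yes" if request.args.get("choice") == "yes" else "no"
    resp = make_response('Preference saved. <a href="/">Back to the site</a>')
    # Remember the choice either way; only an explicit "yes" enables analytics.
    resp.set_cookie("analytics_consent", choice, max_age=60 * 60 * 24 * 365)
    return resp


if __name__ == "__main__":
    app.run()
```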

As a web user, I find most cookie notices more annoying than helpful and just click OK to move on (but only on sites I trust). However, this isn’t the only way to create transparency and trust with your users.

When signing up for a new service or setting up a new device, it should be standard to offer clear opt-in and opt-out choices, along with plain-language explanations of what each option actually means. For instance, a first-run setup flow might look something like the sketch below.
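This is a hypothetical example of what plain-language opt-in choices could look like at setup time. The option names, wording and defaults are invented for illustration; they aren’t drawn from any real device’s settings.

```python
# Hypothetical first-run setup flow: each choice is presented in plain language,
# with the benefit and the cost spelled out, and defaults to the most private
# option. Option names and wording are invented for illustration only.

PRIVACY_CHOICES = {
    "store_voice_recordings": {
        "prompt": "Keep recordings of your voice commands?",
        "benefit": "Speech recognition improves for your household over time.",
        "cost": "Audio clips are stored on the company's servers until you delete them.",
    },
    "personalized_results": {
        "prompt": "Use your activity to personalize answers and recommendations?",
        "benefit": "Reminders, playlists and suggestions tailored to you.",
        "cost": "Your usage history is linked to your account.",
    },
    "share_with_partners": {
        "prompt": "Share anonymized usage data with third parties?",
        "benefit": "None to you directly; this mainly supports advertising.",
        "cost": "Your data leaves the company that made the device.",
    },
}


def run_setup() -> dict:
    """Walk the user through each choice; anything not answered 'y' stays off."""
    settings = {}
    for key, option in PRIVACY_CHOICES.items():
        print(f"\n{option['prompt']}")
        print(f"  If you opt in: {option['benefit']}")
        print(f"  What it costs: {option['cost']}")
        answer = input("  Opt in? [y/N] ").strip().lower()
        settings[key] = answer == "y"
    return settings


if __name__ == "__main__":
    print(run_setup())
```

The point isn’t the code; it’s that each choice spells out the benefit and the cost in the user’s own language, and the most private option is the default.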

Step #2 – Increase the benefits of sharing data

Once you have customers on board, keep them interested by offering those who opt in great data visualizations and education about their overall use. This becomes an enticement for other customers to opt in.

Spotify, for example, sends their customers a year-end snapshot of their listening history and recommended playlists that many take to social media to share.

I also used to enjoy my annual travel data snapshots from Dopplr, an early social travel application.

2008 Dopplr personal annual report (image: Alper Çuğun on Flickr)

Seeing the benefits of sharing my data made the decision a no-brainer. And if I saw someone in my network sharing their cool report, I might have to opt in myself (as long as there hasn’t been a breach of trust).

Step #3 – Don’t abuse the trust

Data is valuable and can be monetized, and companies need to monetize to continue to exist; I get it. I understand that the social networks I use need to serve me ads to keep offering me a free platform, and that my smart home devices are inexpensive because a place in my home means they may someday serve up ad units even more valuable than a PPC ad.

But I don’t consent to them selling me down the river to potentially nefarious practitioners. I don’t consent to them listening to me 24/7 and I don’t consent to them using predictive AI that could hurt me and my family.

At the end of the day, respecting privacy, being transparent and clear about what you’re collecting and why, returning benefits for that data to the users (beyond better ad units!) and prioritizing trust in every decision will be the strategic focus of the companies we let into our homes and lives for years to come.

Information is power, but as Uncle Ben taught us, “With great power comes great responsibility.”

