In 2000, a group of researchers at Georgia Tech launched a project they called “The Aware Home.” The collective of computer scientists and engineers built a three-story experimental home with the intent of producing an environment that was “capable of knowing information about itself and the whereabouts and activities of its inhabitants.” The team installed a vast network of “context aware sensors” throughout the house and on wearable computers worn by the home’s occupants. The hope was to establish an entirely new domain of knowledge — one that would create efficiencies in home management, improve health and well-being, and provide support for groups like the elderly.
Twenty years later, the smart home market this initiative foreshadowed is projected to be worth more than $150bn by 2024. This boom, made possible by huge leaps in tech development including artificial intelligence, has yielded an astonishing array of household IoT devices that monitor, interpret, predict, and adapt to our preferences and behaviors. At CES 2020, tech giant Amazon exhibited almost an entire smart dwelling, kitted out with its well-known Alexa speakers and compatible smart kitchen appliances, lighting control, entertainment centers, furniture, and even cars. It was an “Aware Home” for 2020.
For many, smart assistants, security cameras, thermostats, and vacuums are quickly becoming baked into regular home life. And with several months of lockdown behind us, our relationship with these products — including their knowledge of us — is more intimate than ever.
Yet, there is still a degree of nervousness obstructing more widespread acceptance of household tech. Some manufacturers have taken to giving away their “entry level” devices in order to overcome consumer reluctance. Much of this hesitation comes from a fear of compromise and exploitation.
Here are five reasons why some consumers are yet to fully commit to the smart home dream — and advice for businesses looking to build public trust:
Perhaps the most obvious cause of resistance is concern for privacy, and the fear that smart devices may encroach on our personal lives. Though we already reveal much of ourselves to big tech companies via our online buying habits, social media footprint, and search history (among other personal data emissions), there is something that instinctively feels more intrusive about being watched, listened to, and having our movements or consumption sensed and recorded. There is evidence that this sense of being observed has a direct effect on our behavior. Compound this with reports of smart device data and recordings being subpoenaed as evidence, and families inevitably become skeptical about what — or who — they have invited into their home lives.
If brands want to build trust, they need to be upfront about the data they collect, when they collect it, who can access it, and by what process it is destroyed. Currently, a lack of clarity makes it easy for smart home devices to be reconceived as spying devices looking to learn and weaponize the most granular details about our lives — from what we say to when we take a shower, or turn out our lights to go to bed.
Even if the convenience value of smart home devices overrides immediate privacy concerns, many would-be users are still put off by the opacity surrounding how their data is used — and with whom it is shared. Top information scholars have argued that our data is constitutive of our personal identity, meaning that “my data” is less like “my car” and more like “my hand.” Safeguarding data is thereby reconceived as protecting human dignity.
The researchers behind the Aware Home did not foresee this data-sharing problem; they assumed that the “clear need to give occupants knowledge and control of the distribution of [the] information” would simply be met. Yet anonymized personal and behavioral data now has enormous commercial value, and, as data subjects, the general public has little awareness of whether or how their information is traded. Explosive news stories like the Cambridge Analytica scandal have heightened consumer sensitivity to how much data is gathered and precisely who has access to it.
Contract transparency and accessible forms of truly informed consent would go some way toward empowering consumers in their understanding of data sharing. Similarly, companies purchasing third-party data should be clear about their sourcing. In California, the CCPA gives more rights to data subjects, but these rights need to be clearly and responsibly communicated if they are to have the desired effect. It’s important to know who owns (or comes to own) what, and how.
It’s no longer a secret that data harvested from smart home devices is used to make predictions about consumer likes, preferences, and other behaviors. At the user end, this process is usually referred to as “personalization” and marketed as a convenience, but on occasion the process of “typing” and categorizing that sits behind it has been exposed as problematic. Stories in the media have shown how our seemingly neutral online browsing history can be used to draw inferences about sensitive personal traits such as ethnicity, gender, sexual orientation, and religious beliefs. This information has then been used to target or exclude groups from certain types of advertising, or to adjust pricing.
Smart home data can be similarly revealing (when you are in/out, what you watch/listen to, when you eat and what, who visits and when, etc), and consumers wishing to resist being reduced to type are naturally hesitant about sharing more information about themselves — particularly if it might lead to unfair treatment or inflated pricing.
To overcome the underlying notion that “the more they know about me, the more they can use that against me,” companies should be clear in their communication about the models and algorithms fueled wholly or in part by behavioral data. Businesses need to educate audiences as to how their household data footprint drives personalized services and advertising, and which data points are most salient.
Adjacent to concerns about personalization is a similar worry about autonomy (or free will). As public awareness of data surveillance continues to grow, focus has turned to tactics used by tech companies — via smart devices — to use data insights to “nudge” consumer decision-making. Whether it’s by setting the default vendor for purchases made through smart speakers, developing kitchen appliances that reorder groceries, or just offering predictive home lighting, AI-enabled tech gadgets are calling more of the shots on behalf of the consumer. As homeowners get comfortable with this, their affirmative feedback allows systems to grow stronger in their predictions.
On the face of it, this is another example of tech companies providing new levels of convenience through personalization, but for some critics the underlying algorithmic technology has tipped over from merely predicting consumer preference to actively determining it. By shepherding users into strongly predictable habits of consumption, they warn, these systems are eroding free choice and undermining some of the basic tenets of human autonomy.
If our choices are integral to our characters, then by taking on a greater number of our decisions technology companies are now shaping our identity as people.
No company wants to be seen as manipulative or coercive. By focusing communications on explaining “opt-outs” and introducing opportunities for users to change and diversify their habits, home tech manufacturers can cultivate positive outcomes that meet users’ conscious personal needs rather than commercial objectives.
Regular media narratives about massive security breaches have heightened public concern about how vulnerable home tech could be to hacks and leaks. With personal information and even physical safety on the line in the event of a compromise, no family wants to invite an exposed device into their living space. The growing number of smart products entering our homes and workspaces has created an increasingly attractive target for cybercriminals. Add to this some of the recent horror stories about hackers accessing internal home security cameras, and anyone can see why prospective users need to be reassured.
Home tech companies should provide clear advice on how regular consumer households can practice good security hygiene when it comes to their smart devices. IoT manufacturers are often accused of thinking of security as secondary, but it should be a communication priority for those looking for public trust and household ubiquity.
Smart homes seem like an inevitability, but the strong headwinds created by a new and intensive focus on responsible tech will just as inevitably act as a filter — and only brands that have spent time anticipating potential harms and considering user concerns will be prepared for future scrutiny. If smart home tech is truly about creating new levels of comfort, then increasingly this should take on a more holistic relevance that encompasses the psychological comfort that comes with knowing and understanding the network and technology that underlies each device.