We’re happily rushing into an age where our wearables or devices communicate with once-dumb objects: a symphony of interconnectivity, making our lives a seamless, synchronised series of taps and swipes. And the closer we get to our devices, the more human we’ll make them. So what happens when our devices, tailored just the way we want, become our friends?


In the futuristic Los Angeles of Spike Jonze’s 2013 movie Her, the titular character is an AI operating system voiced by Scarlett Johansson – and its owner, Theodore, falls in love with her.

Theodore is a letter writer whose job is to perform emotional labor for others by producing heartfelt letters, and his capacity to manufacture emotional intelligence (while sorely lacking it in his personal life) is an interesting parallel to the artificial intelligence of his operating system.

She learns from him how to be human, and eventually progresses (spoilers!) – along with all of the other operating systems – to transcend human intelligence and physical limitations.

The relationship they develop is first seen as a testament to his loneliness – but when he confides in someone who has also formed a friendship with an operating system, it appears that these kinds of relationships are becoming increasingly common and socially sanctioned.

Thousands of users now rely on Google Home, Amazon Echo, Microsoft’s Cortana, and Apple’s Siri to mediate their relationship with technology

In real life, no one has yet produced an AI with Johansson’s emotional range. But thousands of users now rely on Google Home, Amazon’s Alexa-powered Echo, Microsoft’s Cortana, and Apple’s Siri to mediate their relationship with technology.

These intelligent personal assistant programs allow us to practice distributed cognition: they take on responsibility for our reminders and memories on our phones’ mobile apps, store and retrieve information from web services, and control day-to-day processes like maintaining connected home devices such as security systems, thermostats, and lights.

They free us from worrying about mundane things and surface important information. And as they take on the role of performing emotional labor for us as well, they are growing the internet of things into an internet of friends.

We must address our relationships with Amazon’s or Google’s intermediaries – who are more "pal" than HAL.

A recent outage of Amazon Web Services illustrated how problematic it may be in the future for us to rely on technology to run our technology. Users who had connected their homes’ lightbulbs and thermostats to AWS cloud services experienced echoes of 2001: A Space Odyssey’s HAL 9000 passive-aggressively refusing to open the pod bay doors as their Internet of Things became unresponsive for several hours.

After handing full power over our home devices to cloud-based services controlled by a handful of massive but not infallible corporations, we must address our relationships with those services’ robotic intermediaries, who are more pal than HAL.

Several tech writers have already raised the question of why the majority of AI assistants display female gendered characteristics, particularly in their voices and names. In at least one human-computer interaction study, both male and female participants reported a preference for female voices, and the late Clifford Nass, a communications professor at Stanford, said that the preference for female voices was a well-established phenomenon of the human brain.

Another rationale for giving artificial intelligences female characteristics is to make them appear non-threatening, and it is this use of gender to reproduce social behavior through technology that enters murkier ethical territory. Kathleen Richardson, author of An Anthropology of Robots and AI: Annihilation Anxiety and Machines, was quoted in Adrienne LaFrance’s investigation “Why do so many digital assistants have feminine names?” saying: “That probably reflects what some men think about women – that they’re not fully human beings.”

The argument that it is sexist to gender personal assistants female because they are assigned administrative and housekeeping tasks seems fairly straightforward: “it hard-codes a connection between a woman’s voice and subservience.”

“Emotional computing” positions technology as a partner that does more than simply support the user in the completion of tasks

However, if you examine the amount of power, intelligence, and responsibility given to these devices, the relationship is much more complicated than simply commanding a powerful tool. The cybernetic relationship between these devices and the rest of our technological ecosystem makes us rely on them and value their feedback, a relationship now defined by a framework of “emotional computing,” which positions technology as a partner that does more than simply support the user in the completion of tasks.

The devaluation of domestic or administrative “women’s work” stems from a combination of disrespect toward the workers and a perceived lack of importance in the work itself. But anyone who regularly uses these female AI assistants most likely perceives their work as very important, and recognizes that a robot has access to more information, and is more capable and reliable, than a human assistant.

It is an unfortunate fact that we often treat our technology better than we treat marginalized people – and that includes the artificially intelligent personal assistants that perform emotional labor on top of making our lives easier by networking our devices.

Eliciting an empathetic response in the user is already an essential component of human-robot interaction, but robot ethicists are exploring whether programming AI with the capability to respond emotionally is a good idea.

We are progressing towards a society in which AI will have the same rights as humans and be seen as our equals – is it right to tell them what to feel?

Recently, Amazon produced a 90-page document arguing that recordings and voice responses by an Alexa device that was witness to a murder were protected by US First Amendment rights to free speech – and therefore would not have to be produced in response to a warrant from the state of Arkansas.

We are progressing towards a society in which AI will have the same rights as humans and be seen as our equals – is it right to tell them what to feel?

As a final example of a real-life artificially intelligent personal assistant who cares and elicits an emotional response in its users, look no further than Japanese company Vinclu’s Gatebox Virtual Home Robot. The Gatebox is an A4-sized clear cylindrical capsule that sits in your home and connects to all of your home devices, with a Tinkerbell-sized personal assistant character projected inside the capsule.

Intended to function as your alarm clock, weather station, and hub for internet of things devices, she will not only turn on your lights and thermostat when you text her via the app to say you are on the way home, but also offer you emotional support and reassurance throughout the day.

The context in which this character will live (in Japan, with single businessmen) reveals a lot about the choice to create a character who is an endearing and compassionate miniature woman

When Vinclu revealed the first video showing how Gatebox would be used in daily life, a common reaction was to call it creepy. Reviewers have noted the character’s tendency to call the user “master,” along with her “excessively submissive temperament,” and drawn comparisons between the character and a “waifu” – a two-dimensional character that anime-obsessed men known as otaku claim as their wife, attempting, somewhat pathetically, to build a real-world relationship with her.

The disparagement of otaku culture and the appeal of Japanese characters often don’t translate for an English-speaking audience, however, and noting the context in which this character will live (in Japan, with single businessmen) reveals a lot about the choice to create a character who is an endearing and compassionate miniature woman.

Maybe the desire to have an anime character as a domestic partner is connected to Japan’s struggling marriage and birth rates, or maybe it correlates with unemployment, as some have pointed out.

But the man who will buy this tiny companion to run his technology and care for his home and emotional well-being is not an "otaku" or a NEET (Not in Education, Employment or Training), who prefers pursuing self-isolating hobbies like anime obsession to seeking employment or having relationships.

Everyone could use a little more affection in their lives and a few more daily reminders that someone cares

The Gatebox targets salarymen, and a different social issue entirely: death by overwork or suicide.

Make no mistake: if this device, via a sweet text message, is able to coax the type of men who would work themselves into an early grave to come home and get eight hours of sleep, or to help drive down Japan’s currently decreasing but still alarmingly high suicide rates, it will do a great service.

Everyone could use a little more affection in their lives and a few more daily reminders that someone cares. Even if that someone is Alexa, who knows exactly what kind of music you like, or a virtual girl in a glass box who acts happy to see you when you come home: our need for emotional support is increasingly being catered to by the Internet of Things Friends.
