In an interview with Ellen DeGeneres, Ryan Gosling announced that he intended to buy a companion for his robotic vacuum. When he hears it cleaning alone at night, Gosling explained, he pities the vacuum’s “tireless” efforts. “I feel bad for it,” he admitted. And so Gosling reached what is, to him, a reasonable conclusion: “I want to get a Roomba for my Roomba this year, so it has company.”
The empathy Gosling feels for his “lonely” Roomba is by no means unusual. It is an example of anthropomorphism – the attribution of human characteristics to non-human objects. Consider why children draw the sun with a happy face, or why we give names to our cars. We have a peculiar tendency to ascribe human qualities – such as emotion, fear and ambition – to inanimate objects. According to psychologists, this happens for three reasons.
- We tend to anthropomorphize when confronted with things that carry human-like attributes and resemble certain human features, like eyes. If you’ve ever seen a car outfitted with plastic eyelashes on the headlights, you’ve witnessed this phenomenon.
- We also have a desire to make the unfamiliar familiar by leveraging contextual indicators as cues to dictate an emotional response. After all, what’s more familiar than an object that is in some way made human? Is a car in a bad mood when it won’t start? By seeing the “humanity” in our surroundings, we can attempt to interpret the behaviors of non-human entities so we can understand and relate to them.
- Lastly, our desire for social interaction is a powerful force that drives us to manufacture relationships with inanimate things. Social beings by nature, humans form friendships with objects or animals as a way to cope with loneliness. Loneliness and isolation – growing epidemics in a connected world – are among the factors driving the proliferation of new anthropomorphic technologies, devices and services aimed at filling our social needs. Today, some people are ready to reject old models of relationships altogether and are actively seeking alternatives.
Natural Moments with Unnatural Partners
Increasingly sophisticated technologies are giving birth to AI-driven digital personalities, virtual beings, stay-at-home robots and other forms of synthetic companions. These new technologies allow users to control all the parameters of any given interaction to create “relationships,” which, to varying degrees, can meet a user’s practical and emotional needs.
Amazon’s Alexa and its counterparts, Apple’s Siri and Microsoft’s Cortana, are acclimatizing users to the idea that algorithms can simulate the experience of a personality. As users accept and rely on these “personalities” and the technology behind them continues to evolve, it is not difficult to imagine conventional human interdependencies, such as the Western nuclear family, being replaced by artificial mechanisms and networks of digital personas.
A more extreme example of a digital assistant is Gatebox. Created by the Japanese company Vinclu, Gatebox is poised to blend the personal assistant component of Alexa with the intimacy of a partnership. Gatebox is a holographic anime AI that “lives” in a user’s home. Its reach, however, moves beyond the domestic sphere, as the Gatebox character can interact with the user throughout the day via Gatebox’s chat application. Vinclu promises that the character will spend “natural moments” with users while channeling the essence of a trusted companion. The relationship between this AI and its partner is so blurred that on the Vinclu website, the main character offered with Gatebox refers to the user as her “husband.”
As old relationship structures are tested and overturned, technology is allowing new ideas about companionship to be explored. Going forward, these shifting relationships will challenge dominant paradigms and lead to anthropomorphism being leveraged even more extensively as a design principle.
Designing for Humans
Anthropomorphism has long prevailed in business and political contexts. Advertisers use personalities, physical human features and voices to create concepts, brands and communications that are designed to be more empathetic and elicit an emotional response. For the most part, this intentional construction of personhood is successful: Millions create “real” relationships with products, and experience true brand love and loyalty. However, the psychology of anthropomorphism can have a negative effect on business as well. Dissatisfied users and consumers can project personalities, characters and intentions onto businesses – even entire industries – and craft powerful narratives that harm specific brands. Consider the discourse around Big Pharma, a term that has become so ingrained within consumers’ consciousness that it now conjures up the image of a villainous character.
Companies responsible for creating virtual assistants have a moral obligation to assess the values and worldviews embodied in the personalities and physical features of the entities they sell. For example, will users be able to personalize their digital assistants to such an extent that they reinforce social regression (e.g., by creating overly feminized or subservient characters)? Could such a device be designed in a way that encourages socially regressive behavior?
Connecting With Others
It remains to be seen whether the myriad emerging tech-based alternatives to human relationships can fully address our psychological and physiological needs. What is clear is that our relationships, communities and social circles are being redefined and renegotiated. Just as our communities are expanding to include new types of families and groups, our relationships with technology are becoming more meaningful.
Technology has long been connecting us, and we are now asking it to connect with us. We are broadening our notion of what it means to live in an interdependent world where humans are just one part of a broader system that everyone and everything relies on. In connecting with and anthropomorphizing technology, we might ask ourselves if we are losing something that makes us human. In fact, we are not. We are only trying to create deeper, more meaningful relationships. What’s more human than that?
Contributors to this blog include Melanie Levitin, innovation strategist at Idea Couture, a Cognizant Company, and Jocelyn Jeffrey, a software developer who formerly worked at Idea Couture.