A Reflection on Jeremy Bailenson’s talk, “Transformed Social Interaction in Virtual Reality.”
In virtual worlds, people appear in the guise of avatars. These graphical representations can closely resemble the user – but they can also be radically or subtly transformed. These transformations can be apparent to all inhabitants of the virtual world, or they can be tailored to individual perspectives. With a series of ingenious experiments, Jeremy Bailenson has been studying the social and psychological effects of transforming avatar behavior and appearance.
In one experiment, Bailenson and colleagues blended the face of a viewer with that of a presidential candidate. The blend was subtle enough that the viewer did not detect it, yet the new resemblance to the candidate was effective: candidates thus transformed were perceived to be more familiar—and therefore more desirable—than candidates who were not altered.
In another experiment, an avatar programmed to maintain constant eye gaze spoke with the subject. Such persistent scrutiny is almost unheard-of in the real world – we typically look at the person we’re talking to only about 40% of the time while speaking, and about 70% of the time while listening. The intense gaze discomfited the subjects but was, at the same time, persuasive.
Other experiments focused on how one’s avatar affected one’s own behavior and perceptions. Subjects with attractive avatars felt and acted friendlier than did those who saw themselves portrayed by ugly ones. Such effects occurred even when only the subject saw the transformation: people negotiated harder and more successfully when they saw their own avatar as taller than another, even though their negotiating partner did not see the transformed height.
This work raises many ethical questions and forces us to articulate what, exactly, we mean by an “honest” representation – and when we actually want it.
During an election, candidates play different roles in front of different audiences. They may appear in plaid shirts and jeans to address a group of farmers, and in jackets and ties for a dinner with corporate executives. They may even shift the cadences of their speech, adding, say, a drawl in the South. Is this mimicry dishonest, or is it a reasonable way of expressing comradeship with the audience?
Mimicry is integral to our social interactions. In face-to-face conversation, we subconsciously express empathy and solidarity by mimicking each other’s verbal cadences and movements. This mirroring not only reflects the empathy between the parties, it also helps form it. However, this ordinarily subconscious and socially beneficial behavior can be deliberately exploited by someone who wants to seem amicably like-minded but who actually has ulterior, if not predatory, motives.
One of Bailenson’s experiments showed that avatars programmed to mimic the subject’s gestures were more persuasive and well-liked than avatars using naturalistic but non-mimicking gestures. Is mimicry carried out via avatar simply an extension of the same social adaptability, or is it fundamentally different?
I would argue that the automatic simulation of mimicry is fundamentally different, even from the most deliberate and calculated of face-to-face imitations. The candidate who copies the clothes and cadences of his or her potential voters, or the empathy-faking listener, must at least pay close attention to the audience’s actions and go through the experience of acting like them. When the mimicry is transposed to the virtual world, the person behind the avatar experiences no such affinity. The intimacy is purely illusory.
Yet before we relegate such socially smooth avatar behaviors to the category of inherently dishonest depictions, it is worth considering the alternative. If we are to have embodied online interactions – and the massive popularity of avatar-based places and games indicates they will be of growing importance – the avatars need to have some level of automatic behavior. If you want your avatar to move, you don’t want to laboriously animate each step of its gait; you want it to have a walking algorithm. And, arguably, if you want your avatar to be social, you don’t want to laboriously animate each nod and gesture; you want it to have social interaction algorithms. The question becomes: where do we want to draw the line? Where does an algorithm help make the avatar experience come alive, and where do we want the active engagement of the participants to control this behavior?
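To make this concrete, a minimal sketch of what one such social interaction algorithm might look like – mimicry implemented as a delayed replay of the interlocutor’s movements, so the copying is offset enough to go unnoticed. The class name, frame rate, and delay length here are illustrative assumptions, not details from the talk:

```python
from collections import deque

class DelayedMimic:
    """Replays a conversation partner's pose after a fixed delay.

    The avatar's gestures thus echo the partner's own movements a
    few seconds later - mimicry subtle enough to escape detection.
    """

    def __init__(self, delay_frames: int):
        self.delay_frames = delay_frames
        self.buffer = deque()  # queue of recent partner poses

    def update(self, partner_pose):
        """Record the partner's pose for this frame.

        Returns the pose the avatar should adopt now, or None
        until enough frames have accumulated to cover the delay.
        """
        self.buffer.append(partner_pose)
        if len(self.buffer) > self.delay_frames:
            return self.buffer.popleft()
        return None

# E.g., at an assumed 30 frames/second, a 4-second lag is 120 frames.
mimic = DelayedMimic(delay_frames=120)
```

The design point is the lag itself: an avatar that copied its partner instantly would be an obvious mirror, while a delayed copy reads as natural, empathic body language – which is precisely what makes the automation ethically fraught.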
Part of what makes Bailenson’s research so thought-provoking is that in reacting to the prospect of automated persuasion, we are forced to confront our beliefs and practices around simulated empathy in our everyday life. From the cashier’s cheery “have a nice day” to the waiter’s praise of our discerning menu choices, we enjoy the warmth of virtual friendliness. Much of the vast “service industry” is built on imitation camaraderie, and we complain bitterly when it is absent. For society to function, much faking is needed.
Bailenson’s experiments also touch on the illusions inherent in our relationship with our self. People who saw their own avatar as taller than others did better in negotiations – even though only they saw the height differential. People who saw their own avatar as attractive were more confident and friendly. People who saw their avatar get visibly fatter when eating were more successful dieters. These have fascinating implications, both exciting and disturbing, for our increasingly simulated lives.
Judith Donath is a Berkman Faculty Fellow and was the founding director of the Sociable Media Group at the MIT Media Lab. She is leading the Berkman Center for Internet & Society’s Law Lab Spring 2010 Speaker Series: The Psychology and Economics of Trust and Honesty. Judith’s work focuses on the social side of computing, synthesizing knowledge from fields such as graphic design, urban studies and cognitive science to build innovative interfaces for online communities and virtual identities. She is known internationally for pioneering research in social visualization, interface design, and computer mediated interaction.