Millions of people now interact in virtual worlds like Second Life and World of Warcraft through avatars. In these spaces, users can design their avatars to be subtly or radically different from who they are in real life.
It turns out that how people interact through their avatars – the signals they give one another through conversation and appearance – can tell us a lot about the choices and biases that inform our behavior in the real world.
Jeremy Bailenson of Stanford University’s Virtual Human Interaction Lab has been running a number of experiments with people, avatars, and virtual worlds. As avatars become more common and more useful outside of gaming – people are already using them in virtual workplaces, customer service, and advertising – questions of ethics, trust, and honesty become significantly more important.
After all, it’s one thing if your avatar is casually conversing with, battling, or dating another avatar who might not be what he or she seems in real life. It’s quite another when corporations or political candidates realize that they can handcraft an avatar to take advantage of your biases and earn your trust for their own purposes.
Jeremy sat down with Judith Donath – who leads the Berkman Center’s Law Lab Spring 2010 Speaker Series: The Psychology and Economics of Trust and Honesty – to talk more about this fascinating topic.
…also in Ogg!
Watch the segment featuring the work of Jeremy and the Virtual Human Interaction Lab at Stanford
Watch Jeremy’s recent talk at the Berkman Center
Notes from the talk from Judith Donath
Photo courtesy of Flickr user bettinatizzy
Subscribe to Radio Berkman