[Repost] Imagined, but somehow real
Virtual Reality affects how we see ourselves and others. What if you had a tail? What if you were homeless? Researchers have something to say about it.
Read time: about 12 minutes. This week: Virtual Reality and Augmented Reality are maturing. If you look beyond the hype, they are real advances. What if you had a VR tail? Can VR affect empathy?
During the first days of October 2023 I’ll dive into “Extended Reality” (more commonly called “XR”). My seminar on “our complex relationships with technology” kicks off next week with a virtual reality field trip at Duke’s MPS Lab and follows a week later with a guest who’s deeply engaged with virtual reality in medical training (and who wrote a guest post here in 2022 called “Technological rubble”). And on Thursday next week, I’m moderating a panel at Duke’s “Emerging Pedagogies Summit” about XR at the university — four creative faculty members from across the institution will discuss, and I hope banter a bit about, how virtual reality (VR), augmented reality (AR), and mixed reality (MR?) figure into their teaching and research.
Emerging technologies are often disputed technologies, and XR technologies are no exception. A couple of weeks ago, Meta was ridiculed for finally giving legs to the lonely VR avatars previously condemned to float around in the bereft metaverse — a feature previewed about a year ago. Now they can walk, I guess, though mostly on lonely strolls. Lots of people doubt that workers will accept wearing a VR headset for very long in their virtual offices, though HR might think it’s a good idea. Shoshana Zuboff warns that XR devices can be masterful collectors of personal data, and can even pull Pokémon into the service of surveillance and manipulation. A Substack writer and engineering professor at Harvey Mudd College reminds us that human contact is needed in teaching and in Real Life.
But others blunt the critique by exploring some of the powers that XR technologies can offer. This post from exactly a year ago reflects on last year’s experience of VR and rounds up a couple of interesting papers that show some of the imaginative and evocative power of XR.
Perhaps in the future, once some basic technical issues of XR and its application in society have been resolved (or at least made manageable), teachers and researchers will reach for XR devices the way we reach for books today.
Imagined, but somehow real
The first time I experienced virtual reality (VR) was about five years ago. Actually, today you might call it a virtual virtual reality experience, because it was a MacGyvered affair. I pulled my smartphone out of my pocket, loaded an app, and inserted the phone into a flimsy cardboard case that resembled the kind of swag vendors hand out at trade shows. A flimsy elastic band wrapped around my head may have held the thing to my face, too — I can’t recall exactly. As dodgy as the contraption was, its effect was amazing to me, though no one would call its VR photorealistic or seamless. My phone’s pixel count and screen refresh rate couldn’t help but make the graphics rudimentary and “jerky.” Still, I do recall stumbling a bit as I tried to lean on a VR table. Of course, it wasn’t there in real life.
That was my VR experience up until this week, when my students and I converged on Duke’s “MPS Lab” (Multimedia Production Studio). We sampled state-of-the-art VR headsets — Vive Pro 2 models. It was quite an upgrade from my earlier cardboard-and-smartphone model.
The cardboard model, despite its shortcomings, primed my experience, and that was for the best. Our VR tour guide told us that he agreed with a future seminar guest that more than a single visit to the lab might be useful. “The first time you use a headset,” he said, “you have a ‘Whoa!’ experience. A second time lets you think about it more critically.” When you’re plopped into a different world, you need some familiarity with the oddness of the experience before you gain “critical distance.” Replacing our Real World with a VR world certainly unsettles us, though only temporarily.
Collaborators on a large-scale project at Stanford University summed up the jarring “replacement” of reality that VR systems accomplish: “VR systems block out the perceptual input from the real world and replace it with perceptual input from a virtual environment that surrounds the user, is fully responsive to the user’s actions, and elicits feelings of presence.” Even with my cardboard viewer, the correspondence between my movements and the VR-created environment was just good enough to allow a certain willing deception. This week’s VR journey was even more realistic. I could see “my” robotic hands in front of me, floating as they did in the virtual space, and I could manipulate them as if they were my own, the ones attached to my real arms and body. Within the VR world of the moment they also were my own, in a manner of speaking. I turned my head, and the virtual world shifted, just as it should.
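To make that last point concrete, here is a minimal, purely illustrative sketch (mine, not anything from the Stanford project) of the basic update a VR runtime performs every frame: read the headset’s orientation and swing the rendered view to match. The coordinate conventions and the single yaw rotation are simplifying assumptions for the example.

```python
import math

def yaw_rotation(view_dir, yaw_radians):
    """Rotate a 3-D view direction around the vertical (y) axis.

    A toy stand-in for what a VR runtime does each frame: take the
    headset's reported orientation and re-render the scene from the
    new viewpoint, so the virtual world shifts as the head turns.
    """
    x, y, z = view_dir
    c, s = math.cos(yaw_radians), math.sin(yaw_radians)
    return (c * x + s * z, y, -s * x + c * z)

# Looking straight ahead (down the -z axis), then turning the head 90 degrees:
forward = (0.0, 0.0, -1.0)
print(yaw_rotation(forward, math.radians(90)))  # roughly (-1.0, 0.0, 0.0): the view has swung with the head
```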
Body transfer illusion. Body ownership and control.
You may have seen the video of different participants in an experiment to test what’s called the “body transfer illusion.” This specific version, known as the “rubber hand illusion,” was first reported in 1998 and has since been studied more deeply. The setup is simple: a participant’s left hand is hidden behind a barrier and a fake hand is placed in view, where the participant can see it, sitting as though it were his or her left hand. The experimenter strokes the fake and real hands simultaneously, until the participant feels as though the fake hand is a real part of the body. The eyes deceive.
And the deception is particularly powerful, even though participants know quite well that the hand is fake. Take a look at this Facebook video to see participants react when the fake hand is threatened.
VR systems work their magic on our perception because of the plasticity of human perception: our brains take what our senses deliver and assemble those signals into experience. VR replaces the content of our senses and lets our brains assemble a coherent, even very plausible, virtual world. In the process, we inhabit a different “body.”
The body transfer illusion of course opens the possibility of “stepping into another person’s shoes,” as the old adage goes, and researchers have used VR to explore aspects of our consciousness and emotions. Their work has revealed some of the mechanisms at play in our perceptions of how we “own” and “control” our bodies — plumbing our internal experience of embodiment. Others have sought to measure the ways that VR can change our judgments and perceptions of others, too, which is in keeping with a widespread notion that virtual reality systems are the “ultimate empathy machines.”
What would you do if you had a tail?
I can’t help but think that the lab at University College London was thinking about James Cameron’s Avatar (2009) when they were cooking up the study that they reported in “Human Tails: Ownership and Control of Extended Humanoid Avatars.”
William Steptoe, Anthony Steed, and Mel Slater wondered how perceptions of “body ownership” and “body control” would differ between participants with a virtual tail that they could learn to control — think hip movements — and others with a tail that was not under their control. They found that participants with controllable tails “quickly learn how to remap normal degrees of bodily freedom in order to control virtual bodies that differ from the humanoid form.” And they had a “higher degree of body ownership and agency.” They also were more anxious when their virtual tails and bodies were threatened.
It’s a sort of nerdy body illusion experiment with an Avatar movie flavor. But it gets better.
The UCL team devised a full-body game for the experiment. They didn’t use a headset-based system, opting instead for a “CAVE-like system” — a four-walled set-up with projections that completely surround the user. Participants used their tailed avatars to intercept “green particle beams fired towards them from emitters positioned about five m[eters] in front of them.” They could intercept the beams with their virtual arms or legs and, of course, tails; by design, some of the beams were out of reach by arm or leg and could only be captured by tails, which were about a half meter longer than arm or leg. The researchers included an “orthographic view showing emitter position and avatar” in figure three. You do get a sense of the tail.
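The remapping idea is easy to picture in code. Below is a toy sketch of my own, not the UCL team’s implementation: tracked hip sway drives the virtual tail’s swing, and a simple reach check decides whether a beam can be intercepted by arm or leg or only by the longer tail. The reach lengths and the gain are assumptions for illustration; only the “about half a meter longer” detail comes from the study.

```python
# Toy illustration of the "synchronous tail" idea, not the UCL implementation.
# Tracked hip sway is remapped to drive a virtual tail, and reach checks
# decide which limb could intercept a beam.

ARM_REACH = 0.8                  # meters; an assumed arm/leg reach for illustration
TAIL_REACH = ARM_REACH + 0.5     # the tail was about half a meter longer

def tail_swing_angle(hip_sway_deg, gain=3.0):
    """Map a small hip sway (degrees) onto a larger tail swing, clamped to +/-90 degrees."""
    return max(-90.0, min(90.0, gain * hip_sway_deg))

def which_limb_can_reach(target_distance_m):
    """Decide whether a beam's endpoint is reachable by arm/leg, only by tail, or not at all."""
    if target_distance_m <= ARM_REACH:
        return "arm or leg"
    if target_distance_m <= TAIL_REACH:
        return "tail only"
    return "out of reach"

print(tail_swing_angle(10))        # 30.0 degrees of tail swing from a 10-degree hip sway
print(which_limb_can_reach(1.1))   # 'tail only'
```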
Of course, the hapless participants with listless or wayward tails were at a distinct disadvantage, since they couldn’t control their tails with hip movements or anything else.
As with any good game, the denouement is spectacular. The researchers describe the scene, perhaps slipping into a bit of gamer’s enthusiasm:
Immediately and seamlessly following the game stage, a threat occurred to the avatar’s tail, and then to its whole body…. [A] recognizable signal was required, and for this we chose fire. At the climax of the game stage, the emitters all slide into a central position, a high-pitched alarm sounds, and the lighting changes from bright green to bright red. The emitters then burst outwards, roaring flames towards the participant. The avatar’s tail sets on fire and starts to burn down towards the body over a period of thirty seconds. When the fire reaches the body, the body itself bursts into flames and continues to burn for thirty seconds until the displays fade to black and the experiment is over.
GAME OVER!
(By the way, figure four in the article shows the tragic progression to the fateful end, but I didn’t include it here because you might find it far too stimulating. You can read the article!)
But the research continued with questionnaires to determine the participants’ sense of ownership and control during the experiment and also how they felt when their tail turned into, well, a burning fuse. “The threat [of the fire] sought to elicit anxiety responses from participants, thereby providing insight into the extent of body ownership they were experiencing.”
Having a tail that participants could control made the difference in how much body ownership and control they experienced virtually. They were “more likely to both feel ownership of the extended-humanoid body [i.e., the avatar], and to consider the tail extension as being a part of that body.” They also felt a greater sense of the ghastly threat, “manifesting as anxiety or feeling the need to extinguish the flames.”
My brief report grossly oversimplifies the findings of the study. What particularly intrigued me was how the study revealed the enormous ability of human beings to extend control and even assume a different body form within an immersive virtual environment. The researchers noted that participants’ investment in the tail was much greater than the kind of connection we normally experience with, say, a set of pliers, a hammer, or a wrench. Tools are not body parts, but for participants in the virtual world, the avatar tail was “owned” and (at least for the lucky participants with “synchronous” tails) “controlled.” It was virtually part of them, and that counted for a lot.
Beyond those significant results, participants felt emotional and psychological states (like anxiety) as a result of a virtual threat to their virtual bodies. They were internally transformed as their bodies were virtually remade.
Still, I have a question: Where do I sign up to be a participant?
What if you were homeless?
In 2018, Fernanda Herrera, Jeremy Bailenson, Erika Weisz, Elise Ogle, and Jamil Zaki reported that they had put VR to a rigorous test: they wanted to discover whether VR provides a means of “perspective-taking” that changes behaviors more effectively than other methods and media. Their PLOS ONE article bears the title “Building long-term empathy: A large-scale comparison of traditional and virtual reality perspective-taking.” In effect, they studied whether VR could help people step into the shoes of others and whether that experience would deepen empathy for the people whose shoes they virtually wore.
Who were the people? The homeless.
“Perspective-taking” takes a couple of forms. It can be more a matter of observation: looking at another person who is, in the case of the study, homeless. Or it can be more first-person: living and directly experiencing being homeless in a virtual environment. The researchers chose the latter: VR systems placed participants in the situation of being homeless, allowing them to experience the progression of events leading to homelessness and some of its consequences. As a comparative means of “perspective-taking,” they had another set of participants read written narratives and imagine themselves in the same situation. Both the VR experience and the narratives described the same events, but the researchers note that “different types of media such as books, TVs, computers, and VR fall under different levels of immersion and interactivity, [which] may help to explain why extant research has mixed results regarding mediated perspective-taking tasks and their effects on empathy and prosocial behaviors.” For a second study conducted after the first, another type of media was added: an informational packet.
Does the difference in the ways of “perspective-taking” affect the empathy that participants feel (or not) for the homeless? And does any change in empathy stick over a longer period of time?
Here’s the scenario that is cast into the VR and narrative media:
The narrative begins with the participants sitting in their apartment after losing their job and realizing rent is due. Despite selling most of their belongings, participants are not able to raise enough money to pay rent and are evicted from their apartment. Forced to live out of their car, they prepare themselves for the night by trying to find their toothbrush and other items needed to brush their teeth. Participants suddenly hear a police siren and are approached by a police officer who discovers the participants are living out of their car. Due to a city ordinance prohibiting the use of cars as homes in public spaces, the car is impounded. Participants are now traveling on a bus at night for shelter and warmth, when they are warned that there are two men onboard who may serve as a threat to them. One man may try to get unpleasantly close to the participant while the other may try to steal the participant’s backpack. In the bus, participants also interact with other non-threatening homeless people and learn about their experiences…. The narrative represents the lived experiences of veterans, families crippled by medical bills, victims of domestic violence, and drug addicts.
Pretty grim.
The research team was interested in empathy changes over time, too. Over an eight-week period, participants responded to three surveys and were given “behavioral measures”: an opportunity to sign a petition supporting Proposition A, which was directed at homelessness, and, separately, one supporting “Measure B,” which would provide for affordable housing; an opportunity to donate part of their remuneration for taking part in the study; and letter writing.
The breadth and depth of the data that the studies collected were impressive, and these were not small studies: study 1 had 130 participants, and study 2 had 452.
Researchers summarized results this way:
Participants in the VRPT [i.e., “Virtual Reality Perspective-Taking”] condition reported significantly more empathy and more personal distress immediately after the intervention. However, over time, participants in both the VRPT and NPT [i.e., “Narrative Perspective-Taking”] conditions reported similar rates of empathy and personal distress. The results for attitudes toward the homeless and the dehumanization scale show the opposite pattern. Even though both conditions reported similar rates of dehumanization and attitudes toward the homeless at the time of the intervention, participants in the NPT condition thought of the homeless as less evolved over time, and the attitudes they had for the homeless deteriorated in the eight weeks that followed the intervention. In contrast, the VRPT condition, which allowed participants to interact with the virtual environment in real-time, led to more positive, longer-lasting attitudes toward the homeless up to two months after the intervention.
While the article overall is good to read — and I certainly haven’t recounted it in detail — I was particularly intrigued by the researchers’ observations in the “general discussion” about the qualities of the letters participants were asked to send to elected officials about homelessness. “Participants in the VRPT condition used significantly more first-person plural pronouns (e.g., we, our, us),” the researchers noted. “Sample statements include ‘We must find ways to address the reasons why people become and stay homeless: job loss, mental health issues, high cost of living, lack of affordable housing…’.” In short, the researchers said that “participants in the VRPT condition included themselves as part of the solution rather than telling the elected official what they or the government should do.”
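That pronoun tally is easy to reproduce in spirit. Here is a small sketch of my own (not the authors’ analysis pipeline) that counts first-person plural pronouns in a letter’s text; the sample string adapts the statement quoted above and adds one invented closing sentence purely for illustration.

```python
import re
from collections import Counter

FIRST_PERSON_PLURAL = {"we", "us", "our", "ours", "ourselves"}

def count_first_person_plural(letter_text):
    """Tally first-person plural pronouns in a letter, case-insensitively."""
    words = re.findall(r"[a-z']+", letter_text.lower())
    counts = Counter(w for w in words if w in FIRST_PERSON_PLURAL)
    return sum(counts.values()), counts

# Sample text adapted from the quoted statement, with one invented final
# sentence added just to exercise the counter.
total, breakdown = count_first_person_plural(
    "We must find ways to address the reasons why people become and stay "
    "homeless: job loss, mental health issues, high cost of living, lack of "
    "affordable housing. Our city can do better for us."
)
print(total, dict(breakdown))  # 3 {'we': 1, 'our': 1, 'us': 1}
```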
A new and emerging power
Some personal experience with VR, and the studies researchers have done to explore the range of virtual reality’s influence, have made me realize that I’ve thought too narrowly about the power VR might have in society. It may well be as transformative as many hope, but it might be misapplied as well. I’m hoping we can bring a greater sophistication and a stronger sense of responsibility to larger social issues and concerns than we seem to have had in the earlier stages of the commercial Internet.
My thanks to Anu Kirk and David Zielinski for leading me to interesting papers on VR.
Tags: VR, AR, metaverse, visualization, body transfer illusion, body ownership, homelessness, gaming, avatar
Links, cited and not, some just interesting
Amusing Augmented Reality (AR). I’m ready for bagels flying in my living room, too! Kadet, Anne. “There’s a Giant Flock of Bagels Flying Around My Living Room!” Substack newsletter. CAFÉ ANNE (blog), August 8, 2022.
Cummings, James J., and Jeremy N. Bailenson. “How Immersive Is Enough? A Meta-Analysis of the Effect of Immersive Technology on User Presence.” Media Psychology 19, no. 2 (April 2, 2016): 272–309. https://doi.org/10.1080/15213269.2015.1015740.
By the way, the 3D version of Avatar is in theaters: Steptoe, William, Anthony Steed, and Mel Slater. “Human Tails: Ownership and Control of Extended Humanoid Avatars.” IEEE Transactions on Visualization and Computer Graphics 19, no. 4 (April 2013): 583–90. https://doi.org/10.1109/TVCG.2013.32.
A major study that looks at the effectiveness of VR in influencing empathy over a long-term: Herrera, Fernanda, Jeremy Bailenson, Erika Weisz, Elise Ogle, and Jamil Zaki. “Building Long-Term Empathy: A Large-Scale Comparison of Traditional and Virtual Reality Perspective-Taking.” PLOS ONE 13, no. 10 (October 17, 2018): e0204494. https://doi.org/10.1371/journal.pone.0204494.
Ultimately, it doesn’t look like VR would be any different from the Real World? Sutherland, Ivan E. “The Ultimate Display.” In Proceedings of IFIP Congress, 506–8, 1965.
And selected links from the “daily missives” sent to members of the seminar over the past week:
Parshina-Kottas, Yuliya, Anjali Singhvi, Audra D. S. Burch, Troy Griggs, Mika Gröndahl, Lingdong Huang, Tim Wallace, Jeremy White, and Josh Williams. “What the Tulsa Race Massacre Destroyed.” The New York Times, May 24, 2021, sec. U.S. https://www.nytimes.com/interactive/2021/05/24/us/tulsa-race-massacre.html.
National Portrait Gallery. “The Outwin 2022: American Portraiture Today,” April 22, 2022. https://npg.si.edu/exhibition/outwin-2022. See “Promise Me by Cheryl Mukherji,” 2022. https://portraitcompetition.si.edu/exhibition/2022-outwin-boochever-portrait-competition/promise-me.
Morales, Suzi. “Can Artificial Intelligence Invent Things? A Curious Legal Case Could Have Big Implications for Business.” Observer, September 21, 2022. https://observer.com/2022/09/can-artificial-intelligence-invent-things-a-curious-legal-case-could-have-big-implications-for-business/.
Herrman, John. “AI Art Is Here and the World Is Already Different.” Intelligencer, September 19, 2022. https://nymag.com/intelligencer/2022/09/ai-art-is-here-and-the-world-is-already-different.html.
With Human Help, AIs Are Generating a New Aesthetics. The Results Are Trippy. Aeon Videos, 2022. https://aeon.co/videos/with-human-help-ais-are-generating-a-new-aesthetics-the-results-are-trippy.
Vuocolo, Alex. “The Disappearing Art of Maintenance.” Noema, September 22, 2022. https://www.noemamag.com/the-disappearing-art-of-maintenance.
Mark, thanks for sharing that study on VR and homelessness and for linking to my piece from the other week as well. It's interesting to think of how these types of immersive experiences might be used to help us get more in touch with what it means to be more deeply human. For as many instances as there are for this type of technology to go wrong, there are also potential upsides.
Your newsletter also reminded me of this podcast conversation between Zuck and Lex Fridman that Lex released earlier this week. They did the whole interview in "the metaverse" using avatars. Got lots of thoughts but not sure they're cohesive yet... https://www.youtube.com/watch?v=MVYrJJNdrEg