Extreme weather events are now dramatizing the effect humans are having on the planet. Yet we still face great challenges in staving off irrevocable climate change. The difficulty isn't simply convincing skeptical politicians; it's getting the public to visualize how their behaviors (like driving a gas-guzzling car or living in an energy-inefficient home) contribute to a problem that may not fully manifest for decades. Our previous research has shown that virtual reality is uniquely effective at changing conservation behavior, as evidenced in studies on reducing paper use and on hot water conservation.
We are currently pursuing two projects that will utilize the affordances of virtual reality to teach people about the effects of climate change in marine environments:
Very few people have firsthand experience diving among reefs teeming with coral and fish life, so most of us have no exposure to the animals that will eventually disappear if our behavior doesn't change. Even those who do cannot see the degradation in real time. Most people have either never heard of ocean acidification (the process by which the ocean becomes more acidic as it absorbs the carbon dioxide we release into the atmosphere) or wrongly assume it is another term for acid rain.

In our experiments, learners move through the multiple phases of the process that produces ocean acidification. The first phase traces ocean acidification back to the burning of fossil fuels: learners follow CO2 molecules as they are released into the atmosphere and absorbed by the surface water of the ocean. In subsequent phases, learners embody or interact with different ocean species and witness the changes in those species' ecosystems as CO2 levels rise.

These simulations will be guided by our marine science collaborators, Kristy Kroeker and Fio Michelli, and will be formatted to accommodate lessons for various age groups. Our collaborator Roy Pea will guide the learning science portion of the design, testing, and outreach for the simulations. We will use mobile VR equipment to collect data from a large and demographically diverse sample outside of the laboratory context. This project is sponsored by the Gordon and Betty Moore Foundation, and builds on the lab's previous research on ocean acidification. The video below from SF Gate focuses on that earlier project, which allowed a participant to embody a piece of coral and learn about ocean acidification. To read the full article, which touches on other environmental projects at VHIL, see here.
This project will transfer movement data from electronically tagged fish in the kelp forests of Monterey Bay into virtual reality, where humans can enter the underwater realm to observe virtual versions of live fish. Our collaborators in the Goldbogen lab will construct and deploy the underwater tracking sensors, and we will build the systems that display the fish avatars. We are studying, from a psychological standpoint, how the experience of seeing "real" fish avatars in virtual reality differs from watching recorded or simulated agents of those fish. Our previous research has demonstrated that avatars elicit greater physiological arousal and learning than agents do. The end goal of this project is to let anyone on the planet "adopt a fish" through a VR head-mounted display. This project is sponsored by the Stanford Woods Institute for the Environment.