Building trees in the Metaverse might actually save the forest

A team of programmers and scientists is creating 3D models of "certified real" trees — the first of their kind.

In the Carolina Sandhills, the hot sun burns you by 11 am. But this day was pleasant in the woods, where the loose canopy gently shaded the forest floor. Stocked up on sunscreen, 360-degree cameras, and sandwiches, the group drove about an hour south of Durham to a forest in Southern Pines.

Julie Moore, an endangered species biologist, knew precisely where to find her favorite tree: the longleaf pine. Her goal was to give the crew of cinematographers and gamers an iconic representation of the tree. They wanted to create a digital version for the Metaverse: the collection of all virtual worlds — including the internet, shared virtual spaces (like in video games), and conceptual virtual worlds, like the one created for the movie Ready Player One.

The group is the first to build scientifically accurate virtual trees. They are creating 3D models, in fine detail, that look and behave just like real trees at every life stage. These “certified scientific” trees are the first of their kind, pioneering a new level of authenticity in VR.

It’s Not “Just a Pine Tree”

The Sandhills region is known for dry sandy soils, wiregrass, and longleaf pine trees. Longleaf once covered more than half of the coastal plains, from eastern Texas to southeast Virginia. Today, only 1% of that area still supports longleaf pine.

Moore says the team’s first attempt at creating the digital longleaf “wasn’t too good, as far as ‘realistic’ goes.”

“It didn’t look like a longleaf. That was the problem. It looked like ‘just a pine tree.’ Unless you’ve seen one — and smell it and hear it and get close to it — its essence is hard to find. It is a tree of a system. It’s not an individual tree,” says Moore.

The team had initially used a cultivated tree stand at Duke Forest as their muse. But to know a longleaf pine, to really know it, you have to know the entire forest.

As they hiked into the forest, a high whistling noise vibrated through the treetops. It is a distinctive sound of this forest, owing to the lengthy needles of the longleaf pine tree. They examined big trees, some 80 feet tall and a few hundred years old, and little two-inch seedlings, less than a year old.

Moore talked nonstop — about the grasses, the bark, the needles, and things blooming in the ground cover.

Observing the bark, filmmaker Derek Rowe said, “it’s like paper stacked on top of each other.”

“When you stand up to the trunk, you can really see stuff pop out,” replied 3D artist Samuel Rice. “I want to capture that.”

For Rice, trees and nature had always been a “background element” in a 3D scene. They didn’t require much detail. But with this 3D modeling project, Rice could finally infuse texture into a model tree.

The idea for building a 3D model of a longleaf pine came together over a cup of tea at The Honeysuckle Tea House in Chapel Hill, North Carolina. The tea house is run by a nonprofit called Unique Places to Save. They make strategic investments to help preserve natural spaces. For instance, the teahouse operations help protect the surrounding natural space from development.

Founder Jeff Fisher told Rowe that if the virtual world, like in video games, were accurate, it could be a doorway to rediscovering nature. VR nature could make wild spaces accessible to people who can’t reach them because of physical disabilities, distance, or fear.

Rowe understands this sentiment well. In 2015, he launched WildEyes to film the national parks in VR so inner-city children can experience them. Fisher and Rowe put their heads together, amassed a diverse team, and NatureXR was born. Their goal is to leverage the VR world to help scientists, decision-makers, and other stakeholders fundraise and simulate nature in a new kind of conservation: virtual conservation.

Rowe says the existing 3D tree models aren’t good enough. “They’re too low-quality. You walk up to them (in the virtual world), and they’re so pixelated. In video games, people mostly don’t care,” says Rowe. “From an educational standpoint, from a science standpoint, they need to look the way it does in real life. You need those details to be there.”

Needle by Needle

A virtual world that mimics the real one is only as good as the objects you put into it. Many of those objects come from Quixel’s Megascans Library, the world’s largest library of 3D assets. Eric Ramberg, the company’s Chief Content Officer, wrote that full trees are “the most requested type of asset.”

But tree avatars that accurately depict their real-life counterparts are entirely absent from the digital marketplace. It didn’t take long for the team to realize why: true-to-life tree avatars don’t exist because they are far too difficult to create using conventional methods.

The team first considered photogrammetry, a standard method for making VR models. By taking overlapping photos of an object and using software to stitch them together, modelers can build 3D models of buildings, furniture, and other objects.
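
NatureXR hasn’t published its pipeline, and real photogrammetry tools chain together hundreds of views, but the core idea can be sketched in a few lines of Python with OpenCV: match features shared by two overlapping photos, recover the camera motion between them, and triangulate the shared points into 3D. The image names and focal length below are hypothetical.

```python
# Minimal two-view structure-from-motion sketch with OpenCV.
# Real photogrammetry chains many views; this shows only the core idea.
import cv2
import numpy as np

# Hypothetical inputs: two overlapping photos and a guessed camera matrix.
img1 = cv2.imread("trunk_view_01.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("trunk_view_02.jpg", cv2.IMREAD_GRAYSCALE)
focal = 2000.0                                  # assumed focal length, pixels
cx, cy = img1.shape[1] / 2, img1.shape[0] / 2
K = np.array([[focal, 0, cx], [0, focal, cy], [0, 0, 1]])

# 1. Detect and match distinctive features visible in both photos.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
pairs = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# 2. Estimate how the camera moved between the two shots.
E, _ = cv2.findEssentialMat(pts1, pts2, K, cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

# 3. Triangulate a sparse 3D point cloud of the photographed surface.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (pts4d[:3] / pts4d[3]).T                # N x 3 points, up to scale
print(f"Recovered {len(cloud)} sparse 3D points")
```

Dense reconstruction, meshing, and texturing build on that sparse cloud, which is why the method works well for buildings and furniture.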

Tree branches, however, are in constant, gentle motion, which blurs photogrammetry models, and every branch occludes another, making it impossible to photograph every part of the tree.

Biologist Duncan J. Irschick says that 3D modeling for the gaming industry, which does not require scientific accuracy, isn’t too difficult. But to model a tree accurately would be “incredibly challenging.”

Irschick is the director of the Digital Life Project, a project launched by researchers at the University of Massachusetts, Amherst. In 2017, they began building scientifically accurate 3D models of animals as a type of digital Noah’s Ark. The idea was to preserve animals before they go extinct so researchers can continue to study them — a virtual version of them. They’ve created 3D models of animals from an endangered sea turtle to a hissing cockroach, but they haven’t yet created a tree.

“I’m a little doubtful that they will be able to do that easily because that means they have to render every single pine needle or leaf,” Irschick says. The Digital Life Project embeds its 3D models with metadata on how the animal moves or behaves. “If their goal is scientific accuracy, which is what we do with our animals, then that’s very, very time-consuming. I think it’d be very tough.”

Rowe describes NatureXR’s work as building a “procedural tree.” Many 3D models that are recreated from photographs are identical representations of the real-life subject. Rowe says, for a tree, that wouldn’t be good enough. Imagine creating a forest with dozens of identical trees — hardly a realistic vision of a dynamic forest.

Instead, NatureXR’s “procedural” trees are more like a formula for a tree. They are importing photos of branches or needles into SpeedTree, a vegetation modeling software. The software allows them to enter parameters such as alternate or opposing branching patterns, or the number of cones present at each life stage. Then the software generates a forest of trees — uniquely individual, but accurate leaf by leaf, needle by needle. The stunning make-believe forests in the movie Avatar were created using SpeedTree.
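
SpeedTree’s actual rule sets are proprietary, so the sketch below only illustrates the “formula for a tree” idea in miniature: a handful of species-level parameters plus a random seed yield endlessly varied but structurally consistent individuals. The parameter names and values here are illustrative, not NatureXR’s.

```python
# A toy "procedural tree": species-level rules in, unique individuals out.
# Parameters and values are illustrative, not NatureXR's or SpeedTree's.
import math
import random
from dataclasses import dataclass

@dataclass
class SpeciesRules:
    branch_angle: float      # spread, in degrees, across sibling branches
    branches_per_node: int   # e.g., whorled branching near the crown
    length_ratio: float      # child branch length relative to its parent
    max_depth: int           # how many branching generations to grow

LONGLEAF_LIKE = SpeciesRules(branch_angle=70.0, branches_per_node=4,
                             length_ratio=0.62, max_depth=5)

def grow(rules: SpeciesRules, seed: int):
    """Return a list of 2D branch segments ((x1, y1), (x2, y2)) for one tree."""
    rng = random.Random(seed)          # same seed -> the same individual tree
    segments = []

    def branch(x, y, heading_deg, length, depth):
        if depth > rules.max_depth:
            return
        x2 = x + length * math.cos(math.radians(heading_deg))
        y2 = y + length * math.sin(math.radians(heading_deg))
        segments.append(((x, y), (x2, y2)))
        for i in range(rules.branches_per_node):
            # Spread children across the branch angle, with a little jitter
            # so no two trees (and no two branches) are ever identical.
            spread = rules.branch_angle * (i / max(rules.branches_per_node - 1, 1) - 0.5)
            branch(x2, y2, heading_deg + spread + rng.uniform(-10, 10),
                   length * rules.length_ratio, depth + 1)

    branch(0.0, 0.0, 90.0, 10.0, 0)    # the trunk grows straight up
    return segments

# A "forest" of unique but species-consistent trees: one seed per individual.
forest = [grow(LONGLEAF_LIKE, seed) for seed in range(50)]
print(len(forest), "trees,", len(forest[0]), "segments in the first one")
```

Change the seed and you get a new individual; change the rules and you get a new species. That separation between species rules and individual trees is what lets one carefully measured longleaf become an entire stand.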

Simple 3D objects have characteristics that make them appear “real.” For example, a 3D rendering of a basketball has a realistic bounce, shine, and shadow. NatureXR needs that same realism, and more, to make it possible for researchers to model the impact of conservation techniques in advance, view forests of the past and future, and persuade policymakers to enact strategic protections.

“I think it’s really important to create accurate 3D models of life on earth. We found that to be important for heritage, because trees just like other animals can be driven to extinction,” Irschick says. His 3D animal models are open-access and available for anyone.

NatureXR is flying a drone with a LiDAR scanner above the forest to capture important information about the tree’s ecosystem. The scanner maps the ecosystem, creating metadata for the 3D model that will help programmers place trees in a landscape that mimics the real world. It is the tree and ecosystem in one, the forest and the trees.
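
The article doesn’t detail that processing step, but one minimal way to turn a LiDAR flight into placement metadata is to grid the returns and read canopy height per cell, roughly as in the sketch below. The file name and one-meter grid size are assumptions, and laspy is just one common point-cloud reader.

```python
# Sketch: derive a coarse canopy-height grid from a drone LiDAR point cloud.
# The .las file name and the 1 m grid size are hypothetical.
import numpy as np
import laspy

las = laspy.read("sandhills_flight_01.las")
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

cell = 1.0                                   # grid resolution in meters
col = ((x - x.min()) / cell).astype(int)
row = ((y - y.min()) / cell).astype(int)
shape = (row.max() + 1, col.max() + 1)

ground = np.full(shape, np.inf)              # lowest return per cell ~ ground
canopy = np.full(shape, -np.inf)             # highest return per cell ~ canopy
np.minimum.at(ground, (row, col), z)
np.maximum.at(canopy, (row, col), z)

# Canopy height per cell; cells with no returns stay NaN.
height = np.where(np.isfinite(ground), canopy - ground, np.nan)
print("Tallest canopy in this flight: %.1f m" % np.nanmax(height))
```

A grid like this tells the engine where the tallest, oldest trees belong, where seedlings do, and where the open wiregrass gaps should stay treeless.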

Virtual Reality, a Tool for Conservation

Improvements in technology are prompting researchers to use VR to help stakeholders understand our impact on the natural world. Scientists have modeled conservation impacts, evaluated distant species like the endangered jaguars of the Peruvian Amazon, and visualized future ecosystems like coral reefs.

Scientifically accurate 3D modeling opens up new opportunities for research because scientists can manipulate the virtual world endlessly, without consequence. They could create future ecosystems, then wipe out an endangered species, introduce a conservation effort, and check the results — all within a virtual world. It is like a conservation video game whose results can be used to understand real ecosystems.

(Embedded 3D model: “Model 46A – Green sea turtle with shark bite” by DigitalLife3D on Sketchfab)

At Stanford’s Virtual Human Interaction Lab (VHIL), researchers study the impact VR can have on learning, behavior, and conservation. One study found that the increased movement and interaction within a virtual world increased knowledge retention among viewers. Another study compared VR to print media about deforestation and tree conservation and found that experiencing the VR media led participants to consume 20% less paper than participants who read a print description of tree cutting. In other words, VR changed behavior more effectively than reading facts and figures did.

Anna Carolina Muller Queiroz, a cognitive psychologist at the VHIL, says that the critical ingredient to VR is that “your brain understands the virtual world as real and responds accordingly.”

Géraldine Fauville, a postdoctoral researcher in Education at VHIL, adds, “The easiest way for people to understand what VR can do is to put them in VR.” She describes a simple “plank experience,” which they often give people who tour the lab. Participants put on a VR headset and are transported to a virtual representation of the same room — with one difference: the floor falls away below them. Now they are standing on a tiny plank and asked to walk across it. Their hearts race, hands sweat, and, despite knowing they are in a safe environment with colleagues, most people can barely take the first step.

“I think that’s why it’s important to have a VR experience, like with the trees that look real, because even though you very well know that you are in the virtual world somewhere, your body treats it as a real experience,” says Fauville. “I would personally love to walk freely in a forest that I’ve never experienced before or will never. I think the idea is brilliant.”

“Digital twins” are realistic simulations that behave like the real world, and they can allow scientists to conduct research they otherwise couldn’t. Test-driving an autonomous vehicle, for example, could put researchers in danger, but not when the test drive happens in a simulation.

Digital twins are only effective for conservation if the 3D model looks and behaves exactly like its real-world counterpart. The Digital Life Project is using software and machine learning to reconstruct the movement of the animals they model and embedding them with metadata — information like where the species is found, or the animal’s sex and age.

“Metadata means a lot. Because if you don’t have that metadata then scientists can’t really use (the 3D model),” Irschick says.
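
The article doesn’t spell out a schema, but the kind of metadata Irschick describes can travel alongside a model as a small “sidecar” file that any researcher or game engine can read. The field names and values below are illustrative only.

```python
# Sketch of a metadata sidecar for a 3D model, in the spirit Irschick
# describes. Field names and values are illustrative, not an official schema.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class ModelMetadata:
    species: str
    common_name: str
    life_stage: str                  # e.g., "seedling", "sapling", "mature"
    location: str                    # where the reference individual grows
    behaviors: dict = field(default_factory=dict)  # measurable behaviors

longleaf_meta = ModelMetadata(
    species="Pinus palustris",
    common_name="longleaf pine",
    life_stage="mature",
    location="Sandhills region, North Carolina, USA",
    behaviors={"cones_open_when_dry": True, "fire_adapted": True},
)

# Save the metadata next to the mesh so anyone loading the model gets context.
with open("longleaf_mature.metadata.json", "w") as f:
    json.dump(asdict(longleaf_meta), f, indent=2)
```

Without a record like this, the model is just geometry; with it, a researcher knows which species, place, and life stage the tree represents.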

“You need the tree to be the way it is in real life. Otherwise, it is useless,” adds Rowe. “Maybe one of the reasons we can’t figure out how to be a more sustainable, cooperative species on this planet is because we can’t imagine it. If we could imagine what it was like in VR and say, ‘Look, this is pixel for pixel the real world, and we’ve proved that if we do this type of conservation, we’ll be sustainable.’ Would we then get traction from policymakers?”

To Know a VR Forest Is to Know the Real One

Back in the forest, Moore explains that longleaf pine ecosystems thrive on controlled burns, which clear the undergrowth for new trees to sprout. The trees around them are bare on the bottom half, and branching at the top — a sign that the forest was burned recently, pruning the trees and clearing the way for new seedlings to grow. Moore explains that when people on the eastern seaboard stopped managing the forest with controlled burns, the whole system suffered.

“One of our great needs is having foresters who understand longleaf and how that forest can be managed. To actually do that, through this (VR) tool — to say, ‘Let’s put a fire here and see what happens.’ I can really see it helping tremendously,” says Moore. “I could see it being used to help people understand how to manage longleaf. And that is one of our great lacks right now.”

At the end of the six-hour walk, Rice grabs a pinecone before leaving the forest.

“I wanted to make a model of it and make sure I understood how it physically opens up,” he says, adding that the project inspired him to pay more attention to nature.

At home, he watches the pinecone open in dry air and close tightly in a cup of water — subtle features he wants to embed in the 3D rendering — and another sign that the longleaf pine is anything but “just a pine tree.”
