
NeRF stands for Neural Radiance Fields, a type of artificial neural network used to create 3D models from a set of 2D images. It works by learning the view-dependent color and the volumetric density at every point in a scene, allowing photorealistic images to be synthesized from new viewpoints. NeRF has become a popular technique in computer vision and graphics for tasks such as 3D reconstruction and virtual reality.
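To make "color and density at every point" concrete: in the original NeRF formulation (Mildenhall et al., 2020), the color seen along a camera ray r(t) = o + td is accumulated from the learned density σ and view-dependent color c by the standard volume rendering integral:

```latex
% Expected color of camera ray r(t) = o + t*d between near/far bounds t_n, t_f:
C(\mathbf{r}) = \int_{t_n}^{t_f} T(t)\,\sigma(\mathbf{r}(t))\,\mathbf{c}(\mathbf{r}(t), \mathbf{d})\,dt,
\qquad T(t) = \exp\!\left(-\int_{t_n}^{t} \sigma(\mathbf{r}(s))\,ds\right).
```

T(t) is the transmittance, i.e., how much of the ray survives to depth t without being absorbed; this is why dense, semi-transparent tangles like foliage are hard for the representation to resolve cleanly.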

Based on this, I think I see what’s going to happen with foliage in the next year or two. To any MIT student or developer reading this, here’s your challenge. What we need is something like LeafSnap, which is what I used earlier to identify the DSLR images of local leaves I gathered. The Luma AI app will, in addition to identifying leaves, detect which zones of the plant you’re NeRF scanning would be better rendered procedurally, and then, armed with the leaf type and the capabilities we see in Grove3D or Houdini Bushify, replace the NeRF geometry of the scanned foliage with procedurally generated vines and leaf clusters. Quixel, you should get on that.
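Nothing like this ships today, so treat the following as a hypothetical sketch of that three-step pipeline. Every name in it (classify_leaf_species, segment_foliage_regions, generate_procedural_cluster, FoliageRegion) is invented for illustration; none of it is an existing Luma AI, Grove3D, Bushify, or Quixel API.

```python
"""Hypothetical foliage-replacement pipeline, sketched from the paragraph above.

The three steps: (1) identify the leaf species (the LeafSnap role),
(2) find which regions of the scan are foliage and reconstructed poorly,
(3) swap those regions for procedural vines and leaf clusters
(the Grove3D / Houdini Bushify role). All APIs are placeholders.
"""
from dataclasses import dataclass


@dataclass
class FoliageRegion:
    bounds: tuple                # axis-aligned bounding box, (min_xyz, max_xyz)
    reconstruction_error: float  # how badly the NeRF/splat geometry resolved here


def classify_leaf_species(photos: list[str]) -> str:
    """Placeholder for a LeafSnap-style classifier over capture frames."""
    raise NotImplementedError


def segment_foliage_regions(scan_path: str) -> list[FoliageRegion]:
    """Placeholder: detect zones of the scan that are foliage and noisy."""
    raise NotImplementedError


def generate_procedural_cluster(species: str, region: FoliageRegion):
    """Placeholder for a Grove3D/Bushify-style procedural leaf generator."""
    raise NotImplementedError


def replace_foliage(scan_path: str, photos: list[str], error_threshold: float = 0.5):
    """Replace badly reconstructed foliage with procedural stand-ins."""
    species = classify_leaf_species(photos)
    for region in segment_foliage_regions(scan_path):
        # Only swap geometry where the volumetric capture actually failed;
        # well-resolved trunks, rocks, and ground stay as scanned.
        if region.reconstruction_error > error_threshold:
            cluster = generate_procedural_cluster(species, region)
            # ...then delete the scanned geometry inside region.bounds and
            # anchor `cluster` in its place (an engine-specific step).
```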

This past summer, I captured a series of Luma AI scans in the thick woods a visitor encounters after walking through the main Visitor Center building and descending the stairs. Using the new Luma AI plugin for Unreal, it is possible to pull these scans into a scene, and of course to retexture and retopologize the objects you create.
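The Luma plugin has its own in-editor workflow, so this is not its API. But if you export a scan as a conventional mesh (FBX, glTF) and want to script bringing it into a project, Unreal's stock editor Python API handles it; here is a generic sketch, with placeholder paths:

```python
# Generic Unreal Editor Python import of an exported scan mesh.
# Run from the editor's Python console; both paths are placeholders.
import unreal

task = unreal.AssetImportTask()
task.filename = "C:/scans/garden_woods.fbx"  # placeholder: mesh exported from a scan
task.destination_path = "/Game/Scans"        # placeholder content-browser folder
task.automated = True                        # suppress interactive import dialogs
task.save = True                             # save the imported asset to disk

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```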

I have not yet implemented this, because foliage is notoriously difficult to render well in photogrammetry and NeRF, and presumably in Gaussian splatting, the new kid on the block. It isn’t that you can’t individually capture batches of usable leaves; it’s that I don’t see how you build a scene that lets you experience the subsurface scattering and the completeness of each leaf without an extra step of interpolating and replacing certain kinds of data.
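Subsurface scattering is exactly what a baked radiance field can’t give you: the captured color is frozen under the capture-time lighting, while a real leaf glows when lit from behind. A procedural leaf asset can carry a thin-translucency material instead. Below is a minimal sketch of the common "diffuse plus back-transmission" approximation used for thin foliage; the parameter names and defaults are mine, not any engine’s.

```python
import numpy as np


def shade_leaf(normal, light_dir, view_dir,
               albedo, translucency=0.6, distortion=0.3, power=4.0):
    """Minimal thin-foliage shading: Lambert front lighting plus a
    view-dependent transmission lobe for light passing through the leaf.
    Vectors are unit-length numpy arrays; parameters are illustrative."""
    # Standard front-side diffuse term.
    diffuse = max(np.dot(normal, light_dir), 0.0)

    # Back-transmission: light leaking through the thin surface toward the
    # viewer. Bending the light direction by the normal ("distortion")
    # fakes how the leaf's interior scatters the transmitted light.
    shifted = light_dir + normal * distortion
    shifted /= np.linalg.norm(shifted)
    transmission = max(np.dot(view_dir, -shifted), 0.0) ** power

    return albedo * (diffuse + translucency * transmission)


# Example: a leaf lit squarely from behind gets no diffuse light, but still
# picks up color through the transmission lobe (about 0.6 * albedo here).
color = shade_leaf(normal=np.array([0.0, 0.0, 1.0]),
                   light_dir=np.array([0.0, 0.0, -1.0]),
                   view_dir=np.array([0.0, 0.0, 1.0]),
                   albedo=np.array([0.1, 0.5, 0.1]))
```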

Anyway, I gathered a series of these scans midsummer to experiment with. As with everything in this journey, the future is wide open, and I am eager to keep pushing this project to see how efficiently we can create these kinds of spaces and then use them for immersive experiences in VR.