I need to catch up with the director. My progress has stalled, my enthusiasm sapped by a crop of creative and technical challenges I haven't broken through. I've suggested Monday, which gives me today and tomorrow to actually have something to show.
Let's break it down:
What do we need?
A large terrain, roughly 30 km x 30 km, showing the dry canyon/broken ground where the next scene of the episode takes place. Surrounding the broken ground are mountains, and in the distance there is an island on a lake. There's a city on the island from which we can see smoke rising.

And why is that hard?
It's a huge terrain viewed from high altitude, so you see far more of it than you would from on or near the ground, and getting enough detail that looks right is really hard. I've used World Creator to sculpt elements and patch them together. That works great so long as the different elements are scaled consistently, but I've found that some elements don't align with the erosion and flow patterns.

Once a terrain is created, getting ground patterns through a scatter system is not really feasible at this scale. Scatter systems use instancing, so they can apply thousands of grass clumps using only one 3D model, which is super memory-efficient. However, the number of instances you can apply to a terrain is not unlimited. At some point, your viewport will buckle and probably crash Blender. Even if you hide the scatter in the viewport and view elements selectively, eventually Blender will fall over trying to build the shopping list of elements to feed to the rendering engine. In a big scene viewed from the ground, you can scatter distant forests because trees more than five miles away are occluded by hills; but when you are at 20,000 feet, there's not much blocking your view of hundreds of square miles of trees. You need to fake it!
Things I could do:
1) Use Render Layers to break the scene up. The output needs to be reassembled in the compositor, but this approach lets you render the scene in pieces, using fewer resources each time and allowing you to fix things selectively. This may require the terrain to be broken into sections.

2) Use shader tricks instead of scatters. When you have millions of trees and bushes to apply to a vast landscape, it may be better to use trickery.
*(Image: test terrain)*
Below is practical, performance-focused guidance for using Blender Geometry Nodes to create plants and vegetation that scale to very large terrains, while avoiding classic scatter-system bottlenecks. Where possible, it separates what is explicitly supported by documented behaviour from expert recommendation.
1. Think “Hierarchical Instancing”, Not “Scatter Everything Flat”
What the docs explicitly confirm
Blender Geometry Nodes support nested instancing, where instances can themselves contain other instances, and only unique meshes are actually stored in memory. This dramatically improves performance for large scenes. [docs.blender.org]
However:
Blender currently supports only 8 levels of nested instancing for viewport and rendering; deeper trees must be realized. [docs.blender.org]
Recommended plant structure (expert guidance)
Build vegetation as multi-level instance trees:
```
Terrain
└── Biome instance (forest / grassland)
    └── Species instance (pine, fern, grass type)
        └── Variation instance (size/shape seed)
            └── Mesh (leaf, blade, branch)
```
This ensures:

- One mesh processed once, reused millions of times
- GPU memory stays stable even at continental scale
- Wind animation and transforms stay cheap

Avoid directly instancing final meshes at terrain level.
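The arithmetic behind this hierarchy can be sketched in a few lines of Python (a toy model, not the Blender API; the level counts below are invented purely for illustration):

```python
# Toy model of a nested instance tree: each level only stores
# references and transforms, so final placements multiply while
# unique meshes in memory stay tiny.

def total_placements(levels: list[int]) -> int:
    """Final mesh placements = product of instance counts per level."""
    n = 1
    for count in levels:
        n *= count
    return n

# 4 biome tiles, 6 species each, 10 variations each, 5,000 scatter
# points per variation:
print(total_placements([4, 6, 10, 5_000]))  # 1,200,000 placements
# ...yet only 6 species x 10 variations = 60 unique meshes are stored.
```

The point of the model: placement counts explode multiplicatively, but memory grows only with the leaf meshes.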
2. Never Realize Instances Unless You Absolutely Must
What the docs explicitly confirm
Realize Instances converts instanced geometry into unique geometry.
Once realized, every node downstream processes all geometry individually, increasing memory and CPU cost significantly. [docs.blender.org]
Best practice
- Keep instances unrealized all the way to the final output
- Only realize:
  - Meshes needing deformation (cloth, sculpting)
  - Export pipelines that cannot handle instances

Leaves, grass, and small debris should never be realized.
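A rough back-of-the-envelope model (pure Python, with a made-up per-vertex byte cost) shows why realizing is so expensive:

```python
# Rough per-vertex cost: position + normal + UV, assumed 36 bytes.
# This is an illustrative estimate, not a measured Blender figure.
VERTEX_BYTES = 36

def realized_cost(mesh_verts: int, copies: int) -> int:
    """After Realize Instances, every copy stores its own vertices."""
    return mesh_verts * copies * VERTEX_BYTES

def instanced_cost(mesh_verts: int, copies: int,
                   transform_bytes: int = 64) -> int:
    """Unrealized: one shared mesh plus a 4x4 matrix per instance."""
    return mesh_verts * VERTEX_BYTES + copies * transform_bytes

# A 1,000-vertex pine tree, one million instances:
print(realized_cost(1_000, 1_000_000))   # 36000000000 (~36 GB)
print(instanced_cost(1_000, 1_000_000))  # 64036000   (~64 MB)
```

Three orders of magnitude separate the two, which is why realizing a forest kills a scene.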
3. Use Camera‑Relative Density (Not Global Density)
What community sources explicitly note
Large Geometry Nodes setups slow down mainly due to:

- Huge point counts
- Viewport evaluating everything at once
- Memory spikes proportional to terrain size [blenderartists.org]
Recommended solution
Use distance-based falloff driven by camera position:

- High density near camera
- Medium density in mid-range
- Very low or zero density beyond

This keeps:

- Viewport responsive
- Render performance stable
This approach mirrors built‑in LOD systems used by procedural terrain tools that emphasize automatic level-of-detail control. [blenderkit.com]
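In a node graph this would typically be a distance from the camera fed through a Map Range node into the Density input of Distribute Points on Faces. The falloff itself is simple enough to sketch in Python (the near/far values are placeholders to tune per shot):

```python
import math

def density_factor(point, camera, near=100.0, far=5_000.0) -> float:
    """0..1 scatter-density multiplier from camera distance.

    Full density inside `near`, linear falloff to zero at `far`.
    Distances are in scene units; values here are arbitrary examples.
    """
    d = math.dist(point, camera)
    if d <= near:
        return 1.0
    if d >= far:
        return 0.0
    return 1.0 - (d - near) / (far - near)

# A point 2,550 units from the camera keeps half its density:
print(density_factor((2_550.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # 0.5
```

Multiplying the base density by this factor before distribution means distant ground simply never generates points, instead of generating and then hiding them.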
4. Avoid Alpha Planes for Vegetation in Cycles
What’s explicitly stated
Community reports confirm that alpha-textured planes (grass cards) can:

- Hurt Cycles performance
- Require higher ray depth
- Cause rendering artifacts [blenderartists.org]
Recommended approach
- Prefer low-poly meshes over alpha cards
- Use instances to compensate for higher vertex counts
- Use opacity only where unavoidable (e.g., distant impostors)
5. Partition Terrains into Tiles, Not One Mega‑Mesh
What performance discussions show
Large single meshes combined with Geometry Nodes cause:

- Long evaluation pauses
- RAM saturation during viewport edits [blenderartists.org]
Proven scalable approach
- Split terrain into tiles/chunks
- Give each tile its own Geometry Nodes modifier
- Use world coordinates to keep continuity

This allows:

- Partial evaluation
- Streaming-like behavior
- Easier debugging
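The two pieces of bookkeeping a tiled setup needs — which tile owns a point, and a seed that does not change when tile boundaries move — can be sketched as follows (the tile size and hash scheme are my own example choices, not a Blender convention):

```python
import hashlib

TILE_SIZE = 1_000.0  # metres per tile; an assumption for this sketch

def tile_index(x: float, y: float) -> tuple[int, int]:
    """Map a world-space position to the tile that owns it."""
    return (int(x // TILE_SIZE), int(y // TILE_SIZE))

def scatter_seed(x: float, y: float) -> int:
    """Deterministic seed derived from WORLD coordinates, so the
    scatter pattern stays continuous across tile borders."""
    key = f"{x:.3f},{y:.3f}".encode()
    return int.from_bytes(hashlib.blake2b(key, digest_size=4).digest(), "big")

# Points just either side of a border land in different tiles,
# but each point's randomness depends only on its world position:
print(tile_index(999.999, 0.0), tile_index(1000.001, 0.0))  # (0, 0) (1, 0)
```

Seeding from world coordinates rather than per-tile local coordinates is what prevents visible seams in the vegetation at tile edges.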
6. Limit Geometry Node Evaluation in the Viewport
Explicitly documented workaround
Blender provides an Is Viewport field input that allows different values for viewport vs render evaluation, useful for density reduction. [blenderartists.org]
Practical setup
- Viewport density: low
- Render density: full
- Switch node controlled by Is Viewport
This alone can turn an unusable scene into an interactive one.
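The logic of the switch is trivial and can be modelled outside Blender; in the node graph it is an Is Viewport input driving a Switch node on the density value. The 10% viewport fraction below is an arbitrary starting point:

```python
def effective_density(full_density: float, is_viewport: bool,
                      viewport_fraction: float = 0.1) -> float:
    """Model an Is Viewport -> Switch chain feeding a density socket:
    the viewport evaluates a fraction of the points, renders use all."""
    return full_density * viewport_fraction if is_viewport else full_density

print(effective_density(500.0, is_viewport=True))   # viewport: 50.0
print(effective_density(500.0, is_viewport=False))  # render: 500.0
```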
7. Prefer Data‑Driven Variation Over Unique Meshes
What Geometry Nodes excels at
- Randomized transforms
- Attribute-driven material variation
- Curve-based procedural growth [youtube.com], [entagma.com]
Recommendation
Instead of 20 different plant meshes:

- 1 mesh
- 1 geometry node graph
- Variation via:
  - Scale randomness
  - Rotation noise
  - Color attributes
  - Curve-driven deformation
This keeps instance counts high but memory low.
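A sketch of seed-driven variation in Python (in nodes this is typically Random Value nodes keyed on instance index; the ranges here are arbitrary examples, not Blender defaults):

```python
import random

def instance_variation(seed: int) -> dict:
    """Per-instance variation attributes for one shared plant mesh.
    Same seed -> same plant, so results are stable between renders."""
    rng = random.Random(seed)
    return {
        "scale": rng.uniform(0.7, 1.3),          # size randomness
        "rotation_z": rng.uniform(0.0, 6.283),   # radians, full turn
        "color_shift": rng.uniform(-0.1, 0.1),   # attribute for the shader
    }

a = instance_variation(42)
b = instance_variation(42)
print(a == b)  # True: deterministic per seed
```

Because variation lives in a few floats per instance rather than in unique meshes, adding more "species-like" diversity costs almost nothing.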
8. Know When Geometry Nodes Beat Scatter — and When They Don’t
Explicitly documented limitation
Developer discussions show Geometry Nodes instancing still has higher viewport overhead than legacy particle instancing in some cases. [developer.blender.org]
Practical takeaway
Use Geometry Nodes when you need:

- Procedural logic
- Biomes
- Growth systems
- Terrain awareness

If you only need:

- Simple grass scattering
- No rules, no variation

Legacy systems or specialized tools may still be lighter.
Final Mental Model for Huge Worlds
Think of Geometry Nodes vegetation as:
“Procedural data graphs generating instance trees, not geometry.”
If:

- You see millions of vertices → something is wrong
- You see only a handful of meshes instanced millions of times → you're doing it right
