In actuality, that last part turned out to be the true experiment. While I work on some detailed posts describing exactly how it all works, here’s a brief run-down of the key components:
Generating a Heightmap
- An infinitely tiling heightmap is generated using an algorithm similar to Perlin noise, with modifications to achieve a more true-to-life hillscape.
- Alea is used as the PRNG so that generation is reproducible.
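To make the idea concrete, here's a minimal sketch of a seamlessly tiling value-noise heightmap. It's not the actual implementation (which is Perlin-like with extra shaping passes), and a tiny inline PRNG stands in for Alea so the sample is self-contained; the `period` parameter and helper names are mine.

```javascript
// Tiny seeded PRNG (mulberry32-style) standing in for Alea.
function makePrng(seed) {
  let s = seed >>> 0;
  return function () {
    s = (s + 0x6D2B79F5) >>> 0;
    let t = Math.imul(s ^ (s >>> 15), 1 | s);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function makeHeightmap(seed, period) {
  // One random lattice value per grid cell; indices wrap every `period`
  // cells, which is what makes the map tile seamlessly.
  const prng = makePrng(seed);
  const lattice = [];
  for (let i = 0; i < period * period; i++) lattice.push(prng());
  const at = (ix, iy) =>
    lattice[(((iy % period) + period) % period) * period +
            (((ix % period) + period) % period)];
  const fade = (t) => t * t * (3 - 2 * t); // smoothstep easing

  return function height(x, y) {
    const ix = Math.floor(x), iy = Math.floor(y);
    const fx = fade(x - ix), fy = fade(y - iy);
    const top = at(ix, iy) * (1 - fx) + at(ix + 1, iy) * fx;
    const bot = at(ix, iy + 1) * (1 - fx) + at(ix + 1, iy + 1) * fx;
    return top * (1 - fy) + bot * fy; // bilinear blend of lattice values
  };
}
```

Seeding the PRNG is what makes a whole world reproducible from a single number: the same seed always fills the lattice with the same values.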
Routing a Road
- A starting point is chosen somewhere in the world that’s not too steep (and not too deep). This is the first point of the road’s midline.
- A direction is chosen, and the surrounding heightmap is tested to assess the gradient both laterally and longitudinally.
- The midline then moves 10 metres in the direction that best minimises steepness. Points are encoded in a doubly-linked list, each annotated with metadata such as gradient, road width, and curvature.
- This repeats indefinitely (bounded by a 15km horizon from the vehicle position), taking care never to self-intersect (spoiler: this is the part that took longest to solve).
- The height of the midline points is retroactively smoothed with a 9-point window to avoid sudden sharp changes in elevation.
- Over a short horizon, the coarse 10m midline is annotated with smooth points at a 1m resolution using a quadratic Bézier curve.
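One routing step might look something like the sketch below. The candidate-heading range, the turn penalty, and the function names are all my guesses rather than the actual implementation; the real version also records gradient, width, and curvature metadata on each linked-list node.

```javascript
const STEP = 10; // metres between coarse midline points

// Probe several candidate headings, score each by the terrain's rise over
// the step ahead, and advance 10 m along the gentlest one.
function nextMidlinePoint(height, point, heading) {
  let best = null;
  // Probe headings within roughly ±30° of the current direction (assumed range).
  for (let d = -0.5; d <= 0.5; d += 0.1) {
    const a = heading + d;
    const nx = point.x + Math.cos(a) * STEP;
    const ny = point.y + Math.sin(a) * STEP;
    // Longitudinal steepness: height change along the step.
    const rise = Math.abs(height(nx, ny) - height(point.x, point.y));
    // Small penalty on turning so the road prefers to continue straight.
    const cost = rise + 0.05 * Math.abs(d);
    if (!best || cost < best.cost) best = { x: nx, y: ny, heading: a, cost };
  }
  return best; // would be appended as a new node of the doubly-linked list
}
```

In the full version each new point also checks lateral gradient and the self-intersection constraint before being accepted.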
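The two refinement passes can be sketched as follows. This is illustrative rather than the actual code: the window handling at the ends of the line, and the choice of Bézier endpoints and control points, are assumptions.

```javascript
// 9-point moving average over midline heights, shrinking the window at
// the ends of the line so every point still gets a valid average.
function smoothHeights(heights, window = 9) {
  const half = Math.floor(window / 2);
  return heights.map((_, i) => {
    let sum = 0, n = 0;
    for (let j = i - half; j <= i + half; j++) {
      if (j >= 0 && j < heights.length) { sum += heights[j]; n++; }
    }
    return sum / n;
  });
}

// Quadratic Bézier evaluation; sampling t at ~1 m intervals between the
// coarse 10 m points yields the fine-resolution midline.
function bezier2(p0, p1, p2, t) {
  const u = 1 - t;
  return {
    x: u * u * p0.x + 2 * u * t * p1.x + t * t * p2.x,
    y: u * u * p0.y + 2 * u * t * p1.y + t * t * p2.y,
  };
}
```

A common trick (assumed here) is to use each coarse vertex as the control point and the midpoints of its neighbouring segments as endpoints, which keeps consecutive curves tangent-continuous.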
Rendering the Environment
- A 5x5 grid of large, coarse meshes (I chose a 10m resolution, with 1km-square tiles at max view distance) is used to render the large-scale scene. This is the “far grid”.
- In proximity to the road, a 5x5 grid of finer meshes, each 10m square to tessellate with the far grid meshes, is marched along the road midline to form a detailed “corridor”. This is the “near grid”, and is generated ahead to a fixed horizon from the vehicle position.
- Simultaneously, the now-overlapped vertices of the far grid are crudely “hidden” by sinking them a few metres below the surface.
- Using proximity to the road midline, the near grid heights are interpolated between the height of the road and the height of the underlying environment for a seamless transition.
- An additional 3x3 grid of yet-finer tiles, also 10m square, is generated to a shorter horizon for extra detail. These overlap (and displace) the medium-detail tiles from the prior step.
- The road is rendered as a simple rectangular mesh of 3 high-detail chunks and 9 low-detail chunks, each 100m in length, cycled out as the vehicle progresses.
- The ground texture uses world-coordinate UVs, blended with Perlin noise for variation in grass colours.
- A cliff face texture is blended, with vertex displacement, based on steepness.
- I accepted defeat in implementing dynamic shadow mapping, and instead have all lighting be top-down for simplicity. A dark texture is applied under the foliage map to give the impression of tree shadows.
- I re-used the heightmap algorithm with slightly tweaked parameters to produce a tree map that would vaguely follow elevation, giving more trees around lakes.
- All foliage is made up of simple sprites and instanced geometries, with multiple variations stored in one texture and sampled using noise in the vertex shader.
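The near-grid height interpolation might look like this sketch. The half-width and falloff constants are illustrative, not the real values:

```javascript
// Blend a near-grid vertex between road height and terrain height based on
// its lateral distance from the road midline, for a seamless shoulder.
function blendedHeight(distToMidline, roadHeight, terrainHeight,
                       roadHalfWidth = 4, falloff = 12) {
  // 0 on the road surface, 1 once fully clear of the falloff band.
  const t = Math.min(Math.max((distToMidline - roadHalfWidth) / falloff, 0), 1);
  const s = t * t * (3 - 2 * t); // smoothstep for a soft transition
  return roadHeight * (1 - s) + terrainHeight * s;
}
```

Anything within the road's half-width takes the road's height exactly, so the tarmac never clips through the terrain underneath it.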
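For the far grid, a fixed pool of 25 meshes can simply be reassigned to whichever world cells surround the vehicle, so nothing is ever created or destroyed as you drive. A sketch, with the tile size taken from the 1km figure above and everything else assumed:

```javascript
const TILE = 1000; // metres per far-grid tile

// Return the 25 world-space cell origins the 5x5 far grid should cover,
// centred on the cell nearest the vehicle.
function farGridCells(vehicleX, vehicleY) {
  const cx = Math.round(vehicleX / TILE);
  const cy = Math.round(vehicleY / TILE);
  const cells = [];
  for (let dy = -2; dy <= 2; dy++) {
    for (let dx = -2; dx <= 2; dx++) {
      cells.push({ x: (cx + dx) * TILE, y: (cy + dy) * TILE });
    }
  }
  return cells;
}
```

Each frame (or each time the centre cell changes), every mesh is moved to its assigned cell and its vertices re-sampled from the heightmap.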
Simulating the Vehicle
- Each wheel has its dynamics calculated independently, using the usual kinematic equations around gravity, surface friction, and ground normal. The chassis position is then resolved from the wheel states.
- Walls exist as annotations in the road midline — a simple float indicating the distance of the wall from the midline. Collisions then become a distance check.
- For now, collisions with anything else are ignored; there’s nothing zen about crumpling around a tree…
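A heavily simplified sketch of one wheel's suspension response, to give a flavour of the per-wheel calculation; all constants are invented and the real dynamics are much richer:

```javascript
// Spring-damper force for one wheel, projected onto the vertical component
// of the ground normal. Clamped at zero: the ground can push, never pull.
function wheelSupportForce(compression, compressionRate, groundNormalY) {
  const STIFFNESS = 40000; // N/m, assumed
  const DAMPING = 3000;    // N*s/m, assumed
  const along = Math.max(STIFFNESS * compression + DAMPING * compressionRate, 0);
  return along * groundNormalY;
}
```

Summing these forces (plus gravity and friction) over the four wheels is what lets the chassis pose be resolved afterwards.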
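The wall check described above reduces to a one-line comparison. A sketch, with the vehicle half-width parameter being my own addition:

```javascript
// Each midline point stores the wall's offset from the midline, so a
// collision test is just comparing the vehicle's lateral offset to it.
function hitsWall(lateralOffset, wallDistance, vehicleHalfWidth = 1) {
  // lateralOffset: signed distance of the vehicle from the road midline.
  return Math.abs(lateralOffset) + vehicleHalfWidth >= wallDistance;
}
```

Because the vehicle's position is already tracked against the midline, no broad-phase collision system is needed at all.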
Optimising Performance
- The biggest saving came from merging the geometries of the near grid and carefully managing instance sizes to minimise draw calls.
- The vehicle’s progress is constantly tracked against the road midline, and environment visibility is checked against that. Objects behind the vehicle are despawned and pooled to be re-used.
- Because the midline’s path is known well in advance, future geometry and details like road signs can be processed long before they are displayed, allowing for gentler generation orchestration.
- A handful of library methods, such as bounding sphere or normal calculations, have been overridden or sidestepped where they can be solved during generation.
- Almost all memory is accounted for and re-used, though this is an area for improvement.
- Any situation which doesn’t strictly require an independent random (such as the rotation of a particular tree) samples from a once-generated pool.
- The last resort is to offer a range of quality settings to cater to different systems — such as by disabling foliage, or looking up height from the coarse far grid rather than a heightmap query.
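The despawn-and-reuse scheme is a classic object pool. A minimal sketch (API names are mine):

```javascript
// Objects behind the vehicle are released back to the pool and handed out
// again for newly generated scenery, instead of being garbage-collected.
class Pool {
  constructor(create) {
    this.create = create; // factory, used only when the pool is empty
    this.free = [];
  }
  acquire() {
    return this.free.length > 0 ? this.free.pop() : this.create();
  }
  release(obj) {
    this.free.push(obj);
  }
}
```

In steady-state driving the factory stops being called entirely: every tree or sign spawning ahead is one that just scrolled off behind.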
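The shared random pool mentioned above might be as simple as this; the pool size and indexing scheme are guesses:

```javascript
// Values that don't need independent randomness (like a tree's rotation)
// index into a table filled once at startup, avoiding per-object PRNG calls.
const RANDOM_POOL_SIZE = 1024;
const randomPool = Array.from({ length: RANDOM_POOL_SIZE }, Math.random);

function pooledRandom(index) {
  // Deterministic for a given index, e.g. an object's id or grid cell.
  return randomPool[((index % RANDOM_POOL_SIZE) + RANDOM_POOL_SIZE) % RANDOM_POOL_SIZE];
}
```

The result still looks random across the scene, but the per-object cost drops to a single array lookup.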
Let me know which elements I’ve skipped over here that you’d like to see covered more fully in the write-up. If you want to see the full shebang, you can follow me here or on Twitter.