Project case study

Realtime AR Grass Simulation with LiDAR

A high-fidelity AR grass simulation for iPad that uses GPU instancing and LiDAR-informed alignment to anchor dense realtime foliage to physical space.

Year

2025

Roles

XR Development · Graphics Programming

Platforms

iPad · AR


AR · Graphics · LiDAR
Realtime AR Grass Simulation with LiDAR

Narrative

Context, constraints, and execution

Each flagship project is framed as a concise case study: what needed to work, how the solution was approached, and what the final experience delivered.



Overview

This experiment focused on making dense, stylized vegetation feel grounded in the real world instead of floating as a visual effect. The brief was simple: render a lot of grass on an iPad, keep it lit and reactive, and make it sit convincingly on top of scanned physical geometry.

Problem

Mobile AR scenes rarely leave room for dense geometry, dynamic shading, and convincing world alignment at the same time.

Focus

Push visual density without abandoning realtime responsiveness on iPad-class hardware.

Result

A grass system that feels materially present rather than composited on top of the camera feed.

Challenge

The hard part was balancing fidelity and performance. Thousands of individual blades create a much better illusion than billboard-style patches, but they also multiply draw cost quickly. On top of that, AR content only looks believable when it respects the surfaces and orientation of the scanned space.

Approach

I treated the piece as a rendering study instead of a simple placement demo. The visual target was lush, readable grass with enough directional variation to avoid visible repetition, while the technical target was stable performance on a mobile device.

Implementation

Leveraging GPU instancing, I implemented a high-fidelity grass simulation in which every blade is rendered as a fully 3D object with dynamic lighting. Instancing keeps the draw overhead low enough that this level of detail runs on an iPad without compromising responsiveness. The simulation also aligns the grass with real-world geometry by integrating LiDAR scan data, so the virtual foliage sits accurately on physical surfaces rather than floating above them.
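The instancing idea can be sketched as a per-instance data buffer: one blade mesh is uploaded once, and each instance only carries a root position, yaw, and scale. The sketch below is an illustrative assumption, not the project's actual code; the struct layout, function name, and the tiny seeded RNG are all hypothetical.

```swift
import Foundation

// Hypothetical sketch of building per-instance data for a GPU-instanced
// grass draw call. On the GPU side, a single blade mesh would be drawn
// once with this buffer bound, one entry per blade.
struct GrassInstance {
    var x: Float, y: Float, z: Float  // blade root position
    var yaw: Float                    // rotation around the up axis, radians
    var scale: Float                  // per-blade height variation
}

// Minimal deterministic LCG so the sketch is reproducible.
struct LCG {
    var state: UInt64
    mutating func nextFloat() -> Float {  // uniform in [0, 1)
        state = state &* 6364136223846793005 &+ 1442695040888963407
        return Float(state >> 40) / Float(1 << 24)
    }
}

func makeGrassInstances(rows: Int, cols: Int, spacing: Float,
                        seed: UInt64 = 1) -> [GrassInstance] {
    var rng = LCG(state: seed)
    var instances: [GrassInstance] = []
    instances.reserveCapacity(rows * cols)
    for r in 0..<rows {
        for c in 0..<cols {
            // Grid placement with jitter breaks up visible repetition.
            let jx = (rng.nextFloat() - 0.5) * spacing
            let jz = (rng.nextFloat() - 0.5) * spacing
            instances.append(GrassInstance(
                x: Float(c) * spacing + jx,
                y: 0,  // later snapped to the LiDAR-scanned surface
                z: Float(r) * spacing + jz,
                yaw: rng.nextFloat() * 2 * Float.pi,
                scale: 0.8 + rng.nextFloat() * 0.4))
        }
    }
    return instances
}
```

Because only a handful of floats vary per blade, thousands of instances fit in one buffer and one draw call, which is what makes per-blade geometry viable on mobile hardware.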

Images can be used to create different patterns in the grass direction, with the alpha value determining the rotation of that “pixel” of grass. That gave the system an art-directable control layer instead of locking the look to purely procedural noise.
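The alpha-to-rotation mapping can be sketched as a simple texture lookup. This is a hedged illustration, assuming nearest-neighbour sampling and a linear alpha-to-yaw mapping; the function name and image representation are hypothetical, not the project's actual pipeline.

```swift
import Foundation

// Hypothetical sketch: sampling an artist-painted control image so the
// alpha channel steers grass direction. Alpha 0...1 maps linearly to a
// yaw of 0...2π for the blades covered by that texel.

/// `alphaMap` is row-major with values in 0...1; `u`/`v` are normalized
/// coordinates of a blade's root within the painted region.
func grassYaw(alphaMap: [[Float]], u: Float, v: Float) -> Float {
    let h = alphaMap.count, w = alphaMap[0].count
    // Nearest-neighbour sample, clamped to the image bounds.
    let col = min(w - 1, max(0, Int(u * Float(w))))
    let row = min(h - 1, max(0, Int(v * Float(h))))
    return alphaMap[row][col] * 2 * Float.pi
}
```

Under this mapping, an alpha of 0.25 turns a patch a quarter rotation, so a painted gradient produces a smooth swirl across the field instead of purely procedural noise.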

Media notes

The embedded demo is the clearest proof of the effect because it shows both density and anchoring in motion. More than a shader exercise, it demonstrates how much perceived realism comes from orientation, lighting, and placement discipline working together.

Outcome

The final result reads as a compact case study in mobile AR rendering: use instancing where density matters, use LiDAR where grounding matters, and keep enough authored control in the pipeline to shape the final motion and patterning.