From Garage to Browser: Forged.build and the WebGPU Revolution

The Forged site began with a simple question: what if a studio website could feel like a place? Not just a scroll of thumbnails or a polished homepage, but a world you could actually step into and explore.

When I started Forged, I knew I wanted to merge my two passions: cars and technology. The automotive world is full of spaces that radiate purpose—garages, machine shops, fabrication labs—but the digital industry rarely creates anything that feels the same way. The site became a way to connect those two worlds, an experiment in building a digital space that captures the spirit of the physical one.

Instead of marketing pages, we built rooms: a garage, a desk, a workbench, a testing space. Each one anchors part of the story: who we are, what we make, and how it’s built. The goal wasn’t to showcase the work; it was to create a workshop you’d actually want to step into.

From Garage to Browser

The visual language came straight from the real world. Think Han’s garage from Tokyo Drift for the vibe, crossed with the clean-meets-industrial aesthetic of Singer or Gunther Werks. The Forged space borrows that same balance of polish and grit: concrete floors, metal shelving, soft reflections on paint and glass.

Early on, I mocked up concepts using AI image generation — not as final art, but as a fast way to show what was in my head: the garage exterior, the door rolling open, the contrast between rough and refined. That shortcut let us iterate visually before a single polygon was modeled.

From there, we built the full environment in Blender, modeling each element, dressing the scene with props, and baking lighting passes to capture realistic falloff and shadow. The challenge was cohesion: multiple rooms forming one continuous workshop, all illuminated as if they belonged to the same physical space.

The Blender-to-WebGPU Workflow

Every scene in the Forged site starts in Blender but comes to life inside our custom WebGPU engine.

In Blender:

  • Model and layout each environment.
  • Assign base materials (normal and roughness maps where needed).
  • Bake lighting into high-resolution textures, packing as many objects per atlas as possible.

In WebGPU:

  • Import the geometry (GLB with Draco compression) and lightmaps (separate KTX2 files); a loading sketch follows this list.
  • Assign custom shaders through an in-engine GUI.
  • Tweak the material behavior per scene using shared shader modules — PBR, microfacet BRDF, and reflection blending.
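
Our engine’s importers aren’t worth reproducing here, but the same step can be sketched with three.js’s stock loaders, which handle identical formats. File names and decoder paths below are illustrative:

import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js';
import { KTX2Loader } from 'three/examples/jsm/loaders/KTX2Loader.js';

// Draco decoder for the compressed GLB geometry
const draco = new DRACOLoader().setDecoderPath('/draco/');
const gltfLoader = new GLTFLoader().setDRACOLoader(draco);

// Basis transcoder for the KTX2 lightmaps; `renderer` is the existing renderer instance
const ktx2Loader = new KTX2Loader().setTranscoderPath('/basis/').detectSupport(renderer);

// Geometry and lightmaps load in parallel, then get wired together when materials are assigned
const [gltf, lightmap] = await Promise.all([
  gltfLoader.loadAsync('garage.glb'),
  ktx2Loader.loadAsync('garage_lightmap.ktx2'),
]);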

We don’t use the strict glTF PBR spec. Each material is tuned manually to balance performance and realism — what I call part of the “fake tracing” pipeline. Some objects, like the car, run full PBR with roughness and metallic maps. Others, like walls or furniture, just use baked lighting plus microfacet scattering for a hint of motion in the highlights.
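
In engine terms, that tuning reduces to a handful of per-material switches exposed through the GUI. A simplified sketch (the field names are illustrative, not the engine’s actual schema):

// Hypothetical per-material settings behind the in-engine GUI
interface ForgedMaterialSettings {
  lightmap: GPUTexture;        // baked diffuse from Blender
  dynamicBlend: number;        // 0..1: how much dynamic specular sits on top of the bake
  fullPBR: boolean;            // true only for the car
  roughnessMap?: GPUTexture;   // used only when fullPBR is true
  metallicMap?: GPUTexture;
}

declare const wallAtlas: GPUTexture; // lightmap atlas loaded earlier

// Example: walls keep the bake and add just a whisper of dynamic specular
const wall: ForgedMaterialSettings = { lightmap: wallAtlas, dynamicBlend: 0.15, fullPBR: false };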

The car is the only object using true PBR; for a deep dive on rendering cars realistically, check out the Aura Color Explorer case study.

Blender gives us the precision of true raytraced lighting; WebGPU gives us real-time flexibility to make the space feel alive.

A simplified overview of the pipeline:

Blender → Bake Lighting → Compress (Draco + KTX2)
→ Import to WebGPU Engine → Assign Shaders → Tune Materials

Lighting the Workshop — The “Fake Tracing” Pipeline

From the start, the goal wasn’t just to make the Forged workshop look realistic, but to make it behave that way. You can fake almost everything on the web, but light is the hardest thing to fake convincingly.

I call it fake tracing. It’s a hybrid lighting approach: baked realism for global mood, and lightweight dynamic cues to keep the scene breathing. It isn’t physically correct; it’s perceptually correct.

Two Layers of Light

  1. Baked diffuse: All indirect light, color bleed, and ambient tone are baked in Blender.
    These maps handle the “world lighting”: the way the environment fills with soft reflections.
  2. Dynamic specular: A small real-time system inside the WebGPU engine handles only the highlights: the glints on metal, the roll-off across a leather chair, the shimmer on paint as you scroll.

Specular highlight exaggerated for demonstration.

Each material gets a simple blend control — a 0-to-1 value that decides how much the dynamic layer contributes on top of the baked base.

// Baked diffuse from the lightmap atlas
vec3 baked = texture(Lightmap, uv).rgb;
// Dynamic specular layer (F0 = base reflectance; see MicrofacetBRDF below)
vec3 spec = MicrofacetBRDF(normal, viewDir, lightDir, roughness, F0);
// uDynamicBlend (0..1) sets how much of the dynamic layer sits on top of the bake
vec3 color = mix(baked, baked + spec, uDynamicBlend);

Microfacet Distribution

The specular layer uses a standard microfacet BRDF distribution with a custom roughness curve that exaggerates grazing highlights. It’s not a full PBR stack; it’s stripped down for speed.

vec3 MicrofacetBRDF(vec3 N, vec3 V, vec3 L, float roughness, vec3 F0) {
    // Half vector
    vec3 H = normalize(V + L);

    float NdotV = max(dot(N, V), 0.0);
    float NdotL = max(dot(N, L), 0.0);
    float NdotH = max(dot(N, H), 0.0);
    float VdotH = max(dot(V, H), 0.0);

    // Avoid denormals / division issues
    const float EPS = 1e-5;

    // GGX normal distribution function (Trowbridge-Reitz)
    float a   = max(roughness, 0.001);     // perceptual roughness [0..1]
    float a2  = a * a;
    float d_denom = max((NdotH * NdotH) * (a2 - 1.0) + 1.0, EPS);
    float D = a2 / (3.14159265 * d_denom * d_denom);

    // Smith masking-shadowing with Schlick-G term
    float k = (a + 1.0);
    k = (k * k) / 8.0;
    float Gv = NdotV / max(NdotV * (1.0 - k) + k, EPS);
    float Gl = NdotL / max(NdotL * (1.0 - k) + k, EPS);
    float G = Gv * Gl;

    // Schlick Fresnel
    vec3  F = F0 + (1.0 - F0) * pow(1.0 - VdotH, 5.0);

    // Specular term
    vec3 spec = (D * G) * F / max(4.0 * NdotL * NdotV + EPS, EPS);

    // Cosine (NdotL) factor for light transport
    return spec * NdotL;
}

These highlights move with the camera, giving the illusion of depth and motion even when the baked light is static. It’s one of those tiny perceptual tricks that convinces your brain the world is real.

Why It Works

Fully baked scenes look beautiful until they move — then they feel dead. When light scatters and shifts just a little as you scroll, something fires in your brain: this is alive. That’s what fake tracing aims for.

Instead of chasing perfect physical accuracy, we chase that moment of disbelief where you stop thinking about shaders and just feel like you’re standing in a real room.

Scroll as a Directional Language

If lighting makes the Forged world believable, motion makes it feel tangible. Scroll is the only input every visitor shares, so instead of treating it as a trigger, we treated it as a language.

One Axis at a Time

Scroll isn’t a joystick. It’s a single-axis gesture, and the site respects that. Each scene moves in only one direction, either forward through space or downward through the structure, never both.

If you try to mix them, it feels wrong.

As soon as the camera curves or circles while tied to scroll, your brain expects game-style controls. By committing to one axis per section, the experience stays intuitive: the mouse wheel moves the world in a straight line, and your body instantly understands it.

The Cut

Transitions between spaces work like film edits. When you reach the stairs, for instance, the camera doesn’t climb them — it cuts to the new floor. It’s the same continuity trick you see in films or games: motion continues in one direction, but the geography resets.

That single cut keeps the rhythm smooth and reinforces the illusion that the whole site exists inside one continuous building.
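
Mechanically, the cut is just a snap: when scroll crosses a section boundary, the camera jumps to the next scene’s start pose instead of interpolating toward it. A minimal sketch (section heights and poses are illustrative, not the site’s actual values):

// Hypothetical section table: start height and scroll travel for each scene
const sections = [
  { floorY: 1.6, depth: 12 },  // garage, ground floor
  { floorY: 4.8, depth: 10 },  // upper floor, after the stairs
];

// scrollProgress: 0..1 progress through the whole page
function updateCamera(scrollProgress: number, camera: { position: { y: number; z: number } }) {
  const scaled = Math.min(scrollProgress, 0.9999) * sections.length;
  const index = Math.floor(scaled);
  const local = scaled - index;                        // 0..1 progress inside the current section
  // The cut: no blend between sections, just a new start pose
  camera.position.y = sections[index].floorY;
  camera.position.z = -local * sections[index].depth;  // then travel along one axis only
}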

Tuning the Rhythm

Getting the pacing right wasn’t about keyframes or storyboards; it was about distance.
Each scene has a defined height in view units — literally, how many screen-heights you scroll to travel through it.

Too fast? Add more view-heights. Too slow? Remove a few. After a handful of tweaks, the flow locked in.

// simplified scroll-to-camera mapping
// vh(n): helper that converts n% of the viewport height to pixels
// scrollProgress: 0..1 progress through the current section
const sectionHeight = vh(300); // 3 view heights per section
camera.position.z = scrollProgress * sectionHeight;

That’s it. No physics, no easing equations — just proportion. Once those proportions felt right, the site moved with its own gravity: heavy enough to feel physical, light enough to glide.

Post-Processing and Ambient Tricks

After the lighting and scroll systems were locked in, the final layer was about atmosphere — the finishing touches that make the world feel grounded in reality.

Tone Mapping and Camera Effects

A custom tone-mapper balances baked color with dynamic specular energy, followed by a soft vignette, a trace of RGB shift, and a hint of film grain. These small imperfections add a layer of analog warmth — the difference between something that looks rendered and something that feels shot.
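
The order of that stack matters: tone-map first, then the “lens” artifacts on top. A rough sketch of the chain (pass names and values are illustrative, not the site’s actual settings):

// Hypothetical post chain, applied in order every frame
const postChain = [
  { pass: 'tonemap',  exposure: 1.1 },    // balances baked color against dynamic specular energy
  { pass: 'vignette', strength: 0.25 },   // soft edge darkening
  { pass: 'rgbShift', amount: 0.0015 },   // a trace of chromatic offset
  { pass: 'grain',    amount: 0.04 },     // a hint of film grain
];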

Reflections and Occlusion

To keep the space connected, we added real-time reflections across the floor — a subtle mirror pass that anchors everything to the same plane. On top of that runs screen-space directional occlusion (SSDO), a lightweight shader that fakes ambient occlusion while also injecting a hint of light bounce.

This hybrid pass gives depth without killing performance — a little shadow under the tires, a glow around bright surfaces — the kind of detail your eye catches subconsciously.

SSDO is way too complex to cover here, but this is the paper that was referenced for our implementation.

Even with everything baked, light still needs to move. As you scroll, those fake reflections and occlusion passes shift just enough to make the space feel dynamic. It’s not physically correct, but perceptually it’s spot-on: the difference between watching a still render and standing inside one.

In the end, these post-processing tricks aren’t about decoration. They’re about deception — convincing your eyes that something impossible is happening right in a browser tab.

Performance Tuning

With so much happening in each scene (baked and dynamic lighting, reflections, occlusion, scroll-driven motion), keeping the site responsive on every device came down to experience, not automation.

After building real-time graphics for more than a decade, I’ve developed a sense of what each class of hardware can handle. Instead of complex benchmarks, we rely on intuition and lightweight detection. High-end GPUs get everything; mid-tier devices drop the heavier passes; older phones and budget PCs still run the world, just with fewer dynamic tricks.

The guiding principle is simple: smooth motion first, fidelity second. If performance dips, we sacrifice visual luxuries long before scroll fluidity. The camera should always feel weightless.

Tier | Key features
High | All effects, full reflections, dynamic specular, SSDO
Mid  | Half-res lightmaps, reduced reflection updates
Low  | Baked lighting only, no reflections or post-FX
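
The detection itself stays deliberately simple. A sketch of the kind of heuristic we mean (illustrative, not the site’s exact check):

// Hypothetical tier check: cheap signals only, no benchmarking
async function detectTier(): Promise<'high' | 'mid' | 'low'> {
  const adapter = await navigator.gpu?.requestAdapter();
  if (!adapter) return 'low';                              // no WebGPU: lightest path
  const cores = navigator.hardwareConcurrency ?? 4;
  const mobile = matchMedia('(pointer: coarse)').matches;  // rough proxy for phones and tablets
  if (!mobile && cores >= 8) return 'high';
  return mobile ? 'low' : 'mid';
}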

The result is performance that feels tuned by hand — like tuning an engine by ear rather than by numbers. You can hear when it’s running right, even if you can’t measure it.

Why WebGPU Changes Everything

The entire Forged site runs on WebGPU, and the biggest change from WebGL isn’t just theoretical—it’s physical.

In WebGL, you could hear when things got heavy. The fans would spin up. With WebGPU, they stay silent.

That quietness says everything about the difference in architecture. In WebGL, almost every off-screen effect—blurs, reflections, light passes—had to be processed through framebuffers, which meant treating every operation like a texture. It worked, but it came at a high cost.

With compute shaders and workgroups, WebGPU can perform those same calculations directly on the GPU with almost no detour. Tasks like light scattering, reflection prefiltering, and blur convolution now happen in parallel, without heating the machine.
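
For a sense of what that looks like, here is the shape of a single compute dispatch, roughly how a blur or prefilter pass runs. The WGSL body and bind group are assumed to exist; this isn’t our production pass:

// Sketch: a full-screen effect as a compute dispatch instead of a framebuffer blit.
// `blurWGSL` and `bindGroup` (source + destination textures) are assumed to exist.
function runBlurPass(device: GPUDevice, blurWGSL: string, bindGroup: GPUBindGroup, width: number, height: number) {
  const module = device.createShaderModule({ code: blurWGSL });
  const pipeline = device.createComputePipeline({  // cached in practice, created here for brevity
    layout: 'auto',
    compute: { module, entryPoint: 'main' },
  });
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  // One 8x8 workgroup per image tile, matching @workgroup_size(8, 8) in the WGSL
  pass.dispatchWorkgroups(Math.ceil(width / 8), Math.ceil(height / 8));
  pass.end();
  device.queue.submit([encoder.finish()]);
}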

The impact is real: smoother performance, cooler devices, and a much higher ceiling for what’s possible in a browser. Effects that would have been prohibitive in WebGL now run continuously, every frame.

For us, WebGPU isn’t just a renderer; it’s the core architecture behind everything Forged will build next—the foundation for websites, installations, and interactive films that all share the same creative engine.

WebGPU finally gives web experiences the same expressive range as native engines. The difference is that we can now deploy them instantly, anywhere, with no installation and no compromise.

Conclusion

From the first baked test renders to the first live scroll pass, everything just… worked. The lighting, the layout, the transitions — they all clicked faster than expected.

That mattered, because this was the first Forged project. It set the tone for everything that came after. If the site had struggled or felt compromised, it would’ve been harder to believe in the six-week sprint that followed. But when the garage door opened for the first time and light spilled across the floor exactly as imagined, it was clear: we were building something real.

It’s not about showing off technology. It’s about reminding people that the web can still surprise you.