If you are planning an immersive experience, the first question is almost always about time. Schedules drive venue bookings, creative approvals, and launch dates. The short answer is that you can often ship something meaningful in a few weeks when the scope is focused and the team is set up for real-time production, while larger custom worlds can stretch into months.
This post outlines the levers that change timelines most, drawing only on the practices the Lumen and Forge team discussed during our recent planning session. You will see how real-time tools save days, how pre-rendered pipelines stay efficient, and how interactivity affects the calendar.
What Counts As An Immersive Experience
An immersive experience surrounds people with story and motion. It can be projection-mapped onto architecture or objects, it can live on LED walls and other displays, and it can extend into domes and full-room 360 spaces. It can also respond to people through kiosks, touch tables, or motion cameras.
Lumen and Forge build both passive and interactive formats. The team creates original 3D content, integrates client footage into wider canvases, and wires sensors to engines so scenes react in real time. The same content stack can drive projection mapping or screen-based installs.
The Biggest Timeline Levers
Three factors move the calendar more than anything else. First is the content pipeline you choose: real-time, pre-rendered, or a blend of the two. Second is the level of interactivity, since inputs and logic add design and testing time. Third is how much of the content is new versus adapted from assets you already have.
Real-time engines allow rapid iteration and immediate visual feedback. Pre-rendered sequences provide precise control of look and are the right choice when that finish is the goal. Both approaches sit side by side in our work.
For real-time, Unreal Engine is the core. TouchDesigner and Notch help with interactive logic and live visuals, and they can read sensor data and pass clean parameters into Unreal. These tools let a small team deliver polished results quickly without giving up creative control.
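To make that handoff concrete, here is a minimal sketch in Python of pushing one cleaned-up parameter to an engine over OSC, a protocol TouchDesigner speaks natively and Unreal supports through its OSC plugin. The port, address path, and parameter name are assumptions for illustration, not a fixed convention from any project.

```python
# Minimal sketch: publish one normalized parameter to an OSC server.
# Assumes an OSC listener (for example, Unreal's OSC plugin) on
# 127.0.0.1:8000; the /params/ address scheme is a placeholder.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)  # host and port of the listener

def send_parameter(name: str, value: float) -> None:
    """Clamp to 0..1 so downstream logic always sees a clean range."""
    clamped = max(0.0, min(1.0, value))
    client.send_message(f"/params/{name}", clamped)

# Example: forward a sensor reading as a scene brightness control.
send_parameter("brightness", 0.72)
```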

3D Animation Tools That Shape Speed Without Sacrificing Quality
Modeling and animation happen in Cinema 4D and Blender. When pre-rendering is the right choice, the team renders in Octane Render or Redshift, then composites and finishes in After Effects. This supports large canvases that include domes and 360 rooms, and keeps outputs consistent.
Real Projects That Inform Timelines
Lumen and Forge built the Ghost Ship project in Unreal Engine, rendered in real time. The CAT CBC Customer Visitor Center experience was also created in Unreal Engine. These projects show how a real-time engine can anchor immersive content for brand spaces and visitor experiences while keeping schedules practical.
The broader approach is consistent. The team uses a game engine when interactivity, rapid iteration, or real-time rendering will add value. They turn to pre-rendered workflows when the finished look benefits from offline rendering and when timelines or creative needs point that way.
A Modern Production Timeline
Every project follows a consistent overarching framework, even as the details evolve.
Discovery establishes clear goals, constraints, and creative intent, ensuring that every subsequent phase moves efficiently.
Content development then brings the vision to life—through pre-rendered, real-time, or hybrid assets and scenes.
Integration connects all systems, linking sensors, logic, and playback into a cohesive ecosystem.
Installation translates the digital design onto the physical canvas, aligning mapped outputs and configuring display systems.
Finally, testing and refinement ensure that every element performs seamlessly, delivering an experience that feels deliberate, immersive, and complete.
Discovery And Scoping
Discovery is where teams align on story, surfaces, and must-include content. This is where the choice between real-time, pre-rendered, or a blend is made. Clear scope avoids churn and prevents late pivots.
The team also confirms whether the space supports projection mapping, a screen-based plan, or another interactive or generative technology. Constraints such as lighting and rigging shape the choice early and help lock technical assumptions.
Content Development
For pre-rendered paths, artists model and animate in Cinema 4D or Blender, render in Octane or Redshift, and composite and finish in After Effects. This gives look control and predictable outputs and is ideal when the content is set and the event plays back in a sequence.
For real-time paths, content is assembled in Unreal Engine. This allows quick changes and interactive logic. It also makes live adjustments possible during reviews so our clients can react to what they see rather than to static boards.

Interactivity And Logic
Interactivity adds inputs and states that must be designed and tested. TouchDesigner and Notch read data from kiosks, touchscreens, and depth cameras such as Azure Kinect. Those values are simplified or dampened, then used to drive parameters in Unreal or within the same tool that reads them.
Simple interactions change color, brightness, or speed. Deeper interactions can move a camera, drive an avatar with bone tracking, or trigger scene states. The logic is modular, so it can grow as needs become clear.
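As a rough sketch of what modular interaction logic can look like, the snippet below tracks a guest's distance and switches between hypothetical scene states. The thresholds, state names, and hysteresis values are illustrative assumptions, not a pattern lifted from any specific install.

```python
# Illustrative state machine for a proximity-driven scene. Hysteresis
# (separate enter/exit distances) keeps the state from flickering when
# a guest hovers near a threshold. All values are placeholder tuning.
class InteractionStates:
    def __init__(self, enter_dist: float = 1.5, exit_dist: float = 2.5):
        self.state = "idle"
        self.enter_dist = enter_dist  # meters: close enough to engage
        self.exit_dist = exit_dist    # meters: far enough to disengage

    def update(self, distance_m: float) -> str:
        if self.state == "idle" and distance_m < self.exit_dist:
            self.state = "approach"
        elif self.state == "approach":
            if distance_m < self.enter_dist:
                self.state = "engaged"
            elif distance_m > self.exit_dist:
                self.state = "idle"
        elif self.state == "engaged" and distance_m > self.exit_dist:
            self.state = "approach"
        return self.state

states = InteractionStates()
for d in (3.0, 2.0, 1.2, 0.9):
    print(states.update(d))  # idle -> approach -> engaged -> engaged
```

Because each state is explicit, new states or stations can be added later without rewiring the ones that already work.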
Working With Existing Footage
Many clients bring videos that must be included. The team preserves that footage and fits it into larger canvases without distortion. Picture-in-picture placement, frames, and soft-edge blending help the original clips feel native to the scene.
In domes and 360 spaces, the same clip can be placed in multiple locations and surrounded with complementary elements. Tone, pacing, and mood stay consistent so the room reads as one world.
Installation And Alignment
Projection mapping requires alignment and blending on the physical surfaces. Content is mapped to the geometry, then refined until edges and seams disappear. For screen-based installs, the team configures playback and ensures content scales cleanly across the defined canvas.
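For the blending step specifically, one common building block is a smooth falloff ramp across the projector overlap zone. The sketch below shows one such ramp with an assumed overlap width and display gamma; it illustrates the general technique, not the exact method used on a given install.

```python
import math

def blend_weight(x: float, overlap_px: float, gamma: float = 2.2) -> float:
    """Brightness weight for one projector at pixel x inside the overlap."""
    t = max(0.0, min(1.0, x / overlap_px))    # 0 at full image, 1 at far edge
    fade = 0.5 - 0.5 * math.cos(math.pi * t)  # cosine ease: smooth at both ends
    return (1.0 - fade) ** (1.0 / gamma)      # gamma-correct the pixel value

# The opposing projector uses the mirrored ramp over the same zone,
# blend_weight(overlap_px - x, overlap_px), so after the display's gamma
# the two light contributions sum to one across the seam.
```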
Once outputs are stable, the show is run in conditions that match the event. This is where finishing details and lighting choices are locked and where all final adjustments are made.
How Interactivity Changes The Calendar
Interactivity adds design choices and data handling steps. The raw signal from a sensor must be interpreted and smoothed, then mapped to a visual parameter. TouchDesigner often handles that step, and Unreal Engine applies it in the scene so the response feels immediate and intentional.
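As a hedged illustration of the smoothing step, an exponential moving average is one simple way to tame a jittery sensor stream before it drives a visual. The alpha value below is a placeholder; in practice it is tuned by feel during reviews.

```python
# Illustrative exponential moving average for a raw sensor stream.
# Lower alpha smooths more heavily at the cost of response lag.
class SmoothedInput:
    def __init__(self, alpha: float = 0.15):
        self.alpha = alpha
        self.value = None  # uninitialized until the first sample arrives

    def update(self, raw: float) -> float:
        if self.value is None:
            self.value = raw  # first sample seeds the filter
        else:
            self.value += self.alpha * (raw - self.value)
        return self.value

hand_height = SmoothedInput(alpha=0.15)
smooth = hand_height.update(1.42)  # feed each new frame's raw reading
```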
More advanced interactions such as avatar control, multi-station journeys, or personalized experiences introduce additional paths, states, and data relationships within the system. These experiences often include a digital takeaway, which calls for a clear strategy for naming, asset management, and delivery; the continuity technologies that make it work, such as NFC, RFID, or QR codes, are covered in the sections that follow.
Sensor And Data Path
Motion cameras and depth cameras can provide position, velocity, silhouettes, and even bone tracking. Those values can change color, brightness, object motion, or camera movement. The same approach works whether the output is a mapped object, a wall-based display, or both at once.
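To show the mapping half of that path, here is a hypothetical remap: a guest's tracked distance from the wall drives scene brightness. The input and output ranges are assumptions for the example; the same pattern drives color, speed, or camera motion.

```python
# Illustrative remap from a tracked value to a visual parameter.
def remap(v: float, in_lo: float, in_hi: float,
          out_lo: float, out_hi: float) -> float:
    t = (v - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))          # clamp so outliers cannot overshoot
    return out_lo + t * (out_hi - out_lo)

# Closer guests brighten the scene: 3.0 m maps to 0.0, 0.5 m maps to 1.0.
brightness = remap(0.9, 3.0, 0.5, 0.0, 1.0)
```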
Multi-Station Journeys And Takeaways
Attendees can customize avatars or products, creating a personalized digital takeaway that captures their unique contribution. Technologies such as NFC, RFID, or QR codes enable journey persistence, allowing each guest’s profile to carry seamlessly across multiple stations. This continuity lets participants move freely through the experience without starting over, while also providing a shareable post-event record that extends engagement beyond the venue and deepens their sense of connection to the story.
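As a minimal, hypothetical sketch of that persistence, each station below records against a badge UID read from an NFC tag, RFID wristband, or QR code, so later stations can pick up the same profile. The file-based store, field names, and UID format are illustrative assumptions; a real deployment would use a shared database or service.

```python
# Illustrative journey-persistence store keyed by a badge UID.
import json
import time

PROFILES_PATH = "profiles.json"  # placeholder for a shared data store

def load_profiles() -> dict:
    try:
        with open(PROFILES_PATH) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

def record_visit(uid: str, station: str, payload: dict) -> None:
    """Append one station's result to the guest's running profile."""
    profiles = load_profiles()
    guest = profiles.setdefault(uid, {"visits": []})
    guest["visits"].append({"station": station, "at": time.time(), **payload})
    with open(PROFILES_PATH, "w") as f:
        json.dump(profiles, f, indent=2)

# Example: an avatar chosen at station one is readable at station two.
record_visit("04:A2:19:7F", "avatar-builder", {"avatar": "comet"})
```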

When A Small Team Is Enough And When To Scale
A focused team can deliver something compelling in a short window. With one or two developers, Lumen and Forge can often make an interactive experience in a few weeks. When a project needs specialists, the team brings in trusted contractors to scale without losing speed.
This mix keeps budgets focused on the parts that move the needle. It also keeps momentum high, since decision makers stay close to the work and can approve in context.
Where Real-Time Engines Save Days
Real-time engines reduce the gap between idea and review. Scenes render instantly, so creative decisions happen with the actual look and movement on screen or on the physical surface. This is valuable when clients need to see changes on a mapped object or in the actual space.
Real-time also makes interactive prototypes practical. You can connect a sensor, preview responses, and refine feel in the same session. That speed compounds throughout production and lowers risk.
Where Pre-Rendered Sequences Still Win
Pre-rendered sequences are the right choice when the look demands it. Offline renders in Octane or Redshift can deliver images that are difficult to match in real time. They also support complex composites and finishing in After Effects.
This path is ideal when the story is linear and frame-level control matters. It also pairs well with interactive moments by cutting between pre-rendered set pieces and real-time responses that share the same design language.
Building an immersive experience is both a technical craft and a creative collaboration. Every project balances story, technology, and time, shaped by the tools, people, and goals behind it. At Lumen and Forge, our approach is to streamline complexity, combining real-time innovation with cinematic quality to deliver experiences that are as efficient to produce as they are powerful to behold. Whether the timeline is measured in weeks or months, our process ensures that each activation feels intentional, responsive, and unforgettable.
