
The Challenge
For one of Edis’ special concerts at Harbiye Open-Air Theatre, we developed a real-time visual stage design that directly responded to the artist’s movement on stage. Rather than creating a standalone visual playback for the LED screens, the goal was to design a system where visuals evolved live, in sync with the performance itself.
For the selected song, a custom digital environment was modeled in Unreal Engine. A key visual element of this environment was the god rays passing through its windows, defining the atmosphere and depth of the scene. These light beams were not treated as static background visuals, but as dynamic elements whose direction and behavior were shaped by the live performance.
As Edis moved across the stage, the lighting within the digital environment reacted in real time, shifting its direction and intensity to follow his position. This created a direct connection between the physical stage and the virtual space, allowing the visuals to feel like a natural extension of the performance rather than a separate layer behind it.
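The mapping described above — a performer's stage position steering the direction and intensity of the virtual god rays — can be sketched as a simple function. This is an illustrative sketch only; the actual Unreal Engine lighting logic used at the concert is not documented here, and the parameter names (`x_norm`, `beam_x`, `falloff`) are assumptions:

```python
import math

def light_for_performer(x_norm: float, beam_x: float = 0.5, falloff: float = 2.0):
    """Map a normalized stage position (0 = stage left, 1 = stage right)
    to a hypothetical light yaw angle and intensity.

    beam_x is the beam's "home" position on the stage; falloff controls
    how quickly intensity drops as the performer moves away from it.
    """
    # Pan the beam from -45 to +45 degrees as the performer crosses the stage.
    yaw_deg = (x_norm - 0.5) * 90.0
    # Brighten the beam as the performer approaches its home position.
    intensity = math.exp(-falloff * abs(x_norm - beam_x))
    return yaw_deg, intensity
```

In a real-time setup, a function like this would be evaluated every frame from the latest tracking sample, so the light continuously follows the performer rather than jumping between keyframed states.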
Instead of relying on pre-rendered or time-based content, the system operated live throughout the song, generating constantly evolving compositions. Each moment became unique, shaped by the artist’s movement and the flow of the performance.
The result was a responsive and performance-driven visual experience where digital light, space, and motion were tightly integrated with the live concert, reinforcing the emotional impact of the moment and blurring the line between physical and virtual stage design.
Equipment
For the Harbiye concert, the Naostage K System was used to establish a real-time relationship between on-stage movement and the visual environment. The system detected performer positions using Kapta sensors based on LiDAR and thermal camera technology; the data was processed on the Kore workstation and managed through the Kratos software interface.
The resulting position data was transmitted as OSC messages over a 10G network infrastructure into Unreal Engine. This setup enabled on-stage movement to be mapped accurately and reliably onto the digital environment and its lighting logic, with minimal latency and high stability.
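To make the transport concrete: OSC messages are a small binary format (a null-padded address string, a type-tag string such as ",fff", then big-endian 32-bit floats), which is why they are cheap to send every frame over a network. Below is a minimal stdlib-only sketch of encoding and decoding a position message; the address "/naostage/pos" is a made-up example, not the actual address scheme used by the Naostage or Unreal Engine software:

```python
import struct

def pad4(b: bytes) -> bytes:
    """Null-pad bytes to a multiple of 4, with at least one null (OSC rule)."""
    return b + b"\x00" * (4 - len(b) % 4 if len(b) % 4 else 4)

def build_osc_message(address: str, floats) -> bytes:
    """Encode an OSC message carrying only float32 arguments."""
    msg = pad4(address.encode("ascii"))
    msg += pad4(("," + "f" * len(floats)).encode("ascii"))
    for v in floats:
        msg += struct.pack(">f", v)  # OSC floats are big-endian
    return msg

def parse_osc_message(data: bytes):
    """Decode an OSC message with float32 arguments only.

    Assumes well-formed input; a production parser would validate
    lengths and handle the full OSC type set.
    """
    end = data.index(b"\x00")
    address = data[:end].decode("ascii")
    offset = (end + 4) & ~3          # skip the address padding
    tend = data.index(b"\x00", offset)
    tags = data[offset:tend].decode("ascii")  # e.g. ",fff"
    offset = (tend + 4) & ~3         # skip the type-tag padding
    args = []
    for tag in tags[1:]:
        if tag == "f":
            (value,) = struct.unpack(">f", data[offset:offset + 4])
            args.append(value)
            offset += 4
    return address, args
```

In practice a tracking system would emit such a message per performer per frame over UDP, and the engine side would decode it and feed the coordinates into the lighting logic.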
All real-time visual content was created and rendered in Unreal Engine. The system ran on a high-performance setup powered by an AMD Threadripper processor and two NVIDIA RTX 4090 GPUs, ensuring consistent performance and stability under live show conditions.
Credits
Artist: Edis
Tracking System: Naostage – K System
Real-time setup & Unreal Engine operator: Sense Department
Visual Artist: Ecem Dilan Köse
Stage Design: Ibrahim Kandemir


