Arcane Arena
Client: Immersive Environments
Role: Lead Technician
Purpose: Immersive Environments, Motion Capture, Unreal Engine, Game Development
Year: 2026

Overview
Arcane Arena is a real-time, motion-capture-driven combat experience developed in Unreal Engine 5.3.2. The project’s primary objective was to synchronize high-fidelity physical performances with deterministic gameplay mechanics. This document focuses on the group’s technical contributions to the systems architecture, the digital content creation (DCC) pipeline, and the integration of virtual production hardware.
To meet the complex requirements of real-time motion capture and modular gameplay, my team and I relied on a suite of industry-standard tools:
Unreal Engine 5.3.2: The primary platform for systems integration, physics, and deployment.
Blender: Used as the primary DCC tool for skeletal mesh auditing and transform normalization.
Mixamo: Utilized for rapid animation prototyping.
Captury Live & QTM Live Link: Real-time data streaming solutions for motion capture.
MiVRy: Machine-learning-based gesture recognition; evaluated and subsequently discarded in favor of deterministic collision triggers.
Synopsis:
The game pits a Mage against a Mech in a magic-infused, real-time wand fight inside an arena. Both characters are motion-tracked from the actors/players in the LIM Lab, and each player carries a wand prop rigged with reflective tracking points. The game's functionality fluctuated throughout the semester as we searched for the best balance of fluidity and responsiveness while relieving load on the motion-tracking system. We settled on a bounding box surrounding each player character, with a projectile (fireball) system attached to the hand socket of the character model: "casting" the motion-tracked wand outside the bounding box triggers a projectile fired in the direction of the wand and hand socket.
Team members & Roles:
Terrence O’Leary - Lead Programmer
Alexander Le - Lead Technician
Anoushka Chigullapalli - Damage & Health Systems
Jing ‘Mark’ Lin - Rigging & Animations
Engine-Standardized Rigging Workflow
I worked on a specialized pipeline to bridge the gap between Blender’s Rigify system and the Unreal Engine 5 bone hierarchy. By utilizing the Expy Kit plugin, I translated custom control rigs into a format compatible with the engine's native animation library. This ensured that our character models, the Mage and the Mech, could utilize standardized UE5 animations without requiring manual re-weighting for every unique asset.
IK Rig Retargeting Configuration
To achieve high-fidelity motion, I utilized Unreal Engine’s native IK Rig Retargeting system to bridge incoming motion capture data with our custom skeletal meshes. I configured custom IK Rigs to map non-standard bone chains—specifically those utilizing "DEF-" prefixes from external rigging tools—to the standard engine bone chains. This configuration enabled the seamless retargeting of animations and motion-matching logic, ensuring that real-time performances were accurately reflected on our custom skeletal meshes.
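The core of this configuration is a name translation between the two skeleton conventions. A minimal sketch in plain C++ (the actual mapping lives inside the IK Rig asset, and the bone names below are illustrative examples, not the full chain list):

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical sketch of the bone-name mapping behind the IK Rig setup:
// Rigify exports deform bones with a "DEF-" prefix, which must be paired
// with the standard UE5 skeleton names before retargeting can resolve.
std::string ToEngineBoneName(const std::string& sourceBone)
{
    static const std::map<std::string, std::string> kBoneMap = {
        {"DEF-spine",       "spine_01"},
        {"DEF-upper_arm.L", "upperarm_l"},
        {"DEF-forearm.L",   "lowerarm_l"},
        {"DEF-hand.L",      "hand_l"},
    };
    auto it = kBoneMap.find(sourceBone);
    return it != kBoneMap.end() ? it->second : sourceBone; // pass unmapped bones through
}
```

Unmapped bones pass through unchanged, mirroring how the retargeter leaves chains it cannot pair untouched.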
Skeletal Mesh Technical Optimization
During the import phase, I diagnosed and resolved critical vertex deformation and "mesh explosion" issues, identified as artifacts of inconsistent coordinate systems and transform data. Returning to Blender, I normalized the root bone position to the world origin (0, 0, 0) and standardized transform scales across all armatures to ensure 1:1 parity on engine import.
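The normalization itself reduces to simple transform math. A minimal sketch, assuming a uniform scale factor (in the real pipeline this was done with Blender's apply-transforms tools rather than code):

```cpp
#include <cassert>
#include <vector>

struct Vec3 { double x, y, z; };

// Illustrative normalization: subtract the root bone's offset so it sits at
// the world origin, and bake a uniform scale factor into every vertex so the
// armature and mesh import into the engine at 1:1 parity.
void NormalizeToOrigin(std::vector<Vec3>& verts, Vec3 rootOffset, double scale)
{
    for (Vec3& v : verts)
    {
        v.x = (v.x - rootOffset.x) * scale;
        v.y = (v.y - rootOffset.y) * scale;
        v.z = (v.z - rootOffset.z) * scale;
    }
}
```

Applying the same offset and scale to both mesh and armature is what prevents the "mesh explosion" artifact, since the two no longer disagree about where the origin is.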
Animation-Driven Gameplay Logic
I integrated gameplay triggers directly into the animation pipeline using frame-accurate notifies (AnimNotifies). This allowed me to synchronize the physical spawning of projectiles and Niagara visual effects with specific hand gestures. By anchoring these events to the animation data, I ensured that the mechanical execution of a "spell" always matched the visual action of the character.
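The firing model behind AnimNotifies can be sketched in plain C++ as a time-window query: each notify stores a trigger time, and as playback advances, every notify crossed in that frame fires exactly once. This is an illustrative stand-in for the engine's notify dispatch, not its actual implementation:

```cpp
#include <string>
#include <vector>

// Each notify is anchored to a playback time within the animation.
struct AnimNotifyEvent { double triggerTime; std::string eventName; };

// Returns the notifies whose trigger time was crossed between the previous
// and current playback positions, so spawns stay frame-accurate.
std::vector<std::string> CollectFiredNotifies(
    const std::vector<AnimNotifyEvent>& notifies, double prevTime, double currTime)
{
    std::vector<std::string> fired;
    for (const AnimNotifyEvent& n : notifies)
        if (n.triggerTime > prevTime && n.triggerTime <= currTime)
            fired.push_back(n.eventName);
    return fired;
}
```

Anchoring the projectile spawn and Niagara burst to these events is what keeps the "spell" mechanics locked to the character's visible gesture.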
System Migration & Logical Refactoring
I migrated the damage and health system prototype initially created by Anoushka. The migration required restructuring the health system's variables and components and rewiring them to the damage components (e.g., a collision hit deducts a set amount of health points).
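The core wiring reduces to a small contract between the two components. A minimal sketch (the project implemented this in Blueprints; the names here are hypothetical):

```cpp
#include <algorithm>

// Illustrative health component: a collision hit routes a damage amount in,
// health clamps at zero, and death is derived rather than stored separately.
struct HealthComponent
{
    float Health = 100.f;

    void ApplyDamage(float amount)
    {
        Health = std::max(0.f, Health - amount);
    }

    bool IsDead() const { return Health <= 0.f; }
};
```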
Decoupled Communication via Interfaces
To prevent rigid class dependencies and minimize technical debt, I expanded on a blueprint interface from Anoushka that served as a communication bridge between gameplay actors (the characters) and the User Interface (the HUD). By using interfaces, the HUD remained actor-agnostic, allowing it to poll data from any actor marked as "damageable" without the need for expensive or brittle casting operations.
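The same pattern reads naturally in C++ terms. A hedged sketch, assuming hypothetical class names (the project used a Blueprint interface, not C++ classes):

```cpp
// The HUD only knows the IDamageable contract, so any actor implementing it
// can drive the health bar without the HUD casting to a concrete class.
class IDamageable
{
public:
    virtual ~IDamageable() = default;
    virtual float GetHealthPercent() const = 0;
};

// One of many possible implementers; the HUD never sees this type directly.
class MageCharacter : public IDamageable
{
public:
    float Health = 75.f;
    float MaxHealth = 100.f;
    float GetHealthPercent() const override { return Health / MaxHealth; }
};

// HUD-side polling: actor-agnostic, no brittle casting.
float PollHealthBar(const IDamageable& target)
{
    return target.GetHealthPercent();
}
```

Swapping the Mage for the Mech (or any future "damageable" actor) requires no HUD changes, which is the technical-debt payoff of the interface.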
Precision Projectile Firing Pipeline
I developed a projectile casting system designed to resolve the discrepancy between a character’s physical hand position and the player’s intended target. The system utilizes skeletal mesh sockets as physical muzzles while calculating trajectories via camera-space vectors. This ensures that projectiles travel accurately toward the player's world-space reticle, regardless of the character’s current animation pose.
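The trajectory math can be sketched as follows, assuming a simple far-point projection in place of the engine's actual camera trace (vector type and trace range are illustrative):

```cpp
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};

Vec3 Normalize(Vec3 v)
{
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// The camera ray defines the intended world-space target; the projectile
// spawns at the hand-socket "muzzle" but is aimed at that target, so the
// shot lands on the reticle regardless of the character's current pose.
Vec3 ComputeFireDirection(Vec3 cameraPos, Vec3 cameraForward, Vec3 muzzlePos,
                          double traceRange = 10000.0)
{
    Vec3 target = cameraPos + cameraForward * traceRange;
    return Normalize(target - muzzlePos);
}
```

Because the direction is recomputed from the target point rather than taken from the socket's facing, an animated hand that lags behind the aim never skews the shot.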
Advanced Physics & Collision Filtering
I conducted a project-wide collision audit to resolve issues where projectiles would immediately collide with the player who fired them. By reconfiguring collision channels and instigator-ignore logic, I enabled projectiles to spawn within an actor's bounds without triggering false impacts, while still maintaining high-fidelity physical responses when hitting environmental obstacles or opponents.
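The instigator-ignore rule itself is a small predicate. A plain C++ stand-in for the engine's collision-channel and ignore-actor configuration (types are illustrative):

```cpp
struct Actor { int id; };

// A hit only registers when the other actor is neither the projectile's
// instigator (the character who fired it) nor the projectile itself, so the
// projectile can spawn inside the firer's bounds without a false impact.
struct Projectile
{
    int selfId;
    int instigatorId;

    bool ShouldRegisterHit(const Actor& other) const
    {
        return other.id != instigatorId && other.id != selfId;
    }
};
```

Walls and opponents still pass the predicate, so environmental and combat impacts keep their full physical response.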
Real-Time Data Stream Calibration
Working alongside Terrence, I assisted with the technical troubleshooting of the Captury and QTM Live Link tracking systems. We encountered significant spatial drift where the digital characters would desynchronize from the physical actor's position. I resolved this by implementing dynamic transform offsets within the character logic, effectively "re-zeroing" the actors within the digital environment to match the physical capture volume.
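The re-zeroing reduces to capturing a calibration offset and applying it to every subsequent sample. An illustrative sketch (the actual fix lived in the character Blueprint's transform logic):

```cpp
struct Vec3 { double x, y, z; };

// At calibration time: measure the drift between the tracked position and
// where the character should stand in the digital environment.
Vec3 ComputeOffset(Vec3 desiredOrigin, Vec3 trackedAtCalibration)
{
    return {desiredOrigin.x - trackedAtCalibration.x,
            desiredOrigin.y - trackedAtCalibration.y,
            desiredOrigin.z - trackedAtCalibration.z};
}

// Every frame thereafter: shift incoming Live Link samples by that offset so
// the digital actor stays aligned with the physical capture volume.
Vec3 ApplyOffset(Vec3 trackedSample, Vec3 offset)
{
    return {trackedSample.x + offset.x,
            trackedSample.y + offset.y,
            trackedSample.z + offset.z};
}
```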
Dynamic HUD Data Binding & Technical Resilience
I developed the "Versus" HUD system, expanding on Anoushka’s HUD, which utilizes dynamic variable references to monitor the health status of the Mage and Mech simultaneously. To ensure technical resilience, I engineered robust error-handling protocols within the UI bindings. This included "Is Valid" checks and safe-math logic to prevent engine crashes in the event of rapid actor initialization or destruction.
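The safe-math guard in the binding can be sketched as a single function: a null target or a zero max-health never reaches the divide, so the bar degrades to empty instead of crashing. Types and names here are illustrative stand-ins for the Blueprint binding:

```cpp
struct Character { float Health; float MaxHealth; };

// HUD binding: returns the health-bar fill percent, or 0 if the target is
// mid-spawn/mid-destroy ("Is Valid" check) or its max health is degenerate.
float SafeHealthPercent(const Character* target)
{
    if (target == nullptr) return 0.f;        // "Is Valid" check
    if (target->MaxHealth <= 0.f) return 0.f; // divide-by-zero guard
    return target->Health / target->MaxHealth;
}
```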
Session Management & Race Condition Resolution
I leveraged the Level Blueprint to serve as a centralized session manager. This system identifies specific character instances on level start and "hands off" those references to the HUD. I diagnosed and fixed critical race conditions where UI elements would attempt to access character data before it had been initialized. By structuring a sequenced initialization flow with managed delays, I ensured data integrity across the session.
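The hand-off sequencing can be sketched as a guarded bind: the session manager refuses to wire the HUD until both character references exist, which is what eliminates the race. Names are hypothetical stand-ins for the Level Blueprint flow:

```cpp
struct Character { float Health = 100.f; };

struct Hud
{
    const Character* mage = nullptr;
    const Character* mech = nullptr;
    bool IsBound() const { return mage != nullptr && mech != nullptr; }
};

// Centralized session manager: gathers character references at level start
// and only hands them to the HUD once both have been initialized.
struct SessionManager
{
    const Character* mage = nullptr;
    const Character* mech = nullptr;

    // Returns false (and leaves the HUD untouched) until both exist;
    // in-engine this gate was a sequenced flow with managed delays.
    bool TryBindHud(Hud& hud) const
    {
        if (mage == nullptr || mech == nullptr) return false;
        hud.mage = mage;
        hud.mech = mech;
        return true;
    }
};
```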