My Role
Lead UX & Immersive Systems Designer
Core Contributions
- Designed a dual-input interaction model that lets learners engage via voice or silent preset controls, supporting accessibility, privacy, and varied learning contexts.
- Defined and authored the core multimodal interaction language (voice, gesture, spatial UI), establishing consistent behavioral patterns across AI-driven immersive learning modules under strict usability, accessibility, and latency constraints.
- Led end-to-end UX design for an AI-powered VR learning system in which a conversational AI interface, embedded within immersive scenarios, responds to learner input. The platform reconciles conversational AI with real-time game engine constraints to deliver guided, personalized STEM instruction in 3D environments.
- Prototyped and tested immersive scenarios combining voice interaction, natural movement, and domain reasoning to support transfer from simulated environments to real-world problem solving.
- Aligned AI, ethics, and product teams around shared design constraints, including explainability, learner trust, and pedagogical integrity.
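The dual-input model above can be sketched as a small intent router: voice and silent preset controls both normalize to the same intent type, so downstream scenario logic never depends on which modality the learner used. This is a minimal illustrative sketch, not the production design; the intent names, preset IDs, and keyword matching are all hypothetical (a real system would route voice through a speech/NLU service).

```typescript
// Hypothetical sketch: voice and silent preset controls converge on one
// Intent type, so the scenario logic is modality-agnostic.

type Intent =
  | { kind: "answer"; value: string }
  | { kind: "repeat" }
  | { kind: "hint" };

// Silent preset controls map directly to intents (IDs are hypothetical).
const PRESETS: Record<string, Intent> = {
  repeat_button: { kind: "repeat" },
  hint_button: { kind: "hint" },
};

// Voice input goes through a simple keyword pass here; a real system
// would call a speech recognition / NLU service instead.
function fromVoice(transcript: string): Intent {
  const t = transcript.toLowerCase();
  if (t.includes("repeat")) return { kind: "repeat" };
  if (t.includes("hint")) return { kind: "hint" };
  return { kind: "answer", value: transcript };
}

function fromPreset(controlId: string): Intent {
  return PRESETS[controlId] ?? { kind: "hint" };
}

// Both modalities reach the same handler, keeping behavior consistent.
function handle(intent: Intent): string {
  switch (intent.kind) {
    case "repeat": return "replaying last prompt";
    case "hint": return "showing a hint";
    case "answer": return `evaluating answer: ${intent.value}`;
  }
}
```

Collapsing both modalities into one intent type is what keeps the system's behavioral patterns consistent: a learner who taps the silent "repeat" control and one who says "repeat that" get the same response.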
Constraints & Tradeoffs
- Designed for variable hardware performance and inconsistent network conditions
- Balanced AI-driven adaptivity with predictable learner mental models
- Avoided opaque personalization to preserve trust and instructional clarity
- Limited interaction complexity to reduce cognitive overload in immersive 3D space
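One way the first constraint can be handled is by selecting a quality tier at runtime from measured frame time and network latency rather than assuming a fixed device profile. The sketch below is a hypothetical illustration under assumed thresholds; the tier names and cutoff values are not from the actual system.

```typescript
// Hypothetical sketch: degrade gracefully under variable hardware and
// network conditions by picking a quality tier from live measurements.

type Tier = "full" | "reduced" | "minimal";

function pickTier(frameMs: number, netLatencyMs: number): Tier {
  // Comfortable frame budget and a responsive link: full experience.
  if (frameMs <= 14 && netLatencyMs <= 150) return "full";
  // Struggling on either axis: drop effects, keep interaction intact.
  if (frameMs <= 25 && netLatencyMs <= 400) return "reduced";
  // Severely constrained: silent preset controls and cached content only.
  return "minimal";
}
```

The key design choice is that degradation removes visual richness before it removes interaction affordances, so the learner's mental model of the controls stays stable across conditions.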
Special thanks to SciGence CEO Vicente Navarro and advisor Patrick Cross.
Outcome
A core challenge was designing a seamless user flow while preserving AI-driven adaptivity. I collaborated closely with research and development teams to streamline interaction points and keep the experience legible in immersive 3D space.
The platform delivered tailored learning modules across STEM topics, with early pilots indicating increased learner engagement and retention relative to baseline remote instruction.