Jing Chung

XReveal

Real-time Collaborative XR Storyboard

About the Project

I was the lead programmer on XReveal, a project built by CMU ETC students during winter break for a hackathon organized by Immersive Insiders.


XReveal brings ideas to life by generating 2D and 3D assets from voice commands. Powered by AI, it enables anyone, regardless of design or artistic skill, to share their thoughts visually. This XR tool addresses common collaboration challenges by aligning ideas in real time, streamlining communication, reducing meeting time, and making teamwork more efficient and seamless.

Speak to Create

By integrating OpenAI's audio transcription API, XReveal empowers users to create 2D and 3D assets simply by speaking, letting them visually convey complex concepts for easier understanding.
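As a rough sketch of that flow, the snippet below shows how a Unity client could upload recorded microphone audio to OpenAI's /v1/audio/transcriptions endpoint. The VoiceTranscriber class, the serialized apiKey field, and the WAV-encoding step are illustrative assumptions, not the project's actual code.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical helper: posts recorded audio to OpenAI's transcription
// endpoint and hands the response to a callback.
public class VoiceTranscriber : MonoBehaviour
{
    const string Endpoint = "https://api.openai.com/v1/audio/transcriptions";
    [SerializeField] string apiKey; // loaded from config in practice, never hard-coded

    public IEnumerator Transcribe(byte[] wavBytes, System.Action<string> onText)
    {
        // Multipart form: the WAV file plus the model name.
        var form = new List<IMultipartFormSection>
        {
            new MultipartFormFileSection("file", wavBytes, "speech.wav", "audio/wav"),
            new MultipartFormDataSection("model", "whisper-1")
        };

        using (UnityWebRequest req = UnityWebRequest.Post(Endpoint, form))
        {
            req.SetRequestHeader("Authorization", "Bearer " + apiKey);
            yield return req.SendWebRequest();

            if (req.result == UnityWebRequest.Result.Success)
                onText(req.downloadHandler.text); // JSON payload: {"text": "..."}
            else
                Debug.LogError("Transcription failed: " + req.error);
        }
    }
}
```

The `text` field of the returned JSON then serves as the prompt for the asset-generation step.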

Building a Request-Based System with Meshy's APIs

Drawing on the structure of OpenAI's REST API, I wrapped Meshy's API in a request-based class and built a flow that sends generation requests and polls their status until the results are ready.
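The sketch below illustrates that request-and-poll pattern. The Meshy endpoint path, request body, and status strings are assumptions based on its public text-to-3D API and may not match the project's wrapper exactly.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical request-based wrapper: submit a task, then poll it.
public class MeshyRequest
{
    const string BaseUrl = "https://api.meshy.ai/openapi/v2/text-to-3d";
    readonly string apiKey;
    public MeshyRequest(string apiKey) { this.apiKey = apiKey; }

    // Step 1: submit a generation task; the response JSON carries the task id.
    public IEnumerator Submit(string prompt, System.Action<string> onResponse)
    {
        string body = "{\"mode\":\"preview\",\"prompt\":\"" + prompt + "\"}";
        using (var req = new UnityWebRequest(BaseUrl, "POST"))
        {
            req.uploadHandler = new UploadHandlerRaw(System.Text.Encoding.UTF8.GetBytes(body));
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/json");
            req.SetRequestHeader("Authorization", "Bearer " + apiKey);
            yield return req.SendWebRequest();

            if (req.result == UnityWebRequest.Result.Success)
                onResponse(req.downloadHandler.text); // parse the task id from this JSON
            else
                Debug.LogError("Meshy request failed: " + req.error);
        }
    }

    // Step 2: poll the task until it finishes, then report the raw JSON,
    // which holds the model download URLs on success.
    public IEnumerator Poll(string taskId, System.Action<string> onDone, float interval = 2f)
    {
        while (true)
        {
            using (var req = UnityWebRequest.Get(BaseUrl + "/" + taskId))
            {
                req.SetRequestHeader("Authorization", "Bearer " + apiKey);
                yield return req.SendWebRequest();
                string json = req.downloadHandler.text;
                // Crude terminal-state check; a real wrapper would parse the JSON.
                if (json.Contains("\"SUCCEEDED\"") || json.Contains("\"FAILED\""))
                {
                    onDone(json);
                    yield break;
                }
            }
            yield return new WaitForSeconds(interval);
        }
    }
}
```

In practice a MonoBehaviour would drive both coroutines: start Submit, parse the task id from its response, then run Poll until the status turns terminal and the generated model can be downloaded.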

Custom Meta Quest Control System

Leveraging the Meta SDK's hand interaction capabilities, I designed and implemented a multi-hand gesture control system. It lets users manipulate models with two-hand gestures, supporting scaling, moving, and multi-object selection with seamless, synchronized adjustments.
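To make the two-hand manipulation concrete, here is an illustrative version of the core math: scale tracks the ratio of the current to the initial distance between the hands, and position tracks the midpoint between them. The TwoHandTransformer class and the pinch-state hookup to the Meta SDK are assumptions for the sketch.

```csharp
using UnityEngine;

// Illustrative two-hand grab: scale by hand-distance ratio, move by midpoint delta.
public class TwoHandTransformer : MonoBehaviour
{
    [SerializeField] Transform leftHand;   // hand anchor transforms from the SDK
    [SerializeField] Transform rightHand;
    [SerializeField] Transform target;     // currently selected model

    float startDistance;
    Vector3 startScale, startMidpoint, startPosition;
    bool grabbing;

    // Called each frame with the SDK's pinch query results for each hand.
    public void UpdateGesture(bool leftPinch, bool rightPinch)
    {
        bool bothPinching = leftPinch && rightPinch;

        if (bothPinching && !grabbing)
        {
            // Gesture begins: cache the reference frame.
            startDistance = Vector3.Distance(leftHand.position, rightHand.position);
            startMidpoint = (leftHand.position + rightHand.position) * 0.5f;
            startScale = target.localScale;
            startPosition = target.position;
            grabbing = true;
        }
        else if (bothPinching && grabbing)
        {
            // Scale proportionally to how far the hands have spread apart...
            float ratio = Vector3.Distance(leftHand.position, rightHand.position) / startDistance;
            target.localScale = startScale * ratio;

            // ...and translate with the midpoint between the hands.
            Vector3 midpoint = (leftHand.position + rightHand.position) * 0.5f;
            target.position = startPosition + (midpoint - startMidpoint);
        }
        else
        {
            grabbing = false;
        }
    }
}
```

Caching the reference frame at the moment both pinches engage keeps the object from jumping when the gesture starts.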

— PROJECT NAME

XReveal


— ROLE

Programmer


— DATE

January 2025

My Contribution

  1. Integrated the Meshy API with Unity using REST requests in C#.
  2. Developed a custom multi-hand gesture control system for Meta Quest.
  3. Built the UI system for an intuitive user experience.
  4. Designed and developed the system architecture that seamlessly combines all features.