RealityLib – Open-Source VR Framework for Meta Quest
A lightweight, native-C VR framework that extends RayLib with OpenXR support, letting developers build VR games for Meta Quest devices by modifying only a single file. Built as a Rose-Hulman CSSE Capstone project, RealityLib lowers the barrier to VR development without sacrificing power or flexibility.

Project Overview
Related Fields
VR/AR Development, Game Frameworks, Systems Programming, OpenXR
Tech Stack
Native C, OpenXR, OpenGL ES 3.0, Android NDK, Gradle, CMake
Project Status
CSSE Capstone (2025–2026), active development — Vulkan integration planned
Team & Role
Team Members
- Harrison Carpenter
- Irvan Wang
- Wenbo Yang
- Yiheng (Ian) Zhu
Client
Steve Hoelle
Capstone Advisor
Dr. Kim Tracy
Motivation & Problem Statement
VR development in native C is largely underserved. Most pathways force developers to choose between steep game-engine learning curves (Unity, Unreal) or diving directly into the raw OpenXR spec — which is primarily documented in C++.
RealityLib asks a focused question: Can we bring RayLib's philosophy — simple, readable, beginner-friendly — into the world of VR? The framework targets two groups: mobile game developers already familiar with RayLib who want to expand into VR, and educators and students looking for a low-friction entry point into spatial computing.
- Open-source and educational by design
- Developers only need to modify `main.c` to create a full VR experience
- Framework (not engine) — core VR primitives, no bloat
- Optimized specifically for Meta Quest 3/3S hardware
Architecture & Design
RealityLib is layered cleanly between the user application and the underlying OpenXR runtime:
- User Application (`main.c`): Developers call RayLib-style functions like `DrawVRCube()`, `DrawVRLine3D()`, and `DrawVRGrid()` — no OpenXR knowledge required.
- RealityLib VR Layer (`realitylib_vr.c/h`): Abstracts OpenXR complexity, manages the session lifecycle, stereo swapchains, controller input, haptics, and head tracking.
- Command Buffer System: Draw calls issued between `BeginVRMode()` and `EndVRMode()` are stored in a buffer and replayed for each eye — solving stereo rendering without any extra work from the developer.
- Hand Tracking Module (`realitylib_hands.c/h`): Optional extension providing full skeletal hand tracking with 26 joints per hand and built-in gesture detection (pinch, fist, point).
The build system uses a hybrid Gradle + CMake pipeline targeting Android NDK r29 with the Meta OpenXR AAR for loader distribution — keeping setup reproducible across machines.
Core Features
VR Rendering
- Stereo rendering for both eyes via OpenXR swapchain management
- Frame synchronization at 72 FPS on Meta Quest 3/3S
- Drawing primitives: cubes, cuboids, lines, grids, planes, and coordinate axes
- OpenGL ES 3.0 shaders with custom geometry management
Controller Input
- Full 6DoF controller tracking (position and orientation)
- Analog inputs: triggers (0–1), grip (0–1), thumbstick axes
- Digital buttons: A, B, X, Y, thumbstick click
- Haptic feedback with configurable amplitude and duration
- Locomotion system: smooth movement, snap turning, sprint, jump, and fly mode
Hand Tracking
- Full skeletal hand tracking — 26 joints per hand via OpenXR extension
- Gesture detection: pinch (with strength 0–1), fist, pointing, open hand
- Visualization helpers: `DrawHandSkeleton()`, `DrawHandJoints()`
- Graceful fallback — initializes only when hardware supports it
Build & Deployment
- Pure native C — `android:hasCode="false"`, no Java required
- Single Gradle task builds and signs a deployable APK
- ADB-based deployment pipeline with log filtering helpers
- VSCode / Cursor integration via `c_cpp_properties.json` for IntelliSense
Demo Game: Cube Slice VR
To validate the framework, the team designed and built Cube Slice VR — a fast-paced VR arcade game inspired by Fruit Ninja. Players use a virtual blade to slice floating Rubik's-Cube-style objects mid-air, with a combo mechanic that rewards juggling cubes before slicing them.
- Slice: Use the virtual blade to cut cubes in mid-air with satisfying physical mesh separation
- Flip / Juggle: Gently strike a cube without slicing to flip it upward, building a score multiplier
- Combo Scoring: Each flip before a slice increases the bonus multiplier — rewarding skill and timing
- Clean geometric visual style with bright audio and visual feedback for high-combo moments
The game served as the primary stress test for RealityLib's rendering, collision, and controller systems, driving several architecture refinements through a series of VR-specific game jams.
Development Journey
The project followed an iterative three-phase approach over the 2025–2026 academic year:
Fall Quarter — Learning & Exploration
- Individual game jams to master RayLib fundamentals and explore OpenXR specs
- Early RayLib demos running on Android, then on the Quest headset
- Discovered that the previous team's codebase had compatibility issues with updated dependencies — decision made to rebuild from scratch
Winter Quarter — Core Development
- Rebuilt RealityLib with redesigned architecture and updated OpenXR integration
- Resolved critical build issues: linker errors, ClassNotFoundException from AndroidX conflicts, swapchain acquire/release ordering bugs, and the ALooper_pollAll deprecation in NDK 35
- Successfully demoed a running VR scene on Meta Quest 3S at 72 FPS
- Began designing Cube Slice VR for the CS Expo
Spring Quarter — Polish & Research
- Completed demo game with controller and hand tracking support
- Usability testing and feedback collection
- Poster presentation at Rose Show, final report, and public release preparation
Lessons Learned
- API simplicity is a feature, not a convenience. Keeping draw calls RayLib-idiomatic dramatically reduced the onboarding friction for new contributors and testers.
- Scope discipline matters more in frameworks than in apps. Every feature pulled toward game-engine territory had to be deliberately declined to preserve the framework's identity.
- Iterative game jams are better than upfront design. The most impactful architecture decisions — the command buffer system, the hand tracking module split — emerged from building real things, not planning sessions.
- Prior team codebases can be a liability. Spending weeks debugging incompatible dependencies cost more time than rebuilding from scratch would have. Recognizing that inflection point earlier is a key takeaway.
- Native C is viable for modern VR — but build system complexity (NDK versions, AAR extraction, CMake flags) is where most of the friction lives, not the VR logic itself.