what we’re building (reality check)
A real holosuite is a room-scale, multi-user, multi-sensory environment that blends:
- visuals (near-eye now; light-field walls later),
- natural locomotion (omni-floor/treadmill),
- mid-air haptics + environmental cues (wind, heat, scent),
- spatial audio,
- ultra-low latency tracking & rendering,
- a content pipeline (fast world capture + simulation).

Success hinges on latency (≲20 ms motion-to-photon), correct depth cues, and convincing multisensory alignment.
bill of materials (bom)
A) visuals
Path A (now): 2–4 high-end MR/VR headsets (eye-tracked, 90–120 Hz).
Path B (upgrade): tileable light-field displays (group-view 3D; up to ~100 views) for “helmet-off” shared scenes; keep headsets for close work.
B) locomotion
- Omnidirectional floor/treadmill. Disney’s HoloTile proves multi-user omni-floor feasibility (research/demonstrator). Commercial alternatives: curved-shoe or ring-rail omni treadmills.
C) haptics & atmosphere
- Mid-air ultrasound haptics module(s) for touchable buttons/surfaces in free air (Ultraleap dev units).
- Optional acoustic “hologram” array for levitation/advanced effects (lab-grade research kit only).
- Wearable vibro/force bands; fans, heaters, cool mist; scent micro-emitters.
D) spatial audio
- Beamformed speaker array or high-order ambisonics with individualized HRTFs; sync to head/eye pose. (Vision-class spatial audio is a good reference target.)
E) tracking & compute
- Ceiling/floor depth cams + IMUs (OpenXR/SteamVR tracking), fast steering if using projection.
- Compute: 1–2 multi-GPU boxes (per 2 users) to keep motion-to-photon under ~20 ms.
F) content pipeline
- 3D Gaussian Splatting toolchain (phone/drone capture → splat training → real-time render). Use the paper + official repo.
- Game engine: Unity/Unreal + OpenXR.
- AI agents (NPCs), physics, interaction system.
G) room shell & safety
- Acoustic treatment (RT60 ≲ 300 ms), blackout, HVAC, UPS, E-stop buttons, soft walls/rails near locomotion.
- Clear exposure limits for ultrasound, safe projector/laser classes, and a scent allergen policy. (Follow device safety docs.)
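To size the acoustic treatment against the RT60 ≲ 300 ms target, Sabine’s formula gives a first estimate. A minimal sketch in Python; the room dimensions and absorption coefficients below are purely illustrative:

```python
def rt60_sabine(volume_m3: float, surfaces: list) -> float:
    """Sabine reverberation time: RT60 = 0.161 * V / A,
    where A = sum(area_i * absorption_coeff_i) in metric sabins (m^2)."""
    absorption = sum(area * coeff for area, coeff in surfaces)
    return 0.161 * volume_m3 / absorption

# Illustrative 4 m x 4 m x 2.7 m room: treated walls/ceiling, bare floor.
room = [
    (4 * 2.7 * 4, 0.70),  # four walls with broadband absorption panels
    (4 * 4, 0.60),        # ceiling clouds
    (4 * 4, 0.05),        # hard floor
]
rt60 = rt60_sabine(4 * 4 * 2.7, room)
print(f"estimated RT60 = {rt60 * 1000:.0f} ms")
```

If the estimate lands over 300 ms, add panel area (or thicker panels) until it doesn’t; real rooms also need a measurement sweep to confirm.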
floor plan (8–16 m²)
- Front wall: (future) light-field tiles; behind-wall cable trough.
- Ceiling: depth cams, audio array, light bar; vented plenum.
- Center: omni-floor/treadmill with safety rail & E-stop.
- Perimeter: ultrasound haptics towers at ~waist/shoulder height.
- Rack: compute/UPS/network + scent/wind modules; separate “quiet corner” for compressors.
core equations (targets & tuning)
- Motion-to-photon latency: keep total MTP ≤ ~20 ms for comfort.
- Light-field sampling (group 3D walls): aim for ≥60 pixels/degree and dozens of views to reduce aliasing and vergence-accommodation conflict (VAC); commercial panels offer up to ~100 views.
- Mid-air ultrasound pressure (qualitative): acoustic radiation pressure on a reflecting surface scales as $P \approx 2I/c$, where $I$ is sound intensity and $c$ is the speed of sound in air. Use vendor tools to keep within skin-safe limits.
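Written out, the latency target is a budget across pipeline stages (the stage split below is illustrative):

```latex
t_{\mathrm{mtp}} \;=\; t_{\mathrm{track}} + t_{\mathrm{sim}} + t_{\mathrm{render}} + t_{\mathrm{scanout}} \;\le\; 20\ \mathrm{ms}
```

At 120 Hz a single frame already takes 1000/120 ≈ 8.3 ms, so tracking, simulation, and scanout must fit in the remaining ~12 ms — which is why eye-tracked foveation and fast scanout modes matter.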
step-by-step build (phased)
phase 0 — room prep (1–2 weeks)
- Acoustic panels (walls/ceiling), blackout curtains, dedicated circuits, 20 A outlets.
- Network (Cat6/6a), ceiling mounts for sensors, cable trays.
- Install UPS + surge protection; set E-stop box reachable from treadmill center.
phase 1 — core immersion (2–4 weeks)
- Headsets + base stations/inside-out tracking; calibrate guardian/chaperone.
- Compute: set up OpenXR; measure MTP latency with test tools; tune to ≤20 ms.
- Spatial audio: set up speakers/ambisonics; load personalized HRTFs where possible.
- Environmental cues: add fans/heat; tie to engine events (e.g., wind zones).
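The wind-zone tie-in can be as simple as mapping distance-to-zone to a fan duty cycle. A sketch — the function name, the linear falloff, and the 0.8 duty cap are all hypothetical choices, not any engine’s API:

```python
def wind_duty(player_pos, zone_center, zone_radius, max_duty=0.8):
    """Map distance from a wind-zone center to a fan PWM duty cycle.
    Linear falloff with distance; capped so fans never run flat out indoors."""
    dx = player_pos[0] - zone_center[0]
    dz = player_pos[1] - zone_center[1]
    dist = (dx * dx + dz * dz) ** 0.5
    if dist >= zone_radius:
        return 0.0
    return max_duty * (1.0 - dist / zone_radius)

# full strength at the zone center, fading to zero at the edge
print(wind_duty((0.0, 0.0), (0.0, 0.0), 5.0))  # 0.8
print(wind_duty((3.0, 4.0), (0.0, 0.0), 10.0))
```

In practice you would also low-pass the duty value so fan speed changes lag a little behind the scene, which reads as more natural gusting.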
phase 2 — mid-air haptics (2–3 weeks)
- Install Ultraleap haptics unit(s) at waist height; USB/serial to host.
- Calibrate focal points to tracked hands; render “virtual buttons,” surfaces, and textures; gate intensity with the safety API.
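Intensity gating might look like the following wrapper. The vendor API call itself is omitted and the limits here are placeholders — take real values from the device’s safety documentation:

```python
MAX_INTENSITY = 0.7      # placeholder fraction of device maximum
MAX_SESSION_S = 1200     # placeholder continuous-exposure window (20 min)

def gated_focal_point(x, y, z, intensity, session_s):
    """Clamp a requested ultrasound focal point to safe limits before
    handing it to the (hypothetical) device API. Returns None to force
    a rest break after prolonged exposure."""
    if session_s > MAX_SESSION_S:
        return None
    safe = min(max(intensity, 0.0), MAX_INTENSITY)
    return {"pos": (x, y, z), "intensity": safe}

print(gated_focal_point(0.0, 0.0, 0.2, 1.5, 60))   # over-request clamped to 0.7
print(gated_focal_point(0.0, 0.0, 0.2, 0.5, 2000)) # None: rest break
```

Keeping the clamp in one wrapper means experience scripts can never bypass it, regardless of what intensity a scene asks for.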
phase 3 — locomotion (4–8 weeks)
- Install an omni-treadmill (or preorder an omni-floor when available).
- Integrate control: keep the user near room center while world movement matches gait; verify stopping within <300 ms and add a fall-prevention rail. (HoloTile demos show multi-user feasibility.)
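The centering logic above can be sketched as a slew-limited PD loop: belt velocity counters both the user’s displacement from center and their gait velocity. A minimal 1-D simulation — gains and limits are illustrative, not tuned for any real treadmill:

```python
def treadmill_step(user_pos, user_vel, belt_vel, dt, kp=4.0, kd=1.5):
    """One control tick: command belt velocity from user displacement (P)
    and user velocity (D), slew-limited so corrections stay comfortable."""
    target = kp * user_pos + kd * user_vel
    max_accel = 2.0  # m/s^2 cap on belt acceleration
    dv = max(min(target - belt_vel, max_accel * dt), -max_accel * dt)
    return belt_vel + dv

# simulate a user walking forward at 1.2 m/s for 5 s
pos, user_vel, belt = 0.0, 1.2, 0.0
dt = 0.01
for _ in range(500):
    belt = treadmill_step(pos, user_vel, belt, dt)
    pos += (user_vel - belt) * dt  # user drifts by gait minus belt speed
print(f"center error after 5 s: {abs(pos):.2f} m")
```

The simulation settles with the user well inside the <0.5 m center-error target; a real controller would add braking ramps and the E-stop path on top.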
phase 4 — group view walls (pilot)
- Mount light-field display panels; set the viewing zone.
- Sync engine cameras to the panels for spectators/users without headsets (100-view tech enables multi-viewer 3D).
phase 5 — content pipeline (ongoing)
- Capture worlds: shoot a space with a phone/drone; process into 3D Gaussian splats (minutes) and stream in-engine at 60–200 FPS.
- Add NPCs (LLM + behavior trees), physics, interactions; bake spatial audio paths.
- Build a library of “experiences” (scenes + haptics + atmosphere + locomotion scripts).
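For VRAM and streaming budgets: each Gaussian in the reference 3DGS implementation stores roughly 59 fp32 parameters (3 position, 3 scale, 4 rotation, 1 opacity, 48 degree-3 SH color coefficients — an assumption based on the paper’s defaults). A quick back-of-envelope helper:

```python
def splat_vram_mb(num_gaussians: int, floats_per_gaussian: int = 59) -> float:
    """Rough GPU memory footprint of a Gaussian-splat scene
    (fp32, raw parameters only, no renderer overhead)."""
    return num_gaussians * floats_per_gaussian * 4 / (1024 ** 2)

# room-scale captures often land in the low millions of splats
print(f"{splat_vram_mb(3_000_000):.0f} MB")
```

So a few-million-splat room fits comfortably on one GPU alongside the engine; multi-room streaming is where compression and level-of-detail start to matter.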
phase 6 — QA & comfort
- Iterate until: MTP ≤ 20 ms, PPD ≥ 35–60, RT60 ≲ 300 ms, and haptic patterns are vivid but within vendor exposure limits.
costs (very rough 2025 ranges)
- Starter (2 users, headsets only): $8k–$20k (HMDs, PC, audio, fans, build-out).
- With mid-air haptics & treadmill: $25k–$80k.
- With light-field wall + multi-user locomotion: $120k–$500k (panels + pro integration).
what to build first (minimal viable holosuite)
- Headsets + spatial audio + fans/heat (Phase 1).
- One Ultraleap haptics array (Phase 2).
- A safe omni-treadmill (Phase 3).
- Content via Gaussian splats (Phase 5).

This already feels shockingly close to a “holosuite.” Then add light-field walls later for the “take the helmet off” moment.
references worth skimming
- 3D Gaussian Splatting (paper + site + repo) for instant photoreal worlds.
- Ultraleap mid-air haptics docs (principles, SDK).
- Light-field displays (100-view, group 3D).
- Omnidirectional floor (HoloTile) demos as proof of feasibility.
- VR motion-to-photon latency methods & importance.
Next, three deliverables:
- Bill of Materials (BOM) checklist with categories and suggested components.
- Room wiring + layout diagram (text description).
- Commissioning script: step-by-step calibration & testing once everything is installed.
🛠️ 1. Bill of Materials (BOM)
A) Visuals
- Now:
  - 2–4 × high-end MR/VR headsets (eye tracking, 90–120 Hz) → e.g., Apple Vision Pro, Varjo XR-4, or Meta Quest Pro.
  - Tracking base stations (if not inside-out).
- Future upgrade:
  - 2–3 × light-field display panels (e.g., Looking Glass 65″ class or equivalent multi-view tiled systems).
B) Locomotion
- 1 × omnidirectional treadmill or equivalent (Virtuix Omni, Kat Walk, Infinadeck).
- Safety rail, harness, and E-stop.
- Future: Disney HoloTile-like omni-floor (when available).
C) Haptics & Atmosphere
- 1–2 × Ultraleap STRATOS Explore or equivalent mid-air ultrasound haptics dev kit.
- 2 × wearable haptic vests/gloves (bHaptics, TESLASUIT lite).
- 3 × floor fans (PWM controllable).
- 2 × IR heat lamps (dimmable).
- 1 × ultrasonic humidifier (for mist).
- 1 × scent diffusion kit (programmable cartridges).
D) Audio
- 8–12 × compact full-range speakers (arranged hemispherically).
- 2 × subwoofers (floor mounted).
- Audio interface with ambisonic decoding (Focusrite, RME).
E) Tracking & Compute
- 4 × ceiling-mounted depth cameras (Azure Kinect, Intel RealSense, or OptiTrack).
- IMUs in headsets and treadmill.
- Compute server:
  - 2 × Nvidia RTX 5090 (or latest).
  - 128 GB RAM.
  - PCIe Gen5 SSD (8 TB).
- UPS + surge protection.
F) Content Pipeline
- Unity/Unreal engine (OpenXR support).
- Gaussian Splatting toolkit (repo + CUDA build).
- NPC AI stack: LLM + behavior tree system.
G) Room & Safety
- 12–16 m² enclosed space.
- Acoustic wall panels (RT60 ≲ 300 ms).
- Cable trays, blackout curtains.
- HVAC upgrade (extra 3–5 kW cooling).
- Emergency lighting + E-stop buttons on walls and treadmill.
📐 2. Room Wiring & Layout (text diagram)
⚙️ 3. Commissioning Script
Step 1: Power & Safety
- Verify dedicated 20 A circuits for compute + treadmill.
- Connect all gear to UPS → check runtime ≈ 10–15 minutes.
- Test E-stop buttons → treadmill and power relays shut down.
Step 2: Tracking Calibration
- Mount ceiling depth cams → measure overlap fields.
- Run SLAM calibration → sync with the treadmill’s IMU.
- Align headset coordinate system to treadmill center (offset < 1 cm).
Step 3: Visuals
- Connect headsets → run OpenXR diagnostics.
- Measure motion-to-photon latency (goal ≤ 20 ms).
- Enable foveated rendering; confirm frame rates (90–120 Hz).
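A common way to measure MTP without special hardware is to film the headset display and a motion marker with a high-fps camera, then count the frames between the marker moving and the display reacting. The conversion is trivial but worth writing down:

```python
def mtp_latency_ms(frames_between: int, camera_fps: float) -> float:
    """Convert a high-speed-camera frame count (motion onset ->
    first display change) into motion-to-photon latency in ms."""
    return frames_between * 1000.0 / camera_fps

# e.g. 5 frames of delay filmed at 240 fps
print(f"{mtp_latency_ms(5, 240):.1f} ms")  # ~20.8 ms, just over the target
```

Note the camera frame time bounds your measurement resolution — at 240 fps you can only resolve latency in ~4.2 ms steps, so average several trials.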
Step 4: Audio
- Place an ambisonic mic at treadmill center.
- Run calibration sweep → tune EQ/delays.
- Verify localization error < 5°.
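To score the localization check, compute the angle between the intended source direction and the direction the listener points to (a small helper; how you capture the “perceived” vector — e.g., a controller pointing test — is up to you):

```python
import math

def localization_error_deg(intended, perceived):
    """Angle in degrees between intended and perceived source directions
    (any-length 3D vectors; only direction matters)."""
    dot = sum(a * b for a, b in zip(intended, perceived))
    na = math.sqrt(sum(a * a for a in intended))
    nb = math.sqrt(sum(b * b for b in perceived))
    cosang = max(-1.0, min(1.0, dot / (na * nb)))  # clamp for float safety
    return math.degrees(math.acos(cosang))

print(localization_error_deg((1, 0, 0), (0.99, 0.05, 0.0)))
```

Average this over a grid of source positions; the pass criterion is a mean error under 5°.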
Step 5: Haptics
- Position ultrasound arrays ~80 cm from the user.
- Run SDK focal-point test (buttons, sliders).
- Ensure skin pressure stays within vendor safety limits (on the order of 100–300 mN at the fingertips).
Step 6: Atmosphere
- Integrate fans/heat/scent into the engine.
- Run scripted cues (wind gust, campfire heat, rainforest humidity).
Step 7: Locomotion
- Enable treadmill → tune the control loop.
- Test walking speeds 0.5–2 m/s; verify braking < 300 ms.
- Confirm center error < 0.5 m over 5 minutes.
Step 8: Content Pipeline
- Capture a room via smartphone/drone → train a Gaussian Splat scene.
- Import into the engine; verify 60–200 FPS render.
- Spawn an AI NPC; confirm speech + behavior tree run in sync with the world.
Step 9: Multi-user Sync
- Add a 2nd headset; check frame sync < 10 ms drift.
- Validate audio/visual coherence for both users.
✅ Minimum Viable Holosuite (first build target)
- 1 treadmill + 2 headsets.
- Spatial audio + fans/heat.
- 1 ultrasound haptics array.
- Gaussian Splat scene running at ≥90 Hz.
This alone gives you the “wow” factor. Then add light-field walls + multi-user locomotion for full Trek-style immersion.
Holosuite Shopping List with Links (BOM + Purchases)
Components you can buy right now (or preorder), with some top picks:
- Meta Quest Pro VR headset — a strong mixed-reality/VR headset with passthrough and good optics. Great for initial visual immersion.
- Pimax Crystal Light VR headset — very high clarity (PPD); good if visual fidelity is a priority.
- KAT Walk C2+ VR treadmill — a quality omnidirectional treadmill for walking in VR with safety rails.
- KAT Walk C2 Core VR treadmill — a simpler, more compact system for locomotion.
And from the non-product pages:
- Looking Glass 16″ light-field display (~US$3,000) for group/shared 3D visuals without headsets.
- Looking Glass 27″ light-field display (pro version) for workplace/larger immersive display.
- Ultraleap STRATOS Explore development kit (mid-air haptics) for tactile feedback in the suite.
- Ultraleap hand tracking kit for Pico Neo 3 Pro (Stereo IR170) — for hand-tracking improvements.
📐 Room Wiring & Layout Diagram (Text + Suggested Layout)
Here’s a sketch / description you can convert into a diagram:
Electrical layout:
- Dedicated high-capacity outlets (20–30 A) for compute rack and treadmill.
- Power distribution with surge protection & UPS.
- Cable trays overhead for sensors & cables.
- Network switch in rack; wired connections for syncing/displays.
🧪 Commissioning / Calibration Checklist
Here’s what to test once everything is installed. Use this script to validate performance.
| Test | Goal / Metric | Pass / Fail Criteria |
|---|---|---|
| Power & Safety | E-stop works; UPS holds load; treadmill power on/off works cleanly | All emergency stops cut power; no voltage drop; UPS supports safe shutdown |
| Visual Latency | Motion-to-photon (head motion → display reaction) ≤ 20 ms | Measure with a latency test tool and high-fps camera; adjust frames & drivers |
| Resolution & Clarity | Realistic PPD (pixels per degree); sharp visuals; minimal aliasing | Headset displays and light-field panels look clean at typical viewing distance |
| Tracking Accuracy | Hand & head tracking drift < ~1 cm over 5 min; cameras aligned | Walk/gait test; reach test; calibrate sensors |
| Haptics | Ultrasound focal points feel like “touch” without pain; safe exposure | Run sample apps; measure pressure per vendor guidelines; ensure safety cutoffs |
| Locomotion | Treadmill responds to walking; smooth acceleration & deceleration; stop latency < ~300 ms | Walk in place; test emergency stop |
| Audio | Spatialization works; sound sources localize correctly; RT60 < 300 ms | Place sounds at known directions; test ambient rejection; run audio calibration |
| Environment Cues | Temperature/fans/scent sync with the scene; no distracting leaks/delays | Trigger cues & observe; adjust delays and fan speeds |
| Multi-User Sync | Two or more users see a consistent world; drift/sync error < ~10 ms | Run shared-scene tests; compare visuals/audio for both users |
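The quantitative rows of this checklist can be encoded so each commissioning run produces a machine-readable pass/fail report. A sketch — the metric names here are made up; the thresholds come from the table above:

```python
# metric name -> (comparison, threshold), per the commissioning table
TARGETS = {
    "mtp_ms":        ("<=", 20.0),   # motion-to-photon latency
    "rt60_ms":       ("<", 300.0),   # room reverberation time
    "loc_err_deg":   ("<", 5.0),     # audio localization error
    "stop_ms":       ("<", 300.0),   # treadmill stop latency
    "sync_drift_ms": ("<", 10.0),    # multi-user frame sync drift
}

def commissioning_report(measured: dict) -> dict:
    """Return pass/fail per metric for whatever was measured this run."""
    report = {}
    for name, value in measured.items():
        op, limit = TARGETS[name]
        report[name] = value <= limit if op == "<=" else value < limit
    return report

print(commissioning_report({"mtp_ms": 18.2, "rt60_ms": 310.0}))
# → {'mtp_ms': True, 'rt60_ms': False}
```

Logging these reports per session gives you a regression trail when a driver update or room change quietly pushes a metric back over its limit.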