Showing posts with label Step by step guide. Show all posts

Friday, September 12, 2025

Build a Holosuite Step by Step

 

what we’re building (reality check)

A real holosuite is a room-scale, multi-user, multi-sensory environment that blends:

  • visuals (near-eye now; light-field walls later),

  • natural locomotion (omni-floor/treadmill),

  • mid-air haptics + environmental cues (wind, heat, scent),

  • spatial audio,

  • ultra-low latency tracking & rendering,

  • a content pipeline (fast world capture + simulation).

Success hinges on motion-to-photon latency (under ~20 ms), correct depth cues, and convincing multisensory alignment.


bill of materials (bom)

A) visuals

Path A (now): 2–4 high-end MR/VR headsets (eye-tracked, 90–120 Hz).
Path B (upgrade): tileable light-field displays (group-view 3D; up to ~100 views) for “helmet-off” shared scenes; keep headsets for close work.

B) locomotion

  • Omnidirectional floor/treadmill. Disney’s HoloTile proves multi-user omni-floor feasibility (research/demonstrator). Commercial alternatives: curved-shoe or ring-rail omni treadmills.

C) haptics & atmosphere

  • Mid-air ultrasound haptics module(s) for touchable buttons/surfaces in free air (Ultraleap dev units).

  • Optional acoustic “hologram” array for levitation/advanced effects (lab-grade research kit only).

  • Wearable vibro/force bands; fans, heaters, cool mist; scent micro-emitters.

D) spatial audio

  • Beamformed speaker array or high-order ambisonics with individualized HRTFs; sync to head/eye pose. (Vision-class spatial audio is a good reference target.)

E) tracking & compute

  • Ceiling/floor depth cams + IMUs (OpenXR/SteamVR tracking), fast steering if using projection.

  • Compute: 1–2 multi-GPU boxes (per 2 users) to keep motion-to-photon under ~20 ms.

F) content pipeline

  • 3D Gaussian Splatting toolchain (phone/drone capture → splat training → real-time render). Use the paper + official repo.

  • Game engine: Unity/Unreal + OpenXR.

  • AI agents (NPCs), physics, interaction system.

G) room shell & safety

  • Acoustic treatment (RT60 ≲ 300 ms), blackout, HVAC, UPS, E-stop buttons, soft walls/rails near locomotion.

  • Clear exposure limits for ultrasound, safe projector/laser classes, and scent allergen policy. (Follow device safety docs.)
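The RT60 target above can be sanity-checked before buying panels with Sabine's formula, RT60 ≈ 0.161 · V / A. The room dimensions and absorption coefficients below are illustrative placeholders, not measurements:

```python
# Sabine's approximation for reverberation time:
#   RT60 ≈ 0.161 * V / A
# where V is room volume (m^3) and A is total absorption,
# summed over surfaces as area * absorption coefficient.

def rt60_sabine(volume_m3, surface_absorptions):
    """surface_absorptions: iterable of (area_m2, absorption_coefficient)."""
    a_total = sum(area * coeff for area, coeff in surface_absorptions)
    return 0.161 * volume_m3 / a_total

# Illustrative 4 m x 4 m x 2.7 m room (43.2 m^3):
surfaces = [
    (43.2, 0.6),  # four walls (4 x 4 m x 2.7 m), absorptive panels
    (16.0, 0.7),  # ceiling, acoustic tiles
    (16.0, 0.1),  # floor, mostly reflective
]
rt60 = rt60_sabine(43.2, surfaces)
print(f"RT60 ≈ {rt60 * 1000:.0f} ms")  # → RT60 ≈ 180 ms, under the 300 ms target
```

If the estimate comes out above 300 ms, add absorptive area (or raise coefficients) until it fits.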


floor plan (8–16 m²)

  • Front wall: (future) light-field tiles; behind-wall cable trough.

  • Ceiling: depth cams, audio array, light bar; vented plenum.

  • Center: omni-floor/treadmill with safety rail & E-stop.

  • Perimeter: ultrasound haptics towers at ~waist/shoulder height.

  • Rack: compute/UPS/network + scent/wind modules; separate “quiet corner” for compressors.


core equations (targets & tuning)

  • motion-to-photon latency
    t_total = t_tracking + t_render + t_scanout + t_display < 20 ms for comfort.

  • light-field sampling (group 3D walls): aim for ≥60 pixels/deg and dozens of views to reduce aliasing & VAC (vergence–accommodation conflict); commercial panels offer up to ~100 views.

  • mid-air ultrasound pressure (qualitative): acoustic radiation force F ∝ 2αI/c, where α is the absorption coefficient, I the acoustic intensity, and c the speed of sound. Use vendor tools to stay within skin-safe limits.
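As a quick sketch, the latency budget above can be tallied per stage; the stage timings here are illustrative placeholders, not measurements:

```python
# Tally the motion-to-photon pipeline from the latency equation above.
# Stage timings are illustrative placeholders, not measurements.

def motion_to_photon_ms(tracking, render, scanout, display):
    """t_total = t_tracking + t_render + t_scanout + t_display (all in ms)."""
    return tracking + render + scanout + display

def within_comfort(total_ms, budget_ms=20.0):
    """True if the summed pipeline fits the comfort budget."""
    return total_ms < budget_ms

# Example: 120 Hz panel (≈8.3 ms scanout), fast inside-out tracking.
total = motion_to_photon_ms(tracking=2.0, render=6.0, scanout=8.3, display=2.0)
print(f"{total:.1f} ms, comfortable: {within_comfort(total)}")  # → 18.3 ms, comfortable: True
```

The point of writing it out: scanout alone eats close to half the budget at 120 Hz, so tracking and render have to stay in the single digits.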


step-by-step build (phased)

phase 0 — room prep (1–2 weeks)

  1. Acoustic panels (walls/ceiling), blackout curtains, dedicated circuits, 20A outlets.

  2. Network (Cat6/6a), ceiling mounts for sensors, cable trays.

  3. Install UPS + surge, set E-stop box reachable from treadmill center.

phase 1 — core immersion (2–4 weeks)

  1. Headsets + base stations/inside-out tracking; calibrate guardian/chaperone.

  2. Compute: set up OpenXR; measure motion-to-photon (MTP) latency with test tools; tune to ≤ 20 ms.

  3. Spatial audio: set up speakers/ambisonics; load personalized HRTFs where possible.

  4. Environmental cues: add fans/heat; tie to engine events (e.g., wind zones).
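A minimal sketch of step 4, tying engine wind zones to fan speed. The "FAN <id> <duty>" command format and the 0–255 duty range are assumptions about a hypothetical fan controller; the reusable part is the clamp-and-deadzone logic:

```python
# Map an engine wind-zone strength (0..1) to a fan PWM duty cycle.
# The "FAN <id> <duty>" command format and 0-255 duty range are
# assumptions about a hypothetical fan controller.

def wind_to_pwm(strength, min_duty=0, max_duty=255, deadzone=0.05):
    """Clamp to [0, 1]; ignore tiny values so fans don't hunt near zero."""
    s = max(0.0, min(1.0, strength))
    if s < deadzone:
        return min_duty
    return int(min_duty + s * (max_duty - min_duty))

def frame_commands(zone_strengths):
    """One command per fan for this engine tick, keyed by fan id."""
    return [f"FAN {fan_id} {wind_to_pwm(s)}"
            for fan_id, s in sorted(zone_strengths.items())]

# In the rig these would go to the controller over serial; here we print.
print(frame_commands({0: 0.0, 1: 0.75, 2: 1.2}))
# → ['FAN 0 0', 'FAN 1 191', 'FAN 2 255']
```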

phase 2 — mid-air haptics (2–3 weeks)

  1. Install Ultraleap haptics unit(s) at waist height; USB/serial to host.

  2. Calibrate focal points to tracked hands; render “virtual buttons,” surfaces, and textures; gate intensity with the safety API.
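One way to sketch the intensity gating in step 2. This is not the Ultraleap safety API; it is a generic, hypothetical gate showing the idea of a hard per-command cap plus a rolling exposure budget:

```python
# Hypothetical intensity gate for mid-air haptics. NOT the Ultraleap
# safety API: it only illustrates a hard per-command cap plus a rolling
# exposure budget that scales output back when the recent average runs hot.

from collections import deque

class HapticGate:
    def __init__(self, max_intensity=0.8, avg_budget=0.5, window=100):
        self.max_intensity = max_intensity  # hard cap per command (0..1)
        self.avg_budget = avg_budget        # allowed mean over the window
        self.recent = deque(maxlen=window)

    def gate(self, requested):
        level = max(0.0, min(requested, self.max_intensity))
        self.recent.append(level)
        avg = sum(self.recent) / len(self.recent)
        if avg > self.avg_budget:
            level *= self.avg_budget / avg  # scale back toward the budget
        return level

gate = HapticGate()
print(f"{gate.gate(1.5):.2f}")  # → 0.50 (capped at 0.8, then scaled to the budget)
```

Whatever the real SDK exposes, keep the final clamp in your own code path so a misbehaving content script cannot exceed vendor limits.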

phase 3 — locomotion (4–8 weeks)

  1. Install omni-treadmill (or preorder omni-floor when available).

  2. Integrate control: keep the user near room center while world movement matches gait; verify stops complete within 300 ms and add a fall-prevention rail. (HoloTile demos show multi-user feasibility.)
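The centering control in step 2 can be sketched as a simple proportional loop over a point-mass model; the gain and simulation constants are illustrative, not tuned for any real treadmill:

```python
# Point-mass sketch of the centering loop: the belt matches gait speed
# plus a proportional pull toward room center, so the user's offset
# decays exponentially. Gains and constants are illustrative.

def belt_speed(offset_m, user_speed_ms, kp=2.0):
    """Match the user's walking speed, plus a pull back to center."""
    return user_speed_ms + kp * offset_m

def simulate(steps=200, dt=0.01, user_speed=1.5, offset=0.4):
    """Offset changes by the difference between walking and belt speed."""
    for _ in range(steps):
        offset += (user_speed - belt_speed(offset, user_speed)) * dt
    return offset

final = simulate()
print(f"offset after 2 s: {final:.3f} m")  # → offset after 2 s: 0.007 m
```

A real controller also needs acceleration limits and a derivative term so corrections stay below the rider's perception threshold.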

phase 4 — group view walls (pilot)

  1. Mount light-field display panels; set viewing zone.

  2. Sync engine cameras to the panels for spectators/users without headsets (100-view tech enables multi-viewer 3D).

phase 5 — content pipeline (ongoing)

  1. Capture worlds: shoot a space with a phone/drone; process into 3D Gaussian splats (minutes) and stream in-engine at 60–200 FPS.

  2. Add NPCs (LLM + behavior trees), physics, interactions; bake spatial audio paths.

  3. Build a library of “experiences” (scenes + haptics + atmosphere + locomotion scripts).
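The capture step above can be sketched as a small driver that builds the two pipeline commands. Script names follow the official gaussian-splatting repository (convert.py runs COLMAP, train.py optimizes the splats); verify the flags against the repo's README before running:

```python
# Build the two pipeline steps as argv lists; pass each to
# subprocess.run(). Script names follow the official gaussian-splatting
# repo (convert.py, train.py); flags should be checked against its README.

from pathlib import Path

def splat_commands(capture_dir, iterations=30_000):
    src = str(Path(capture_dir))
    return [
        ["python", "convert.py", "-s", src],  # structure-from-motion via COLMAP
        ["python", "train.py", "-s", src, "--iterations", str(iterations)],
    ]

for cmd in splat_commands("captures/office_scan"):
    print(" ".join(cmd))
```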

phase 6 — QA & comfort

  • Iterate until: MTP ≤ 20 ms, PPD ≥ 35 (60 is the stretch target), RT60 ≲ 300 ms, and haptic patterns are vivid but within vendor exposure limits.
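The comfort gates can be encoded as a simple go/no-go check over measured values; the thresholds mirror the targets listed above:

```python
# Encode the comfort gates as a go/no-go check over measured values.
# Thresholds mirror the targets in this section.

TARGETS = {
    "mtp_ms":  lambda v: v <= 20.0,   # motion-to-photon latency
    "ppd":     lambda v: v >= 35.0,   # pixels per degree (60 is the stretch goal)
    "rt60_ms": lambda v: v <= 300.0,  # reverberation time
}

def qa_report(measurements):
    """Map each metric name to pass/fail against its target."""
    return {name: check(measurements[name]) for name, check in TARGETS.items()}

print(qa_report({"mtp_ms": 18.3, "ppd": 38.0, "rt60_ms": 280.0}))
# → {'mtp_ms': True, 'ppd': True, 'rt60_ms': True}
```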


costs (very rough 2025 ranges)

  • Starter (2 users, headsets only): $8k–$20k (HMDs, PC, audio, fans, build-out).

  • With mid-air haptics & treadmill: $25k–$80k.

  • With light-field wall + multi-user locomotion: $120k–$500k (panels + pro integration).


what to build first (minimal viable holosuite)

  • Headsets + spatial audio + fans/heat (Phase 1)

  • One Ultraleap haptics array (Phase 2)

  • A safe omni-treadmill (Phase 3)

  • Content via Gaussian splats (Phase 5).

This already feels shockingly close to a “holosuite.” Then add light-field walls later for the “take the helmet off” moment.


detailed build documents (what follows)

  1. Bill of Materials (BOM) checklist with categories and suggested components.

  2. Room wiring + layout diagram (text description).

  3. Commissioning script: step-by-step calibration & testing once everything is installed.


🛠️ 1. Bill of Materials (BOM)

A) Visuals

  • Now:

    • 2–4 × High-end MR/VR headsets (eye tracking, 90–120 Hz) → e.g., Apple Vision Pro, Varjo XR-4, or Meta Quest Pro.

    • Tracking base stations (if not inside-out).

  • Future upgrade:

    • 2–3 × Light-field display panels (e.g., Looking Glass 65" class or equivalent multi-view tiled systems).

B) Locomotion

  • 1 × Omnidirectional treadmill or equivalent (Virtuix Omni, Kat Walk, Infinadeck).

  • Safety rail, harness, and E-stop.

  • Future: Disney HoloTile-like omni-floor (when available).

C) Haptics & Atmosphere

  • 1–2 × Ultraleap STRATOS Explore or equivalent mid-air ultrasound haptics dev kit.

  • 2 × Wearable haptic vests/gloves (bHaptics, TESLASUIT lite).

  • 3 × Floor fans (PWM controllable).

  • 2 × IR heat lamps (dimmable).

  • 1 × Ultrasonic humidifier (for mist).

  • 1 × Scent diffusion kit (programmable cartridges).

D) Audio

  • 8–12 × Compact full-range speakers (arranged hemispherically).

  • 2 × Subwoofers (floor mounted).

  • Audio interface with ambisonic decoding (Focusrite, RME).

E) Tracking & Compute

  • 4 × Ceiling-mounted depth cameras (Azure Kinect, Intel RealSense, or OptiTrack).

  • IMUs in headsets and treadmill.

  • Compute server:

    • 2 × Nvidia RTX 5090 (or latest).

    • 128 GB RAM.

    • PCIe Gen5 SSD (8 TB).

    • UPS + surge protection.

F) Content Pipeline

  • Unity/Unreal engine (OpenXR support).

  • Gaussian Splatting toolkit (repo + CUDA build).

  • NPC AI stack: LLM + behavior tree system.

G) Room & Safety

  • 12–16 m² enclosed space.

  • Acoustic wall panels (RT60 ≲ 300 ms).

  • Cable trays, blackout curtains.

  • HVAC upgrade (extra 3–5 kW cooling).

  • Emergency lighting + E-stop buttons on walls and treadmill.


📐 2. Room Wiring & Layout (text diagram)

[Front Wall]
 ├── Light-field panel array (future)
 ├── Speaker cluster (L, R, Center, Height)
 └── Cable trough behind wall

[Ceiling]
 ├── 4 × depth cameras (corners, angled inward)
 ├── 6 × compact speakers (spatial array)
 ├── Light bar + IR markers
 └── Ventilation plenum

[Center]
 ├── Omni-treadmill (safety rail + harness)
 ├── Haptics towers (ultrasound arrays waist-high)
 └── Overhead harness anchor (optional)

[Perimeter]
 ├── Fans & IR lamps (N, S, E, W sides)
 ├── Mist + scent emitters near floor vents
 └── Subwoofers in front/rear corners

[Rack Corner]
 ├── Compute server + UPS
 ├── Audio interface & amplifiers
 └── Network switch + cooling

⚙️ 3. Commissioning Script

Step 1: Power & Safety

  • Verify dedicated 20A circuits for compute + treadmill.

  • Connect all gear to UPS → check runtime ≈ 10–15 minutes.

  • Test E-stop buttons → treadmill, power relays shut down.

Step 2: Tracking Calibration

  • Mount ceiling depth cams → measure overlap fields.

  • Run SLAM calibration → sync with treadmill’s IMU.

  • Align headset coordinate system to treadmill center (offset < 1 cm).
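A sketch of the final alignment check: the headset's estimate of the treadmill-center origin should land within 1 cm of the treadmill's own origin. The coordinates below are illustrative:

```python
# After calibration, the headset's estimate of the treadmill-center
# origin should land within 1 cm of the treadmill's own origin.
# Coordinates (meters) are illustrative.

import math

def offset_cm(headset_origin, treadmill_origin):
    """Euclidean distance between the two origin estimates, in cm."""
    return 100 * math.dist(headset_origin, treadmill_origin)

off = offset_cm((0.004, 0.002, -0.003), (0.0, 0.0, 0.0))
print(f"origin offset: {off:.2f} cm, pass: {off < 1.0}")  # → 0.54 cm, pass: True
```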

Step 3: Visuals

  • Connect headsets → run OpenXR diagnostics.

  • Measure motion-to-photon latency (goal ≤ 20 ms).

  • Enable foveated rendering; confirm frame rates (90–120 Hz).

Step 4: Audio

  • Place ambisonic mic at treadmill center.

  • Run calibration sweep → tune EQ/delays.

  • Verify localization error < 5°.
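The < 5° gate can be computed as the angle between the true source direction and the direction the listener reports; the vectors below are synthetic:

```python
# Localization error as the angle between the true source direction
# and the direction the listener points to. Vectors are synthetic.

import math

def _norm(v):
    return math.sqrt(sum(x * x for x in v))

def angular_error_deg(true_dir, reported_dir):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(true_dir, reported_dir))
    cosang = max(-1.0, min(1.0, dot / (_norm(true_dir) * _norm(reported_dir))))
    return math.degrees(math.acos(cosang))

err = angular_error_deg((1.0, 0.0, 0.0), (0.9986, 0.0523, 0.0))  # ≈3° apart
print(f"localization error: {err:.1f}°, pass: {err < 5.0}")
```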

Step 5: Haptics

  • Position ultrasound arrays ~80 cm from user.

  • Run SDK focal point test (buttons, sliders).

  • Ensure skin pressure stays within vendor safety limits (on the order of 100–300 mN at the fingertip).

Step 6: Atmosphere

  • Integrate fans/heat/scent into engine.

  • Run scripted cues (wind gust, campfire heat, rainforest humidity).

Step 7: Locomotion

  • Enable treadmill → tune control loop.

  • Test walking speeds 0.5–2 m/s; verify braking < 300 ms.

  • Confirm center error < 0.5 m over 5 minutes.
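The center-error gate can be checked against a logged position trace; the sample log below is synthetic:

```python
# Check the center-error gate against a logged position trace.
# The (x, y) samples (meters from room center) are synthetic.

import math

def max_center_error(positions, center=(0.0, 0.0)):
    """Worst distance from room center over the logged samples."""
    return max(math.dist(p, center) for p in positions)

log = [(0.1, 0.0), (0.2, -0.1), (-0.3, 0.2), (0.05, 0.4)]
worst = max_center_error(log)
print(f"worst offset: {worst:.2f} m, pass: {worst < 0.5}")  # → 0.40 m, pass: True
```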

Step 8: Content Pipeline

  • Capture a room via smartphone or drone → train a Gaussian Splat scene.

  • Import into engine; verify 60–200 FPS render.

  • Spawn AI NPC; confirm speech + behavior tree runs in sync with world.

Step 9: Multi-user Sync

  • Add 2nd headset; check frame sync < 10 ms drift.

  • Validate audio/visual coherence for both users.
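Frame-sync drift between the two headsets can be gated on the worst per-frame timestamp gap; the timestamps below are synthetic milliseconds:

```python
# Gate frame sync on the worst per-frame presentation-timestamp gap
# between two headsets. Timestamps (ms) are synthetic.

def worst_drift_ms(ts_a, ts_b):
    """Largest absolute timestamp difference across paired frames."""
    return max(abs(a - b) for a, b in zip(ts_a, ts_b))

a = [0.0, 11.1, 22.2, 33.3]   # headset A frame times
b = [0.4, 11.0, 25.0, 33.6]   # headset B frame times
drift = worst_drift_ms(a, b)
print(f"worst drift: {drift:.1f} ms, pass: {drift < 10.0}")  # → 2.8 ms, pass: True
```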


✅ Minimum Viable Holosuite (first build target)

  • 1 treadmill + 2 headsets.

  • Spatial audio + fans/heat.

  • 1 ultrasound haptics array.

  • Gaussian Splat scene running at ≥90 Hz.

This alone gives you the “wow” factor. Then add light-field walls + multi-user locomotion for full Trek-style immersion.


🛒 Holosuite Shopping List (BOM + Purchases)

Here are components you can buy right now (or preorder) to build the holosuite:

  • Meta Quest Pro VR Headset (balanced all-rounder): $849.00, Walmart + others

  • Pimax Crystal Light VR Headset (retina clarity): $599.00, Pimax store

  • KAT Walk C2+ VR Treadmill (locomotion floor): $1,499.00, KATVR + others

  • Fit Immersion PRO-2X VR Headset (enterprise spec): $650.00, North Pole Engineering

  • KAT WALK C2 Core Treadmill (budget omni-treadmill): $999.00, Knoxlabs

  • Virtuix Omni VR Treadmill Full Set (omni full system): $1,850.00, eBay

  • HTC Vive Flow Headset (compact form factor): $399.95, B&H Used Store

  • Pimax Crystal Super VR Headset (super high PPD): $799.00, Pimax store

Here are some top picks:

  • Meta Quest Pro VR Headset — one strong mixed-reality/VR headset with passthrough and good optics. Great for initial visual immersion.

  • Pimax Crystal Light VR Headset — very high clarity (PPD) headset; good if visual fidelity is a priority.

  • KAT Walk C2+ VR Treadmill — a quality omni-directional treadmill for walking in VR with safety rails.

  • KAT WALK C2 Core VR Treadmill — simpler / more compact system for locomotion.

And from the non-product pages:

  • Light-Field Display, 16-inch by Looking Glass (~US$3,000) for group/shared 3D visuals without headsets.

  • Light Field Display, 27-inch, pro version for workplace / larger immersive display.

  • Ultraleap STRATOS Explore Development Kit (mid-air haptics) to get tactile feedback in the suite.

  • Ultraleap Hand Tracking Kit for Pico Neo 3 Pro (Stereo IR170) — for hand tracking improvements.


📐 Room Wiring & Layout Diagram (Text + Suggested Layout)

Here’s a sketch / description you can convert into a diagram:

┌───────────────────────────────┐
│     Light-Field Panel(s)      │  ← Front Wall (if used): speakers + display array
└───────────────────────────────┘

Ceiling: depth cameras (4 corners), ceiling speakers, ventilation

┌────────── Treadmill ──────────┐
│          Center Zone          │
│   Omni-treadmill + safety     │
│   harness / rail / E-stop     │
└───────────────────────────────┘

Walls: haptics towers at waist/shoulder height along the sides
Back wall or side: compute rack (GPUs, UPS), audio interface, cabling
Floor: fans/heat lamps, scent emitters under vents
Perimeter: speakers, subwoofers in corners

Electrical layout:

  • Dedicated high-capacity outlets (20-30 Amp) for compute rack and treadmill.

  • Power distribution with surge protection & UPS.

  • Cable trays overhead for sensors & cables.

  • Network switch in rack; wired connections for syncing/displays.


🧪 Commissioning / Calibration Checklist

Here’s what to test once everything is installed. Use this script to validate performance.

| Test | Goal / Metric | Pass / Fail Criteria |
| --- | --- | --- |
| Power & Safety | E-stop works; UPS holds load; treadmill power cycles cleanly | All emergency stops cut power; no voltage drop; UPS supports safe shutdown |
| Visual Latency | Motion-to-photon (head motion → display reaction) ≤ 20 ms | Measure with a latency test tool and a high-FPS camera; adjust frames & drivers |
| Resolution & Clarity | Realistic PPD (pixels per degree); sharp visuals; minimal aliasing | Headset displays and light-field panels look clean at typical viewing distance |
| Tracking Accuracy | Hand & head tracking drift < ~1 cm over 5 min; cameras aligned | Walk/gait test; reach test; calibrate sensors |
| Haptics | Ultrasound focal points feel like touch without pain; safe exposure | Run sample apps; measure pressure per vendor guidelines; ensure safety cutoffs |
| Locomotion | Treadmill responds to walking; smooth acceleration & deceleration; stop latency < ~300 ms | Walk in place; test emergency stop |
| Audio | Spatialization works; sound sources localize correctly; RT60 < 300 ms | Place sounds at known directions; test ambient rejection; run audio calibration |
| Environment Cues | Temperature/fans/scent sync with scene; no distracting leaks or delays | Trigger cues and observe; adjust delays and fan speeds |
| Multi-User Sync | Two or more users see a consistent world; drift/sync error < ~10 ms | Run shared-scene tests; compare visuals/audio from both users |