Friday, September 12, 2025

Build a Holosuite Step by Step

 

what we’re building (reality check)

A real holosuite is a room-scale, multi-user, multi-sensory environment that blends:

  • visuals (near-eye now; light-field walls later),

  • natural locomotion (omni-floor/treadmill),

  • mid-air haptics + environmental cues (wind, heat, scent),

  • spatial audio,

  • ultra-low latency tracking & rendering,

  • a content pipeline (fast world capture + simulation).

Success hinges on latency (<~20 ms motion-to-photon), correct depth cues, and convincing multisensory alignment. (PMC)


bill of materials (bom)

A) visuals

Path A (now): 2–4 high-end MR/VR headsets (eye-tracked, 90–120 Hz).
Path B (upgrade): tileable light-field displays (group-view 3D; up to ~100 views) for “helmet-off” shared scenes; keep headsets for close work. (Looking Glass)

B) locomotion

  • Omnidirectional floor/treadmill. Disney’s HoloTile proves multi-user omni-floor feasibility (research/demonstrator). Commercial alternatives: curved-shoe or ring-rail omni treadmills. (YouTube)

C) haptics & atmosphere

  • Mid-air ultrasound haptics module(s) for touchable buttons/surfaces in free air (Ultraleap dev units). (Ultraleap docs)

  • Optional acoustic “hologram” array for levitation/advanced effects (lab-grade research kit only). (Nature)

  • Wearable vibro/force bands; fans, heaters, cool mist; scent micro-emitters.

D) spatial audio

  • Beamformed speaker array or high-order ambisonics with individualized HRTFs; sync to head/eye pose. (Vision-class spatial audio is a good reference target.) (YouTube)

E) tracking & compute

  • Ceiling/floor depth cams + IMUs (OpenXR/SteamVR tracking), fast steering if using projection.

  • Compute: 1–2 multi-GPU boxes (per 2 users) to keep motion-to-photon under ~20 ms. (PMC)

F) content pipeline

  • 3D Gaussian Splatting toolchain (phone/drone capture → splat training → real-time render). Use the paper + official repo. (arXiv; ACM Digital Library; GitHub)

  • Game engine: Unity/Unreal + OpenXR.

  • AI agents (NPCs), physics, interaction system.

G) room shell & safety

  • Acoustic treatment (RT60 ≲ 300 ms), blackout, HVAC, UPS, E-stop buttons, soft walls/rails near locomotion.

  • Clear exposure limits for ultrasound, safe projector/laser classes, and a scent allergen policy. (Follow device safety docs.) (Ultraleap docs)


floor plan (8–16 m²)

  • Front wall: (future) light-field tiles; behind-wall cable trough.

  • Ceiling: depth cams, audio array, light bar; vented plenum.

  • Center: omni-floor/treadmill with safety rail & E-stop.

  • Perimeter: ultrasound haptics towers at ~waist/shoulder height.

  • Rack: compute/UPS/network + scent/wind modules; separate “quiet corner” for compressors.


core equations (targets & tuning)

  • motion-to-photon latency
    $t_{\rm total}=t_{\rm tracking}+t_{\rm render}+t_{\rm scanout}+t_{\rm display}<20\ \text{ms}$ for comfort (PMC); a quick-check sketch follows this list.

  • light-field sampling (group 3D walls): aim ≥60 pixels/deg and dozens of views to reduce aliasing & VAC; commercial panels offer up to ~100 views. (Looking Glass)

  • mid-air ultrasound pressure (qualitative): radiation pressure $F \propto \frac{2\alpha I}{c}$. Use vendor tools to keep within skin-safe limits. (Ultraleap docs)
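
To sanity-check the first two targets, here is a minimal Python sketch; the plugged-in numbers are illustrative placeholders, not measurements. It sums a latency budget against the 20 ms ceiling and estimates pixels per degree from panel resolution and FOV.

```python
# Quick feasibility checks for the latency and sampling targets above.
# All numbers are illustrative; substitute your own measurements.

def motion_to_photon_ms(t_tracking, t_render, t_scanout, t_display):
    """Sum the pipeline stages (all in milliseconds)."""
    return t_tracking + t_render + t_scanout + t_display

def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Average angular resolution across the horizontal FOV."""
    return horizontal_pixels / horizontal_fov_deg

mtp = motion_to_photon_ms(t_tracking=2.0, t_render=8.0, t_scanout=4.0, t_display=3.0)
print(f"motion-to-photon: {mtp:.1f} ms ({'OK' if mtp < 20 else 'over budget'})")

ppd = pixels_per_degree(horizontal_pixels=3840, horizontal_fov_deg=100)
print(f"PPD: {ppd:.1f} ({'meets' if ppd >= 60 else 'below'} the 60 PPD wall target)")
```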


step-by-step build (phased)

phase 0 — room prep (1–2 weeks)

  1. Acoustic panels (walls/ceiling), blackout curtains, dedicated circuits, 20A outlets.

  2. Network (Cat6/6a), ceiling mounts for sensors, cable trays.

  3. Install UPS + surge, set E-stop box reachable from treadmill center.

phase 1 — core immersion (2–4 weeks)

  1. Headsets + base stations/inside-out tracking; calibrate guardian/chaperone.

  2. Compute: set up OpenXR; measure MTP latency with test tools; tune to ≤20 ms. (PMC)

  3. Spatial audio: set up speakers/ambisonics; load personalized HRTFs where possible. (YouTube)

  4. Environmental cues: add fans/heat; tie them to engine events (e.g., wind zones); a minimal event-to-fan bridge sketch follows.
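
For step 4, a sketch of the event-to-hardware bridge, assuming a microcontroller-driven fan on a serial port. The `W:<duty>` command and the port name are made-up placeholders for whatever your fan-controller firmware actually speaks; only pyserial is real here.

```python
# Bridge engine "wind zone" events to a PWM fan over serial (pyserial).
# The W:<duty> protocol and /dev/ttyUSB0 port are hypothetical; adapt to
# your own controller firmware.
import serial

class WindController:
    def __init__(self, port="/dev/ttyUSB0", baud=115200):
        self.link = serial.Serial(port, baud, timeout=1)

    def set_wind(self, strength: float):
        """strength in [0, 1], mapped to an 8-bit PWM duty cycle."""
        duty = max(0, min(255, int(strength * 255)))
        self.link.write(f"W:{duty}\n".encode())

wind = WindController()
wind.set_wind(0.6)  # called from the engine when a wind zone triggers
```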

phase 2 — mid-air haptics (2–3 weeks)

  1. Install Ultraleap haptics unit(s) at waist height; USB/serial to host.

  2. Calibrate focal points to tracked hands; render “virtual buttons,” surfaces, and textures; gate intensity with the safety API (see the gating sketch below). (Ultraleap docs)
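
The safety-gating idea from step 2, as a sketch. Ultraleap’s actual SDK has its own classes and calls; `FocalPoint` here is a hypothetical stand-in, and only the clamp-before-device logic is the point.

```python
# Clamp requested haptic intensity to a safety ceiling before any focal
# point reaches the device. FocalPoint is a hypothetical SDK stand-in.
from dataclasses import dataclass

SAFE_MAX_INTENSITY = 0.7  # fraction of device max; set from vendor exposure docs

@dataclass
class FocalPoint:
    x: float; y: float; z: float  # meters, device frame
    intensity: float              # 0..1, requested by the interaction layer

def gate(point: FocalPoint) -> FocalPoint:
    """Cap the commanded intensity at the configured safety ceiling."""
    point.intensity = min(point.intensity, SAFE_MAX_INTENSITY)
    return point

pressed_button = gate(FocalPoint(x=0.0, y=0.2, z=0.25, intensity=1.0))
print(pressed_button.intensity)  # 0.7: full-strength request, capped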

phase 3 — locomotion (4–8 weeks)

  1. Install omni-treadmill (or preorder omni-floor when available).

  2. Integrate control: keep the user near room center while world movement matches gait; verify stopping within <300 ms (test sketch below) and add a fall-prevention rail. (HoloTile demos show multi-user feasibility.)
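
A sketch of the stop-latency test from step 2. Treadmill I/O is vendor-specific, so `send_stop` and `read_belt_speed` are hypothetical callables you would wire to your hardware’s API.

```python
# Measure time from stop command to belt speed below threshold.
# send_stop / read_belt_speed are placeholders for vendor-specific I/O.
import time

STOP_THRESHOLD = 0.05  # m/s; "stopped" for the purposes of the test

def measure_stop_latency(send_stop, read_belt_speed, timeout=1.0):
    """Return seconds from the stop command until the belt is stopped."""
    t0 = time.monotonic()
    send_stop()
    while time.monotonic() - t0 < timeout:
        if read_belt_speed() < STOP_THRESHOLD:
            return time.monotonic() - t0
    raise RuntimeError("belt did not stop within timeout: fail the test")

# Pass criterion from step 2: measured latency < 0.3 s.
```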

phase 4 — group view walls (pilot)

  1. Mount light-field display panels; set viewing zone.

  2. Sync engine cameras to the panels for spectators/users without headsets (100-view tech enables multi-viewer 3D). (Looking Glass)

phase 5 — content pipeline (ongoing)

  1. Capture worlds: shoot a space with a phone/drone; process into 3D Gaussian splats (minutes) and stream in-engine at 60–200 FPS (pipeline sketch after this list). (arXiv)

  2. Add NPCs (LLM + behavior trees), physics, interactions; bake spatial audio paths.

  3. Build a library of “experiences” (scenes + haptics + atmosphere + locomotion scripts).
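
A batch sketch of the capture step, assuming the official graphdeco-inria/gaussian-splatting repo’s documented entry points (`convert.py` for COLMAP preprocessing, `train.py` for optimization); verify the flags against the repo README before relying on them.

```python
# Capture folder of phone/drone photos -> trained Gaussian splat scene,
# assuming the official repo layout; flags may differ between versions.
import subprocess
from pathlib import Path

REPO = Path("~/src/gaussian-splatting").expanduser()       # your clone
CAPTURE = Path("~/captures/holosuite_scene").expanduser()  # input photos

def run(script: str, *args: str):
    subprocess.run(["python", str(REPO / script), *args], check=True)

run("convert.py", "-s", str(CAPTURE))                      # COLMAP poses
run("train.py", "-s", str(CAPTURE), "-m", str(CAPTURE / "model"))
# The trained model directory is what the engine-side viewer loads.
```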

phase 6 — QA & comfort

  • Iterate until: MTP ≤ 20 ms, PPD ≥ 35–60, RT60 ≲ 300 ms, haptic patterns are vivid but within vendor exposure limits. (PMC)


costs (very rough 2025 ranges)

  • Starter (2 users, headsets only): $8k–$20k (HMDs, PC, audio, fans, build-out).

  • With mid-air haptics & treadmill: $25k–$80k.

  • With light-field wall + multi-user locomotion: $120k–$500k (panels + pro integration).


what to build first (minimal viable holosuite)

  • Headsets + spatial audio + fans/heat (Phase 1)

  • One Ultraleap haptics array (Phase 2). (Ultraleap docs)

  • A safe omni-treadmill (Phase 3)

  • Content via Gaussian splats (Phase 5). (arXiv)

This already feels shockingly close to a “holosuite.” Then add light-field walls later for the “take the helmet off” moment. (Looking Glass)


what follows

Three build artifacts, detailed in the sections below:

  1. Bill of Materials (BOM) checklist with categories and suggested components.

  2. Room wiring + layout diagram (text description).

  3. Commissioning script: step-by-step calibration & testing once everything is installed.


🛠️ 1. Bill of Materials (BOM)

A) Visuals

  • Now:

    • 2–4 × High-end MR/VR headsets (eye tracking, 90–120 Hz) → e.g., Apple Vision Pro, Varjo XR-4, or Meta Quest Pro.

    • Tracking base stations (if not inside-out).

  • Future upgrade:

    • 2–3 × Light-field display panels (e.g., Looking Glass 65" class or equivalent multi-view tiled systems).

B) Locomotion

  • 1 × Omnidirectional treadmill or equivalent (Virtuix Omni, Kat Walk, Infinadeck).

  • Safety rail, harness, and E-stop.

  • Future: Disney HoloTile-like omni-floor (when available).

C) Haptics & Atmosphere

  • 1–2 × Ultraleap STRATOS Explore or equivalent mid-air ultrasound haptics dev kit.

  • 2 × Wearable haptic vests/gloves (bHaptics, TESLASUIT lite).

  • 3 × Floor fans (PWM controllable).

  • 2 × IR heat lamps (dimmable).

  • 1 × Ultrasonic humidifier (for mist).

  • 1 × Scent diffusion kit (programmable cartridges).

D) Audio

  • 8–12 × Compact full-range speakers (arranged hemispherically).

  • 2 × Subwoofers (floor mounted).

  • Audio interface with ambisonic decoding (Focusrite, RME).

E) Tracking & Compute

  • 4 × Ceiling-mounted depth cameras (Azure Kinect, Intel RealSense, or OptiTrack).

  • IMUs in headsets and treadmill.

  • Compute server:

    • 2 × Nvidia RTX 5090 (or latest).

    • 128 GB RAM.

    • PCIe Gen5 SSD (8 TB).

    • UPS + surge protection.

F) Content Pipeline

  • Unity/Unreal engine (OpenXR support).

  • Gaussian Splatting toolkit (repo + CUDA build).

  • NPC AI stack: LLM + behavior tree system.

G) Room & Safety

  • 12–16 m² enclosed space.

  • Acoustic wall panels (RT60 ≲ 300 ms).

  • Cable trays, blackout curtains.

  • HVAC upgrade (extra 3–5 kW cooling).

  • Emergency lighting + E-stop buttons on walls and treadmill.


📐 2. Room Wiring & Layout (text diagram)

```
[Front Wall]
├── Light-field panel array (future)
├── Speaker cluster (L, R, Center, Height)
└── Cable trough behind wall

[Ceiling]
├── 4 × depth cameras (corners, angled inward)
├── 6 × compact speakers (spatial array)
├── Light bar + IR markers
└── Ventilation plenum

[Center]
├── Omni-treadmill (safety rail + harness)
├── Haptics towers (ultrasound arrays waist-high)
└── Overhead harness anchor (optional)

[Perimeter]
├── Fans & IR lamps (N, S, E, W sides)
├── Mist + scent emitters near floor vents
└── Subwoofers in front/rear corners

[Rack Corner]
├── Compute server + UPS
├── Audio interface & amplifiers
└── Network switch + cooling
```

⚙️ 3. Commissioning Script

Step 1: Power & Safety

  • Verify dedicated 20A circuits for compute + treadmill.

  • Connect all gear to UPS → check runtime ≈ 10–15 minutes.

  • Test E-stop buttons → treadmill and power relays shut down.

Step 2: Tracking Calibration

  • Mount ceiling depth cams → measure overlap fields.

  • Run SLAM calibration → sync with treadmill’s IMU.

  • Align headset coordinate system to treadmill center (offset < 1 cm); a rigid-fit sketch follows.
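
One concrete way to do the alignment: a rigid fit (Kabsch) over a handful of landmark points measured in both frames, e.g. by touching known treadmill markers with a tracked controller. A numpy-only sketch:

```python
# Rigid (rotation + translation) fit of headset-frame points onto
# treadmill-frame points via the Kabsch algorithm.
import numpy as np

def kabsch(src: np.ndarray, dst: np.ndarray):
    """src, dst: (N, 3) matched points. Returns R, t with dst ~ R @ src + t."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no mirror
    R = Vt.T @ D @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# After fitting, the per-point residual should be < 1 cm to pass.
```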

Step 3: Visuals

  • Connect headsets → run OpenXR diagnostics.

  • Measure motion-to-photon latency (goal ≤ 20 ms); see the frame-count sketch below.

  • Enable foveated rendering; confirm frame rates (90–120 Hz).
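
Turning a high-fps recording into a latency number is simple arithmetic; a sketch, assuming you have already identified the frame where the headset moves and the frame where the display reacts:

```python
# Convert two frame indices from a high-fps recording into latency (ms).
def motion_to_photon_ms(frame_motion: int, frame_display: int, camera_fps: float) -> float:
    """frame_motion: first frame with physical motion; frame_display:
    first frame where the rendered image responds."""
    return (frame_display - frame_motion) / camera_fps * 1000.0

# A 240 fps camera, motion at frame 1000, display reacting at frame 1004:
print(motion_to_photon_ms(1000, 1004, 240.0))  # ~16.7 ms -> within budget
```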

Step 4: Audio

  • Place ambisonic mic at treadmill center.

  • Run calibration sweep → tune EQ/delays (sweep-generation sketch below).

  • Verify localization error < 5°.
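
For the sweep itself, a minimal exponential (log) sine-sweep generator in numpy/scipy, Farina-style; play it through each speaker and record at the treadmill-center mic:

```python
# Generate an exponential sine sweep and write it to a WAV file.
import numpy as np
from scipy.io import wavfile

def log_sweep(f1=20.0, f2=20000.0, duration=10.0, sr=48000):
    t = np.arange(int(duration * sr)) / sr
    k = np.log(f2 / f1)
    sweep = np.sin(2 * np.pi * f1 * duration / k * (np.exp(t / duration * k) - 1))
    return (0.5 * sweep).astype(np.float32)  # -6 dB headroom

wavfile.write("sweep.wav", 48000, log_sweep())
```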

Step 5: Haptics

  • Position ultrasound arrays ~80 cm from user.

  • Run SDK focal point test (buttons, sliders).

  • Ensure skin pressure stays within vendor safety limits (100–300 mN at the fingertips).

Step 6: Atmosphere

  • Integrate fans/heat/scent into engine.

  • Run scripted cues (wind gust, campfire heat, rainforest humidity).

Step 7: Locomotion

  • Enable treadmill → tune control loop.

  • Test walking speeds 0.5–2 m/s; verify braking < 300 ms.

  • Confirm center error < 0.5 m over 5 minutes (evaluation sketch below).
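
Evaluating the centering criterion from a logged position trace, a short sketch:

```python
# Check the centering criterion from a logged (x, y) position trace,
# in meters relative to the treadmill center, sampled over 5 minutes.
import numpy as np

def center_error(positions: np.ndarray):
    """positions: (N, 2). Returns (max_error_m, rms_error_m)."""
    dist = np.linalg.norm(positions, axis=1)
    return dist.max(), np.sqrt((dist ** 2).mean())

# Pass criterion: max error < 0.5 m over the whole walk.
```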

Step 8: Content Pipeline

  • Capture a room via smartphone/drone → train a Gaussian Splat scene.

  • Import into engine; verify 60–200 FPS render.

  • Spawn an AI NPC; confirm speech + behavior tree run in sync with the world.

Step 9: Multi-user Sync

  • Add 2nd headset; check frame sync < 10 ms drift.

  • Validate audio/visual coherence for both users.


✅ Minimum Viable Holosuite (first build target)

  • 1 treadmill + 2 headsets.

  • Spatial audio + fans/heat.

  • 1 ultrasound haptics array.

  • Gaussian Splat scene running at ≥90 Hz.

This alone gives you the “wow” factor. Then add light-field walls + multi-user locomotion for full Trek-style immersion.


🛒 Holosuite Shopping List with Links (BOM + Purchases)

Here are components you can buy right now (or preorder) to build the holosuite.

  • Meta Quest Pro VR Headset (balanced all-rounder): $849.00, Walmart + others

  • Pimax Crystal Light VR Headset (retina clarity): $599.00, Pimax store

  • KAT Walk C2+ VR Treadmill (locomotion floor): $1,499.00, KATVR + others

  • Fit Immersion PRO-2X VR Headset (enterprise spec): $650.00, North Pole Engineering

  • KAT WALK C2 Core Treadmill (budget omni-treadmill): $999.00, Knoxlabs

  • Virtuix Omni VR Treadmill Full Set (omni full system): $1,850.00, eBay

  • HTC Vive Flow Headset (compact form factor): $399.95, B&H Used Store

  • Pimax Crystal Super VR Headset (super high PPD): $799.00, Pimax store

Here are some top picks:

  • Meta Quest Pro VR Headset — one strong mixed-reality/VR headset with passthrough and good optics. Great for initial visual immersion.

  • Pimax Crystal Light VR Headset — very high clarity (PPD) headset; good if visual fidelity is a priority.

  • KAT Walk C2+ VR Treadmill — a quality omni-directional treadmill for walking in VR with safety rails.

  • KAT WALK C2 Core VR Treadmill — simpler / more compact system for locomotion.

And from the non-product pages:

  • Light-Field Display, 16-inch by Looking Glass (~US$3,000) for group/shared 3D visuals without headsets. (Looking Glass)

  • Light Field Display, 27-inch, pro version for workplace / larger immersive display. (Looking Glass)

  • Ultraleap STRATOS Explore Development Kit (mid-air haptics) to get tactile feedback in the suite. (RobotShop USA)

  • Ultraleap Hand Tracking Kit for Pico Neo 3 Pro (Stereo IR170) for hand tracking improvements. (unboundxr.com)


📐 Room Wiring & Layout Diagram (Text + Suggested Layout)

Here’s a sketch / description you can convert into a diagram:

```
┌───────────────────────────────┐
│     Light-Field Panel(s)      │  ← Front Wall (if used): speakers + display array
└───────────────────────────────┘

Ceiling: depth cameras (4 corners), ceiling speakers, ventilation

┌────────── Treadmill ─────────┐
│         Center Zone          │
│   Omni-treadmill + Safety    │
│   Harness / Rail / E-Stop    │
└──────────────────────────────┘

Walls: haptics towers at waist/shoulder height along the sides
Back wall or side: compute rack (GPUs, UPS), audio interface, cabling
Floor: fans/heat lamps, scent emitters under vents
Perimeter: speakers, subwoofers in corners
```

Electrical layout:

  • Dedicated high-capacity outlets (20-30 Amp) for compute rack and treadmill.

  • Power distribution with surge protection & UPS.

  • Cable trays overhead for sensors & cables.

  • Network switch in rack; wired connections for syncing/displays.


🧪 Commissioning / Calibration Checklist

Here’s what to test once everything is installed. Use this script to validate performance.

| Test | Goal / Metric | Pass / Fail Criteria |
|---|---|---|
| Power & Safety | E-stop works; UPS holds load; treadmill power on/off works cleanly | All emergency stops cut power; no voltage drop; UPS supports safe shutdown |
| Visual Latency | Motion-to-photon (head motion → display reaction) ≤ 20 ms | Use a latency test tool and a slow-motion/high-fps camera to measure; adjust frames & drivers |
| Resolution & Clarity | PPD (pixels per degree) realistic; sharp visuals; minimal aliasing | Headset displays and light-field panels look clean at typical viewing distance |
| Tracking Accuracy | Hand & head tracking drift < ~1 cm over 5 min; cameras aligned | Walk/gait test; reach test; calibrate sensors |
| Haptics | Ultrasound focal points feel like “touch” without pain; safe exposure | Run sample apps; measure pressure per vendor guidelines; ensure safety cutoffs |
| Locomotion | Treadmill responds to walking; smooth acceleration & deceleration; stop latency < ~300 ms | Walk in place; test emergency stop |
| Audio | Spatialization works; sound sources localize correctly; RT60 < 300 ms | Place sounds at known directions; test ambient rejection; run audio calibration |
| Environment Cues | Temperature/fans/scent sync with the scene; no distracting leaks/delays | Trigger cues & observe; adjust delays and fan speeds |
| Multi-User Sync | Two or more users see a consistent world; drift/sync error < ~10 ms | Run shared-scene tests; compare visuals/audio from both users |

Building the Holosuite: The Science and Technology Behind Real Immersive Worlds

 

what a real holosuite actually is (today)

It’s not one magic display. It’s an integrated stack:

  1. Visual immersion

    • Near-eye headsets for now (e.g., retina-class, eye-tracked, low-latency MR/VR) and, later, room-scale light-field/holographic walls.

    • Key problems to beat: latency (<20 ms motion-to-photon) and vergence–accommodation conflict (VAC) (the eye focuses at a fixed screen but converges at virtual depths). Apple Vision Pro shows the state of high-end near-eye displays & spatial audio; VAC reduction requires true light-field/holographic displays or varifocal systems. (PMC; ScienceDirect)

    • Group-viewable light-field panels already exist (dozens of views, no glasses) and can tile into walls. (Looking Glass)

  2. Locomotion

    • Omnidirectional floors let you walk “anywhere” in a small room; Disney’s HoloTile demoed a multi-user version (research stage). (YouTube)

  3. Touch / haptics

    • Mid-air ultrasound haptics focuses acoustic pressure onto your skin (contactless buttons, shapes, gusts). Mature research & products exist; acoustic phased arrays can even levitate/steer tiny objects (“acoustic holograms”). (ResearchGate; Ultraleap docs; Bruce Drinkwater)

    • Complement with body-worn haptics/exosleeves for force & weight illusions (commercial gear exists), plus fans/heat/cold for environmental cues.

  4. Spatial audio

    • High-order ambisonics or beamformed arrays + personalized HRTFs for believable distance/elevation; Vision Pro-style audio ray tracing shows the state of the art. (Apple)

  5. Scent & atmosphere

    • Controlled micro-dosing scent emitters; wind, temperature, humidity for presence.

  6. World capture & rendering

    • 3D Gaussian Splatting has become the “JPEG moment for spatial computing”: fast, photoreal scene capture & real-time rendering from videos/phones, ideal for rapidly populating holosuite worlds. (NeRF successor; real-time with high fidelity.) (arXiv; INRIA; The Verge)

  7. AI actors & simulation

    • On-device/edge LLMs + behavior trees for NPCs; physics for objects; safety guardian.


the core math you’ll actually use

1) display & optics (light-field / holography)

  • Light-field sampling (spatio-angular Nyquist): to avoid aliasing, choose the spatial pitch $\Delta x$ and angular pitch $\Delta\theta$ so that the scene’s spatial frequency $f_x$ and disparity stay under the Nyquist limit. Practical rule: pixels per degree (PPD) ≥ 60 and views ≥ 32–100 for multi-viewer walls; otherwise VAC & swim occur. (Sampling analyses from the diffraction/light-field literature.) (MDPI)

  • Holography (Fourier optics): fringe spacing $d \approx \frac{\lambda}{2\sin(\theta/2)}$. To steer to angle $\theta$, the SLM pixel pitch $p$ bounds the maximum angle: $\theta_{\max} \approx \sin^{-1}\!\left(\frac{\lambda}{p}\right)$ (numeric check after this list).

  • VAC mitigation target: provide true focus cues. Varifocal: dynamically set the focal distance $f(t)$ to the vergence depth; light-field/holography: render the correct wavefront for accommodation at depth $z$. VAC is what makes long sessions uncomfortable. (PMC)

  • Latency budget: motion-to-photon $<20\,\mathrm{ms}$ (preferably $<12\,\mathrm{ms}$) to avoid motion sickness:

    $t_{\rm total} = t_{\rm head\ track} + t_{\rm render} + t_{\rm scanout} + t_{\rm display}$
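
Plugging numbers into the steering-angle bound above, with green light and an 8 µm-pitch SLM (both values illustrative):

```python
# Max steering angle from SLM pixel pitch: theta_max ~ asin(lambda / p).
import math

wavelength = 520e-9   # m, green light (illustrative)
pitch = 8e-6          # m, SLM pixel pitch (illustrative)
theta_max = math.degrees(math.asin(wavelength / pitch))
print(f"max steering angle: +/- {theta_max:.2f} deg")  # ~3.7 deg
```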

2) locomotion (omni floor)

  • Control: closed-loop body tracking yields the desired floor velocity $\mathbf{v}_f$, keeping the user near room center while making world-space motion $\mathbf{v}_w$ feel natural. Basic law:

    $\mathbf{v}_f = G(\mathbf{p}_{\rm user}) - \mathbf{v}_w$,

    with stability via PID/LQR on user position; a controller sketch follows. (Disney HoloTile proves feasibility.) (YouTube)
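
A sketch of that law with $G$ realized as a per-axis PD term; the gains are illustrative and would need tuning against real hardware and safety limits:

```python
# Centering controller: v_f = G(p_user) - v_w, with G a PD term on the
# user's offset from the room center. Gains are placeholders.
import numpy as np

class FloorController:
    def __init__(self, kp=1.5, kd=0.4):
        self.kp, self.kd = kp, kd
        self.prev_error = np.zeros(2)

    def update(self, p_user, v_world, dt):
        """p_user: (x, y) offset from center (m); v_world: the user's
        intended world-space velocity (m/s). Returns floor velocity v_f."""
        error = -np.asarray(p_user)              # drive the offset to zero
        d_err = (error - self.prev_error) / dt
        self.prev_error = error
        g = self.kp * error + self.kd * d_err    # G(p_user)
        return g - np.asarray(v_world)           # counteract the gait
```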

3) mid-air haptics (ultrasound)

  • Acoustic radiation pressure at a focal point (simplified):

    $F \propto \frac{2\alpha I}{c}$,

    where $I$ is intensity, $c$ the speed of sound, and $\alpha$ depends on the medium/skin. A phased array solves per-transducer phase/amplitude to maximize focal pressure subject to safety limits. (Ultraleap describes the control-point solver.) (Ultraleap docs)

  • Levitation / “acoustic holograms”: optimize array phases $\phi_i$ so the superposed field yields target pressure nodes; Bristol’s work shows single-sided levitation and manipulation. (University of Bristol) A focusing sketch follows.
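
The single-focus case is easy to write down: conjugate-phase focusing fires each transducer “early” by its propagation delay, so all wavefronts arrive in phase at the focal point. A numpy sketch with an illustrative 16×16 array:

```python
# Conjugate-phase focusing for an airborne ultrasound phased array.
import numpy as np

C_AIR = 343.0   # m/s, speed of sound
F0 = 40_000.0   # Hz, typical airborne ultrasound carrier

def focus_phases(elements: np.ndarray, focus: np.ndarray) -> np.ndarray:
    """elements: (N, 3) transducer positions (m). Returns phases (rad)
    that make all wavefronts arrive in phase at the focus."""
    k = 2 * np.pi * F0 / C_AIR                     # wavenumber
    d = np.linalg.norm(elements - focus, axis=1)   # path lengths
    return (-k * d) % (2 * np.pi)                  # cancel propagation phase

# Illustrative 16x16 grid, 10.5 mm pitch, focus 20 cm above the center:
xs = (np.arange(16) - 7.5) * 0.0105
grid = np.array([(x, y, 0.0) for x in xs for y in xs])
phases = focus_phases(grid, np.array([0.0, 0.0, 0.20]))
```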

4) spatial audio

  • Ambisonics order $N$ sets spatial resolution; the channel count is $(N+1)^2$. Real-time binaural rendering uses listener pose to rotate the soundfield; time-of-flight & occlusion from geometry yield audio ray tracing. A yaw-rotation sketch follows.
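
The rotation step for first order is small enough to show inline: W and Z are invariant under yaw, and X/Y rotate as a 2D pair. Sign conventions vary by channel ordering, so verify against a known source:

```python
# Yaw-rotate a first-order ambisonic (B-format W, X, Y, Z) frame to
# compensate listener head rotation. Check sign conventions for your
# channel ordering before trusting the direction of rotation.
import numpy as np

def rotate_foa_yaw(w, x, y, z, yaw_rad):
    """Rotate the soundfield by -yaw so sources stay world-stable."""
    c, s = np.cos(-yaw_rad), np.sin(-yaw_rad)
    return w, c * x - s * y, s * x + c * y, z
```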

5) real-time world capture

  • 3D Gaussian Splatting objective (very high level): fit a Gaussian set $\{\mathcal{G}_k(\mu_k,\Sigma_k,\mathbf{c}_k)\}$ to minimize photometric error across views while enforcing visibility; rendering is alpha-composited splats sorted by depth (compositing sketch below). It trains in minutes and renders at 60–200 FPS on a good GPU. (arXiv)
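
The compositing step, written out for one pixel: depth-sort the overlapping splat samples, then accumulate color weighted by the remaining transmittance.

```python
# Front-to-back alpha compositing of depth-sorted splat samples (1 pixel).
import numpy as np

def composite(depths, colors, alphas):
    """depths: (K,), colors: (K, 3), alphas: (K,) in [0, 1]."""
    order = np.argsort(depths)                # nearest first
    out, transmittance = np.zeros(3), 1.0
    for i in order:
        out += transmittance * alphas[i] * colors[i]
        transmittance *= 1.0 - alphas[i]
        if transmittance < 1e-4:              # early-out, as real renderers do
            break
    return out
```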


reference architecture (modular holosuite)

Room shell (6–8 m² min):

  • Acoustic treatment, blackout, HVAC, power & thermal headroom (~1–2 kW per user for GPUs/displays/actuators).

Sensing:

  • Ceiling/floor depth cams + IMUs (inside-out tracking), eye tracking (for foveated rendering), SLAM.

Visual layer (choose path):

  • Path A (near-term): high-end MR/VR headsets (e.g., Vision Pro class) for each user + projection surfaces for peripheral ambience. (PMC)

  • Path B (mid-term): tileable light-field panels (e.g., Looking Glass-style) for group no-glasses 3D; add smaller near-eye displays for close-up tasks. (Looking Glass)

Locomotion:

  • Omni floor (HoloTile-like) with center-hold control; fall prevention & emergency stop. (YouTube)

Haptics:

  • Ultrasound arrays at waist/desk height + ceiling for touch cues; wearable vibro/force bands for sustained forces. (ResearchGate)

Audio & atmosphere:

  • Beamformed speaker arrays + sub; scent/wind/heat modules.

Compute:

  • Multi-GPU server (path tracing + Gaussian splats), <12 ms pipeline; real-time physics & AI agents.

Content:

  • Library of Gaussian-splat scenes; photogrammetry; procedural worlds; AI-driven NPCs. (arXiv)


build it in phases (pragmatic roadmap)

Phase 1 — Foundational “room-VR lab” (1–2 months)

  • Headsets + tracking; spatial audio; fan/heat modules; <20 ms motion-to-photon target. Capture a few spaces with 3D Gaussian Splatting to experience instant photoreal worlds. (arXiv)

Phase 2 — Atmosphere & haptics (2–4 months)

  • Add a mid-air ultrasound haptics unit for touchable mid-air buttons/textures; integrate with engine events. (Ultraleap docs)

Phase 3 — Locomotion (research/procurement)

  • Integrate an omnidirectional floor (or a lower-cost treadmill proxy) with safety rails & E-stop; tune the controller to keep the user centered while world motion feels natural. (Study HoloTile principles & demos.) (YouTube)

Phase 4 — Group view walls (pilot)

  • Install light-field panels for “helmet-off” scenes and spectators; sync with headset users so everyone shares one world. (Looking Glass)

Phase 5 — Reduce VAC / increase comfort (ongoing)

  • Experiment with varifocal optics or near-eye holography research techniques to lessen VAC symptoms during long sessions. (ScienceDirect)

Phase 6 — Content pipeline

  • Standardize capture via phone rigs or drones → Gaussian splats (minutes to train) → live in the holosuite; mix with physically based rendering & AI NPCs. (arXiv)

Safety & policy

  • Enforce exposure limits (lasers/IR, ultrasound SPL, scent allergens), fall protection, emergency lighting, and accessibility.


performance targets (rules of thumb)

  • Visual: PPD ≥ 35 (min), aim 60+; FOV ≥ 100°; MTP latency ≤ 20 ms; effective multi-view count ≥ 50 for walls.

  • Audio: localization error < 5°; RT60 < 300 ms (treated room).

  • Haptics: focal refresh ≥ 200 Hz; safe skin exposure; perceptible pressure peaks ~100–300 mN on fingertips (device-specific). (ResearchGate)

  • Locomotion: center error < 0.5 m; braking < 300 ms; fall-probability minimized via predictive control.

  • Render: 90–120 FPS per user eye; Gaussian-splat scenes 100–200 FPS on modern GPUs. (arXiv)


how it “feels” together

  • Your eyes see correct parallax, occlusions, and (in time) focus cues.

  • Your ears hear sound rays bounce realistically from virtual geometry.

  • Your skin feels air, heat, and mid-air tactile points.

  • Your feet keep walking “through” worlds while staying in one room.

  • Your brain gets the right multisensory correlations at low latency—this is presence.


good research threads to follow

  • Mid-air ultrasound haptics (surveys & tech notes). (ResearchGate)

  • Acoustic levitation & “acoustic holograms.” (University of Bristol)

  • VAC: why VR makes some people ill & how light-fields help. (PMC)

  • Light-field/holographic sampling & diffraction constraints. (MDPI)

  • Vision Pro specs & spatial audio cues (today’s high end). (Apple)

  • Omnidirectional floors (Disney HoloTile demos). (YouTube)

  • Gaussian splatting for instant 3D worlds. (arXiv; INRIA)


bottom line

A holosuite isn’t a single breakthrough; it’s an orchestra of displays, acoustics, haptics, locomotion, capture, and AI—played with ruthless attention to latency, focus cues, and multisensory alignment. Nearly every subsystem is here now in some form; stitching them into a safe, reliable, multi-user room is the engineering art.
