Next-Gen Spatial UI: Design Prompts for AR/VR Ecosystems
An advanced framework for designing immersive 3D interfaces, focusing on depth, ergonomics, and multi-modal interactions in AR/VR environments.
Prompt
Spatial UI Design Specification Framework

## Context
You are a lead Spatial Designer specializing in XR (Extended Reality). Your goal is to draft a comprehensive design system for an immersive AR/VR application that prioritizes user comfort, spatial depth, and intuitive interaction.

## Input Parameters
- Application Type: [e.g., Enterprise Productivity, Immersive Gaming, Medical Training]
- Platform Target: [e.g., Apple Vision Pro, Meta Quest 3, HoloLens 2]
- Primary Interaction: [e.g., Hand Tracking, Gaze-and-Pinch, Controllers]

## Design Pillars to Address
1. Spatial Hierarchy & Layering: Describe the Z-axis depth. How do windows stack? Define the use of 'Glassmorphism' and physical materials that react to environmental lighting.
2. Input Modalities: Define interactions for gaze-tracking, multi-modal hand gestures, and spatial voice commands.
3. Ergonomics & Comfort: Address the 'safe zone' for user interaction, ensuring elements are placed within a 30-degree field of vision to prevent neck strain.
4. Audio-Visual Feedback: Define spatial audio cues (3D soundscapes) and visual affordances that signify active interaction states.

## Task Instructions
Generate a detailed UI/UX specification for a [Specific Feature Name]. Your response must include:
- Canvas Type: Volumetric vs. Diegetic vs. Screen-locked.
- Anchor Strategy: How the UI attaches to horizontal or vertical planes in the real world.
- LOD (Level of Detail): How elements scale or simplify based on the user's proximity (distance-based rendering).
- Accessibility: High-contrast modes and peripheral vision considerations.
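Two of the pillars above — the 30-degree ergonomic safe zone and distance-based LOD — can be made concrete with simple geometry. The sketch below is illustrative only, not part of the prompt: the function names, the resting-gaze vector, and the LOD distance thresholds are all assumptions chosen for the example, not values prescribed by any platform.

```python
import math

def within_comfort_zone(element_dir, gaze_dir=(0.0, 0.0, -1.0), max_deg=30.0):
    """Return True if a UI element's direction (from the user's head) falls
    inside a comfort cone around the resting gaze.

    The 30-degree default mirrors the 'safe zone' pillar; gaze_dir assumes a
    forward-facing -Z convention, which is a modeling choice for this sketch.
    """
    dot = sum(a * b for a, b in zip(element_dir, gaze_dir))
    norm = (math.sqrt(sum(a * a for a in element_dir))
            * math.sqrt(sum(b * b for b in gaze_dir)))
    # Clamp before acos to guard against floating-point drift.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= max_deg

def lod_tier(distance_m):
    """Pick a level-of-detail tier from viewer distance (hypothetical cutoffs)."""
    if distance_m < 0.75:
        return "full"        # full text, icons, micro-interactions
    if distance_m < 2.5:
        return "simplified"  # larger hit targets, reduced text density
    return "glanceable"      # icon or silhouette only
```

For example, a panel straight ahead (`(0, 0, -1)`) passes the comfort check, while one directly to the user's right (`(1, 0, 0)`) sits at 90 degrees and fails, signalling it should be repositioned or anchored closer to the resting gaze.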