Leading prototyping tools for augmented reality and virtual reality projects

Published December 16, 2025

You cannot hand someone a Figma link to experience VR. The medium demands that the prototype exist in the medium. A flat mockup of a virtual world is like a blueprint of a roller coaster: technically accurate but missing everything that matters. The spatial relationships, the sense of presence, and the physical interactions that define immersive experiences cannot be conveyed in two dimensions.

Last month I reviewed an AR prototype built in a 2D tool. The team drew overlays on top of camera-feed screenshots. Rectangles represented floating information panels. Arrows indicated where elements would appear relative to real-world objects. When they built the actual AR experience, nothing matched. Spatial relationships that looked fine in 2D were unusable in 3D. Elements that seemed well-positioned on screen floated awkwardly in space. They wasted two months on a prototype that taught them nothing, because it could not represent what actually mattered.

Here is the thesis: AR and VR prototyping requires spatial tools, not screen tools. Flat mockups of immersive experiences are not just inadequate; they are misleading. They create false confidence that delays discovery of real problems until development, when fixing them is expensive.

Why Immersive Prototyping Is Fundamentally Different

Screen-based interfaces have fixed dimensions. You know the viewport size. You control the viewing angle. Users interact through touch or mouse at predictable distances. The design space is constrained in ways that make 2D prototyping effective.

AR and VR break all of these assumptions. Users move through space. Viewing angles are infinite, and what looks good from one angle may be invisible from another. Interaction distances vary: a button that works at arm's length fails when users stand farther away. Content must respond to real-world geometry, anchoring to surfaces and objects in unpredictable environments. These variables make 2D prototyping nearly useless for meaningful validation.

This is what I mean by spatial design constraints: the physics and context that shape what works in 3D. Immersive experiences have physics that screen experiences do not, and your prototyping tools must account for those physics. You cannot prototype a 3D experience in 2D any more than you can prototype a symphony by writing about music. The diagram below maps how environmental context feeds into a usability assessment.

flowchart TD
    A[Immersive Prototype] --> B{Environmental Context}
    B --> C[User Position]
    B --> D[Viewing Angle]
    B --> E[Real-world Geometry]
    B --> F[Lighting Conditions]
    C --> G[Interaction Distance]
    D --> H[Content Legibility]
    E --> I[Spatial Anchoring]
    F --> J[Visibility Thresholds]
    G --> K[Usability Assessment]
    H --> K
    I --> K
    J --> K
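
To make the diagram concrete, here is a toy TypeScript sketch of the same rollup. Every threshold is an illustrative assumption, not a standard; the point is that usability is a function of environmental context, not of the layout alone.

interface EnvironmentalContext {
  userDistanceM: number;    // user position: distance to the content, in meters
  viewingAngleDeg: number;  // viewing angle: offset from the content's facing direction
  surfaceDetected: boolean; // real-world geometry available for spatial anchoring
  ambientLux: number;       // lighting conditions at the deployment site
}

function assessUsability(ctx: EnvironmentalContext): boolean {
  const reachable = ctx.userDistanceM <= 0.7;            // interaction distance
  const legible = Math.abs(ctx.viewingAngleDeg) <= 45;   // content legibility
  const anchored = ctx.surfaceDetected;                  // spatial anchoring
  const visible = ctx.ambientLux >= 50;                  // visibility threshold
  return reachable && legible && anchored && visible;    // usability assessment
}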


Tools for VR Prototyping

The VR prototyping landscape has matured significantly, with options ranging from game engines to specialized design tools.

Unity is the most common VR development platform, and it works for prototyping too. The learning curve is steep compared to design tools, but Unity prototypes translate directly to production: what you build in Unity is what you ship. For teams with technical resources, this continuity eliminates the translation gap between prototype and final product. Unity supports all major VR headsets and provides extensive documentation.

Unreal Engine offers higher visual fidelity than Unity. For experiences where realism matters, where users need to feel genuinely present, Unreal's rendering quality justifies the additional complexity. Architectural visualization, automotive design review, and cinematic VR teams often prefer Unreal. The tradeoff is a steeper learning curve and heavier hardware requirements.

Gravity Sketch allows designers to create in VR: you wear a headset and sculpt experiences with motion controllers. It is the closest thing to sketching in the target medium. Designers who think spatially find Gravity Sketch intuitive; you reach out and shape the world around you. The output can be exported to Unity or Unreal for further development.

ShapesXR focuses on collaborative VR prototyping. Multiple team members can enter the same virtual space and co-design in real time, regardless of physical location. For distributed teams, this recreates the whiteboard experience in VR. You can sketch ideas, place objects, and discuss designs while standing in the prototype together.

Adobe Aero bridges traditional design tools and AR. You can import assets from Photoshop or Illustrator and add AR behaviors without code. For designers already in Adobe's ecosystem, Aero provides a gentle on-ramp to immersive prototyping.

Tools for AR Prototyping

AR prototyping has different requirements than VR because AR must respond to the real world. The prototype must work across varied environments, lighting conditions, and surface types.

Reality Composer from Apple is the fastest path to iOS AR prototypes. It handles object placement, animations, and basic interactions without code. You can build a prototype in hours and test it on any recent iPhone. For iOS-targeted experiences, Reality Composer provides the tightest loop between design and device testing.

Spark AR Studio from Meta focuses on social AR: filters, effects, and experiences that live on Instagram or Facebook. If your AR project will reach users through Meta's platforms, Spark AR provides native tools and direct publishing, with powerful face tracking and world effect capabilities. Note that Meta announced in late 2024 that it would wind down the Meta Spark platform, so verify its current status before building on it.

Lens Studio from Snap is the equivalent for Snapchat. The platform has mature template systems and a large creator community. Lens Studio excels at face and body tracking effects and provides sophisticated animation tools.

8th Wall enables web-based AR that works across devices without app installation. Prototypes built in 8th Wall can be shared via URL, dramatically reducing friction for testing with users. Instead of asking testers to download an app, you send a link. Web-based AR has performance limitations compared to native, but the accessibility tradeoff often favors web for prototyping.
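
For a sense of what web-based AR startup looks like, here is a minimal sketch using the standard WebXR Device API, the browser-level standard for immersive sessions. This is plain WebXR, not 8th Wall's SDK, which ships its own engine and APIs; with TypeScript, the @types/webxr package supplies the navigator.xr typings.

// Request an immersive AR session if the browser and device support one.
async function startArSession(): Promise<XRSession | null> {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    return null; // unsupported device: fall back to a 2D preview
  }
  // hit-test lets the prototype find real-world surfaces to anchor content to
  return xr.requestSession("immersive-ar", { requiredFeatures: ["hit-test"] });
}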

Prototyping Without a Headset

Not everyone has VR hardware. Not every team member can wear a headset. Tools exist for approximating immersive prototypes on standard screens, though they sacrifice the core value of spatial testing.

Figma with plugins can mock up VR interfaces, though testing requires imagination. This works for UI elements that will appear in VR (menus, panels, HUD elements) but not for spatial layouts or navigation. Use Figma for what it does well and acknowledge its limitations for immersive design.

SketchBox renders 3D prototypes viewable on any device. You lose true immersion but gain accessibility. Team members can review spatial designs without hardware, providing feedback on concepts before committing to in-headset testing.

Video prototyping works surprisingly well for concept validation. Record yourself interacting with paper prototypes in physical space, narrating the intended AR overlays. This low-fidelity approach catches spatial problems early. "The information panel would appear here, floating at eye level about two feet in front of you." Combined with pointing and gesturing, video conveys spatial intent well enough for initial feedback.

Testing Immersive Prototypes Effectively

In-headset testing is mandatory for VR. Screen recordings miss the core experience: presence, spatial awareness, motion comfort. A VR prototype viewed on a monitor is like a photograph of food: it conveys information but not experience. Budget time and equipment for actual headset testing.

Comfort metrics matter uniquely in VR. Does the experience cause motion sickness? Are text elements legible at the distances users naturally stand? Do interaction targets feel reachable without awkward stretching? These cannot be evaluated from 2D representations. VR discomfort is subtle: users may not articulate it but will simply stop using the experience.
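
Legibility at distance is one of the few comfort questions you can sanity-check with arithmetic before a headset session. A character of height h viewed from distance d subtends an angle of 2·atan(h / 2d); solving for h at a chosen threshold gives a minimum text size. The 20-arcminute threshold below is a rough working assumption, not a standard.

// Minimum world-space text height (meters) for a given viewing distance.
function minTextHeightMeters(viewingDistanceM: number, thresholdArcmin = 20): number {
  const thetaRad = (thresholdArcmin / 60) * (Math.PI / 180); // arcminutes -> radians
  return 2 * viewingDistanceM * Math.tan(thetaRad / 2);
}
// At 2 m, minTextHeightMeters(2) is roughly 0.012 m: text below about
// 1.2 cm tall will strain readers standing at that distance.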

Test duration matters. Short sessions miss fatigue effects: if users wear the headset for five minutes, they might rate the experience highly, but at thirty minutes, neck strain and eye fatigue emerge. Match your test duration to intended usage patterns.

Contextual testing matters for AR. If your AR experience guides users through a warehouse, test in a warehouse. Lighting, occlusion, and spatial anchoring behave differently in different environments. An AR prototype that works perfectly in your office may fail in the target deployment environment.

Test with users who vary in VR/AR experience. Novice users reveal onboarding problems that experienced users skip past. They fumble with controllers, get disoriented, and need clearer guidance. Experienced users reveal depth problems: they push the experience harder and find its limits.

Common Mistakes in Immersive Prototyping

The first mistake is ignoring scale. Objects that look fine in 2D mockups might be too small to read or too large to feel natural in 3D space. A button that is 50 pixels wide means nothing in VR; what matters is its apparent size at interaction distance. Prototype at actual scale and test scale with real users.
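
A quick worked example of why pixels are the wrong unit, using the inverse of the legibility formula above: the apparent size of a fixed-width element shrinks with distance, so the same button can be a comfortable target at arm's length and unusable across the room. The numbers are illustrative.

// Apparent angular size (degrees) of an element of a given width at a given distance.
function apparentSizeDeg(elementWidthM: number, distanceM: number): number {
  return 2 * Math.atan(elementWidthM / (2 * distanceM)) * (180 / Math.PI);
}
// A 10 cm button subtends about 7.2 degrees at arm's length (0.8 m)
// but only about 1.9 degrees at 3 m: the same object, a very different target.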

The second mistake is over-designing the first-time experience. Immersive interfaces have learning curves. Your prototype should account for how users improve over sessions, not just their first encounter. First impressions matter, but so does long-term usability. Test repeatedly with the same users to observe learning.

The third mistake is forgetting physical constraints. VR users get tired holding their arms up. They bump into real-world obstacles. They need to sit down eventually. AR users cannot focus on content while walking safely. Your prototype should respect human ergonomics and real-world safety requirements.
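
One way to bake ergonomics into a prototype review is a crude reach-and-gaze check. The envelope below (head at the origin, z pointing forward) uses assumed heuristic thresholds to adapt to your users, not published ergonomic standards.

interface Vec3 { x: number; y: number; z: number } // meters; user's head at origin, +z forward

// Flag interaction targets that force stretching, neck craning, or raised arms.
function isComfortableTarget(target: Vec3): boolean {
  const distance = Math.hypot(target.x, target.y, target.z);
  const withinReach = distance >= 0.3 && distance <= 0.6;  // close enough without stretching
  const offAxisDeg = Math.atan2(Math.hypot(target.x, target.y), target.z) * (180 / Math.PI);
  const withinGaze = offAxisDeg <= 30;                     // limits neck rotation
  const notHeldAloft = target.y <= 0;                      // at or below eye level, so arms can rest
  return withinReach && withinGaze && notHeldAloft;
}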

The fourth mistake is prototyping in isolation. Immersive experiences often supplement traditional interfaces. A VR training experience might live alongside web-based coursework. An AR shopping experience might connect to a mobile commerce app. Ensure your immersive prototype connects to your broader product design system.

The fifth mistake is assuming VR equals immersion. Presence is not guaranteed by the medium. A poorly designed VR experience can feel less immersive than a well-designed 2D one. Your prototype should actively create presence through design, not assume the hardware provides it.

Connecting Immersive and Traditional Design

Many AR/VR experiences include 2D interface elements. Menus, settings, onboarding screens. These should match your product's visual language, even in an immersive context. A jarring style shift between VR menus and your web app undermines brand coherence.

Tools like Figr help here by generating 2D components that respect your design system. When you need HUD elements or panel interfaces for your VR experience, consistency with your broader product builds user confidence. The 2D elements in your immersive prototype should feel like they belong to the same product family.

Design handoffs for immersive experiences require spatial documentation. Traditional specs show pixel dimensions and color values. Immersive specs must include depth positions, angular sizes, and spatial relationships. Document how elements relate to the user's head position, not just to each other.
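
A hypothetical shape for one entry in such a spec, in TypeScript for concreteness. The field names are illustrative; the point is that an immersive handoff records depth, angular size, and head-relative placement where a traditional spec would record pixels.

interface SpatialSpecEntry {
  id: string;
  anchor: "head" | "world" | "surface"; // what the element is positioned relative to
  depthMeters: number;                  // distance from the user's head
  angularSizeDeg: number;               // apparent size at that depth
  azimuthDeg: number;                   // left/right offset from forward gaze
  elevationDeg: number;                 // up/down offset from forward gaze
}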

In short, immersive prototyping requires specialized tools, but those prototypes should still connect to your unified design language.

The Future of Immersive Prototyping

The tooling gap between immersive and traditional design is closing. More designers can prototype for AR/VR without engineering support. Hardware is becoming more accessible. The skills learned prototyping for traditional interfaces increasingly transfer to spatial design.

Teams investing in immersive prototyping skills now will have advantages as AR and VR move from novelty to mainstream. The fundamentals of user-centered design apply: understand user needs, prototype early, test often, iterate based on evidence. The medium changes but the method remains.

The Takeaway

AR and VR prototyping demands spatial tools. Flat mockups mislead more than they inform, creating false confidence that delays real learning. Invest in platforms that let you prototype in the target medium, test with actual hardware when possible, and respect the unique physics of immersive experiences. Connect immersive elements to your broader design system for coherence. The skills you build prototyping for immersive experiences will become increasingly valuable as these technologies mature.