Glossary of Terms

Core Concepts

Spatial Computing: A human-computer interaction paradigm that blends the digital and physical worlds. It allows machines to understand and interact with the 3D space around us, and for users to interact with digital information as if it were part of their physical environment. Think of it as the internet breaking free from the 2D screen and entering our 3D world. 🌍

Extended Reality (XR): An umbrella term that encompasses all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables. It includes Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).

Mixed Reality (MR): An advanced form of XR where digital and physical objects co-exist and interact in real-time. MR experiences are anchored to the real world, meaning a virtual object can be placed on a real table and will stay there as you walk around it. This is the foundation of true spatial computing.

Augmented Reality (AR): The overlaying of computer-generated images and information onto the real world, typically viewed through a device like a smartphone camera or smart glasses. AR enhances our perception of reality but doesn't usually allow for deep interaction between the digital and physical. Think of Pokémon GO or a filter on Instagram.

Virtual Reality (VR): A completely immersive, computer-generated simulation of a 3D environment that users can interact with in a seemingly real way. VR completely replaces the user's real-world surroundings with a digital one, typically experienced through a fully enclosed headset.

© Avi Bar-Zeev

Hardware & Devices

Head-Mounted Display (HMD): The general term for any display device worn on the head. VR and MR headsets, as well as AR glasses, are all types of HMDs.

Smart Glasses: A wearable computing device in the form of eyeglasses. Early versions focused on displaying notifications and simple information in the user's field of view.

Video See-through Headsets: These are fully opaque headsets, like traditional VR headsets, that use a set of external cameras to capture a live video feed of the real world. This video is then displayed on the screens inside the headset, and digital objects are rendered on top of it. This method is also commonly known as Passthrough AR or Mixed Reality (MR).

  • How it Works: You are not looking directly at the world; you are looking at a screen that is showing you the world.

  • Strengths: Allows for a seamless blend between full VR and MR. Because the device controls every pixel you see, it can make virtual objects look completely solid and opaque, perfectly blocking out the real world behind them.

  • Examples: Meta Quest 3, Meta Quest Pro, Apple Vision Pro.

Optical See-through Headsets: These devices use transparent lenses or displays that you look through directly, like a normal pair of glasses. Digital information (images, text, models) is projected onto a special layer on the lenses (often a waveguide) and directed into your eye. The digital content appears to be "holographically" overlaid on your direct view of the real world.

  • How it Works: You are looking directly at the world, with digital light layered on top.

  • Strengths: Provides a true, unfiltered view of reality, which is critical for tasks requiring precise real-world interaction. There is no lag or distortion from a camera feed.

  • Examples: Microsoft HoloLens 2, Magic Leap 2.

AR Glasses: This category represents the goal of shrinking optical see-through technology into a lightweight, socially acceptable form factor that looks and feels like a standard pair of eyeglasses. The focus is on providing glanceable, contextual information rather than fully immersive 3D models.

  • How it Works: Same principle as optical see-through, but with miniaturized components to prioritize all-day wearability and subtle aesthetics.

  • Strengths: Designed to be worn for extended periods, providing convenient, heads-up information like notifications, directions, and real-time translation.

  • Examples: Snap Spectacles (the advanced AR versions), Vuzix Blade, and Meta's long-term research effort codenamed 'Project Orion'.

AI Glasses: AI Glasses are smart glasses that prioritize artificial intelligence, audio, and camera input over a visual display. While they look like regular glasses, their primary function is to act as a hands-free AI assistant that can see what you see and hear what you hear.

  • How it Works: They use cameras to identify objects and text, microphones for voice commands, and speakers (often open-ear or bone-conduction) to give you information. They typically do not have a display.

  • Strengths: Excellent for capturing moments, live streaming, and getting real-time information from an AI about your surroundings without a screen getting in the way.

  • Examples: Ray-Ban Meta Smart Glasses, Brilliant Labs Frame.

Waveguide: A key piece of optical technology used in many AR and MR glasses. It's a thin, transparent piece of glass or plastic that "guides" light from a micro-display into the user's eye, making the digital image appear overlaid on the real world. This is how smart glasses can look like regular glasses.

See-Through Display: A type of display, often using waveguides, that allows the user to see the physical world through the lens while also viewing digital content. This is essential for AR and MR devices.

Pancake Lens: An advanced, compact optical lens assembly used in modern XR headsets like the Meta Quest 3 and Apple Vision Pro. Unlike older, bulkier Fresnel lenses, pancake lenses fold the light path, allowing for a much thinner and lighter headset design (a smaller gap between the lenses and the display), which improves user comfort.

Micro-OLED: A type of display technology that places OLEDs (Organic Light Emitting Diodes) on a silicon chip. This results in incredibly small pixels, leading to extremely high pixel densities (pixels per inch). Used in devices like the Apple Vision Pro, they produce super-sharp, bright images with deep blacks, eliminating the "screen door effect" seen in older headsets.

Key Market Devices

Note: The XR market evolves rapidly. Release dates and specs are accurate as of September 2025.

Samsung ‘Project Moohan’

  • Launch Price: Unannounced (Speculated to be in the $2,000 - $2,500 range)

  • Category: Video See-through Headset (XR)

  • Description: 'Project Moohan' is the widely reported codename for Samsung's upcoming high-end XR headset. It is being developed in a major partnership with Google (providing the Android XR operating system) and Qualcomm (providing the chipset) and is expected to be a direct competitor to the Apple Vision Pro.

  • Release Date: Unannounced (Expected to be unveiled in late 2025)

Oakley Meta Glasses

  • Launch Price: Starting at $329

  • Category: AI Glasses

  • Description: Part of the same product family as the Ray-Ban Meta glasses, these devices feature the same core Meta AI technology but are built into sporty Oakley frame designs.

  • Release Date: August 2025

Meta Quest 3S

  • Launch Price: $299

  • Category: Video See-through Headset (MR)

  • Description: Officially unveiled at Meta Connect in September 2024, the Meta Quest 3S is designed to replace the Quest 2 as the company's entry-level offering. To achieve its lower price point, it uses the same powerful Snapdragon XR2 Gen 2 chip as the Quest 3 but compromises on other components. It features a lower-resolution LCD display and uses older Fresnel lens technology, making it slightly bulkier than the Quest 3. It retains color passthrough for basic mixed reality but is positioned primarily as a powerful, affordable VR gaming device.

  • Release Date: October 15, 2024

Apple Vision Pro

  • Launch Price: Starting at $3,499

  • Category: Video See-through Headset (Spatial Computer)

  • Description: Apple's high-end mixed reality headset designed for productivity, entertainment, and communication. It features ultra-high-resolution micro-OLED displays, a sophisticated sensor array, and is controlled entirely by the user's eyes, hands, and voice. It runs on visionOS.

  • Release Date: February 2, 2024

Ray-Ban Meta Smart Glasses

  • Launch Price: Starting at $299

  • Category: AI Glasses

  • Description: A collaboration between Meta and EssilorLuxottica, these glasses integrate a camera, open-ear audio, and Meta AI into a classic Ray-Ban form factor. They allow users to take photos/videos, listen to music, and interact with an AI assistant hands-free, but they do not have a visual display for AR overlays.

  • Release Date: October 17, 2023

Meta Quest 3

  • Launch Price: Starting at $499

  • Category: Video See-through Headset (MR)

  • Description: The successor to the immensely popular Quest 2. The Quest 3 is a consumer-focused device that offers a significant leap in mixed reality capabilities thanks to high-quality color passthrough and a built-in depth sensor. It uses pancake lenses for a slimmer profile.

  • Release Date: October 10, 2023

Meta Quest Pro

  • Launch Price: $1,499 (Later reduced to $999)

  • Category: Professional / Prosumer Video See-through Headset (MR)

  • Description: A premium headset from Meta aimed at professionals and developers. It was one of the first consumer devices to feature high-quality color passthrough, eye tracking, and face tracking for more realistic social avatars. It pioneered many of the technologies now found in the Quest 3.

  • Release Date: October 25, 2022

Meta Quest 2

  • Launch Price: $299

  • Category: VR Headset

  • Description: The device largely responsible for bringing VR to the mainstream. A fully self-contained VR system, it offered an accessible price and a vast library of games and apps, making it the best-selling VR headset to date.

  • Release Date: October 13, 2020

Platforms & Operating Systems

visionOS: Apple's spatial operating system that powers the Apple Vision Pro. Built on the foundation of macOS and iOS, visionOS features a three-dimensional user interface that users navigate with their eyes, hands, and voice. Apps can be placed and anchored in the physical space around the user.

Horizon OS: The operating system developed by Meta for its Quest line of headsets (formerly known as the Quest Platform or Quest OS). In 2024, Meta announced it was opening up Horizon OS for third-party hardware manufacturers to build their own headsets, aiming to create a more open ecosystem.

Android XR: An XR operating system from Google, built on Android. Similar to how Android operates on smartphones, Android XR is being developed as a foundational operating system for a wide range of third-party devices. It is the platform powering the upcoming headset from the Samsung-Google-Qualcomm partnership.

Key Technologies & Software

6DoF (Six Degrees of Freedom): Refers to the ability to track movement in 3D space. It includes three rotational axes (pitch, yaw, roll) and three translational axes (moving forward/backward, up/down, left/right). 6DoF tracking is crucial for realistic VR and MR experiences, as it allows you to physically walk around and look around in the virtual or mixed environment.
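The six axes can be sketched as a simple data structure. This is illustrative only: real engines typically store the rotation as a quaternion rather than Euler angles, and the field names here are invented.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Three translational axes (metres)
    x: float = 0.0      # left/right
    y: float = 0.0      # up/down
    z: float = 0.0      # forward/backward
    # Three rotational axes (degrees)
    pitch: float = 0.0  # tilt up/down
    yaw: float = 0.0    # turn left/right
    roll: float = 0.0   # tilt side to side

# A 3DoF tracker would update only pitch/yaw/roll; 6DoF tracking also
# updates x/y/z as the user physically walks around the room.
head = Pose6DoF()
head.z -= 0.5    # user steps half a metre forward
head.yaw += 90   # and turns to the right
print(head)
```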

SLAM (Simultaneous Localization and Mapping): A technology that allows a device to build a map of an unknown environment while simultaneously keeping track of its own location within that map. This is how headsets and glasses understand the layout of your room, identifying floors, walls, and furniture.

Digital Twin: A virtual model of a real-world object, system, or process. In spatial computing, you could have a digital twin of a jet engine that an engineer can inspect and interact with in MR as if the real engine were right in front of them.

World Anchor: A specific point in the physical world that is used to "anchor" digital content. This ensures that a virtual object placed in a room will remain in the same spot, even if you leave the room and come back later.
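A toy sketch of why content is stored relative to an anchor rather than in raw world coordinates: when tracking refines the anchor's estimated position, everything attached to it follows. All names and numbers here are made up for illustration.

```python
# Positions are illustrative (x, y, z) tuples in metres.
def world_position(anchor_pos, local_offset):
    """World position of content stored relative to an anchor."""
    return tuple(a + o for a, o in zip(anchor_pos, local_offset))

anchor = (2.0, 0.0, -1.0)     # estimated position of a table corner
cup_offset = (0.1, 0.0, 0.2)  # a virtual cup, stored relative to the anchor

before = world_position(anchor, cup_offset)

# Tracking later corrects the anchor estimate by a few centimetres; the
# cup moves with it automatically, staying "glued" to the real table
# across drift corrections and sessions.
anchor = (2.03, 0.0, -1.02)
after = world_position(anchor, cup_offset)
print(before, after)
```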

Volumetric Video: A technique that captures a three-dimensional space in video. Instead of a flat, 2D video, it creates a 3D model that you can walk around and view from any angle, often called a "hologram."

Passthrough: A feature on a VR or MR headset that uses external cameras to show the user a real-time video feed of their physical surroundings. Monochromatic passthrough displays the world in black and white, while Color passthrough shows it in full color. High-quality, low-latency color passthrough is the key technology that enables mixed reality on otherwise opaque headsets.

3D Models & Scene Representation

Gaussian Splatting (GSplats): A cutting-edge technique used to render photorealistic 3D scenes in real-time. Instead of using traditional polygons (meshes), it represents a scene as a collection of millions of tiny, colorful, and semi-transparent particles called Gaussians. This method allows for incredibly fast and realistic rendering of complex environments captured from real-world photos or videos.
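The core rendering idea can be shown in a toy greyscale sketch. A real renderer evaluates millions of anisotropic 3D Gaussians on a GPU; this only illustrates the front-to-back blending of sorted, semi-transparent particles covering one pixel.

```python
def composite(splats):
    """Front-to-back alpha compositing of (colour, alpha) pairs that
    cover one pixel, sorted nearest-first. Greyscale for simplicity."""
    colour, transmittance = 0.0, 1.0
    for c, a in splats:
        colour += transmittance * a * c  # contribution, attenuated by what's in front
        transmittance *= (1.0 - a)
        if transmittance < 1e-4:         # early exit once the pixel is effectively opaque
            break
    return colour

# A bright splat in front of a darker one: the front splat dominates.
# 1.0*0.7 + (1 - 0.7)*0.9*0.2 = 0.754
print(round(composite([(1.0, 0.7), (0.2, 0.9)]), 3))
```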

Neural Radiance Fields (NeRFs): An artificial intelligence method for creating stunning 3D scenes from a collection of 2D images. A neural network learns how light behaves within a scene, allowing it to generate new, highly detailed views from any angle. While computationally intensive, NeRFs are known for their exceptional photorealism, especially with reflections and transparent objects.

glTF (GL Transmission Format): Often called the "JPEG of 3D," glTF is a standard, open-source file format designed for the efficient transmission and loading of 3D scenes and models. It's optimized for use on the web and in applications, packing geometry, textures, and animations into a compact file, making it perfect for XR experiences.
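The format's shape is easy to see in a hand-written skeleton: a glTF asset is a JSON document whose top-level arrays reference each other by index. This minimal sketch omits the binary buffers where geometry data actually lives.

```python
import json

# Skeleton of a minimal glTF 2.0 asset. Each array is referenced by
# index: the scene points at node 0, which points at mesh 0.
gltf = {
    "asset": {"version": "2.0"},
    "scene": 0,
    "scenes": [{"nodes": [0]}],              # scene 0 contains node 0
    "nodes": [{"mesh": 0, "name": "Cube"}],  # node 0 points at mesh 0
    "meshes": [{"primitives": []}],          # primitives would reference
                                             # accessors into binary buffers
}

print(json.dumps(gltf, indent=2))
```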

USDZ (Universal Scene Description Zipped): A 3D file format developed by Apple in collaboration with Pixar, based on Pixar's Universal Scene Description (USD) technology. USDZ is a zipped archive containing all the necessary files (models, textures, animations) in a single package. It's highly optimized for sharing and for quick-loading augmented reality experiences on Apple devices like the iPhone, iPad, and Vision Pro.

User Interaction & Experience

Hand Tracking: The ability for a device to see and understand the position and gestures of a user's hands without needing physical controllers. This allows for natural, intuitive interaction with digital objects using your own hands.

Eye Tracking: Technology that monitors the position and movement of a user's eyes. It can be used as an input method (e.g., selecting an object by looking at it) and for foveated rendering.

Foveated Rendering: An optimization technique that takes advantage of eye tracking. The device renders the part of the scene the user is directly looking at in high resolution, while rendering the peripheral areas in lower resolution. This saves computational power and improves performance without the user noticing a difference in quality.
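The policy reduces to mapping angular distance from the gaze point to a resolution scale. The thresholds and scale factors below are invented for illustration; real systems implement this on the GPU, e.g. via variable-rate shading.

```python
def shading_rate(eccentricity_deg):
    """Resolution scale for a screen region, given its angular
    distance from the gaze point. Thresholds are illustrative."""
    if eccentricity_deg < 5:    # fovea: full detail
        return 1.0
    if eccentricity_deg < 20:   # near periphery: half resolution
        return 0.5
    return 0.25                 # far periphery: quarter resolution

print(shading_rate(2))   # region the user is looking at: full resolution
print(shading_rate(30))  # far periphery: quarter resolution
```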

Gaze and Dwell: An interaction method where a user looks at a button or object (gaze) and holds their gaze on it for a moment (dwell) to select it. It's a common hands-free way to navigate menus in XR.
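The interaction boils down to a small state machine: restart a timer whenever the gaze moves, fire when it has rested long enough. This is an illustrative sketch; the 0.8 s dwell threshold is an assumption, and real UIs tune it per control.

```python
DWELL_TIME = 0.8  # seconds; assumed threshold for illustration

class DwellSelector:
    def __init__(self):
        self.target = None
        self.gaze_start = None

    def update(self, looked_at, now):
        """Feed one gaze sample; returns the selected target or None."""
        if looked_at != self.target:
            # Gaze moved to a new target (or away): restart the timer.
            self.target, self.gaze_start = looked_at, now
            return None
        if looked_at is not None and now - self.gaze_start >= DWELL_TIME:
            self.gaze_start = now  # fire once, then re-arm
            return looked_at
        return None

sel = DwellSelector()
assert sel.update("play_button", now=0.0) is None  # gaze lands on button
assert sel.update("play_button", now=0.5) is None  # still dwelling
assert sel.update("play_button", now=0.9) == "play_button"  # dwell complete
```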

Field of View (FoV): The extent of the observable world that is seen at any given moment. In XR, a wider FoV leads to a more immersive experience, as it fills more of your peripheral vision. It is typically measured in degrees.
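For a flat screen, the horizontal angle subtended follows directly from the screen width and viewing distance: FoV = 2 · atan(width / (2 · distance)). Headset optics are more complex than this, but the relationship gives a feel for the numbers.

```python
import math

def horizontal_fov_deg(width_m, distance_m):
    """Horizontal angle (degrees) subtended by a flat screen of the
    given width, viewed head-on from the given distance."""
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

# A virtual 2 m-wide screen viewed from 2 m away:
print(round(horizontal_fov_deg(2.0, 2.0), 1))  # 53.1
```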