
What are immersive technologies?

This explainer covers virtual reality (VR), augmented reality (AR), mixed reality (MR) and immersive virtual worlds (IVWs).

Cami Rincon, Jorge Perez

5 March 2025

Reading time: 55 minutes


Background

Immersive technologies are a group of emerging technologies that all share a common aim: to create an experience for users that mediates their perception of their physical environment. While many immersive technologies are developed for specific uses like gaming or social interaction, organisations in a diversity of sectors have adapted them for uses ranging from healthcare treatment to workplace training. Despite a recent waning of investment interest in immersive technologies[1] and the underperformance of major products like the Apple Vision Pro, the technical ability of these technologies to simulate virtual environments or overlay visual or aural information onto the physical environment continues to develop.

As they advance, immersive technologies have increasingly utilised more sophisticated hardware, which, in turn, collects more data points from the user and their environment, such as head and hand movements, the scanning of a person’s physical environment, and heart rate variability. The collection of this data helps users to have a more immersive experience in virtual or augmented environments. However, it also opens the door to a host of potential harms and risks related to the data lifecycle of these systems.

Given that these technologies may be used in a variety of settings, regulators may find they need to determine how to apply existing regulatory guidance or whether new guidance may be needed to address the risks – be it for medical device regulation, employment law or online safety regulation.

Effective governance of these systems requires a common understanding of their capabilities, technical components and data lifecycle. This explainer seeks to address this need and provide a common understanding of immersive technologies by describing the essential terms and concepts, as well as the kinds of data, software and hardware that are used.

We conducted a literature review and 26 interviews with researchers, developers, investors and practitioners who are using, building and deploying different kinds of immersive technologies in the UK, Europe and the US.

Fostering a shared understanding of these technologies will help policymakers and regulators to better grasp their capabilities and risks, enabling more informed and effective policymaking in this domain.

Key findings:

  • Defining immersive technologies is complicated by a lack of consistency in the field. Multiple terms – such as spatial computing, extended reality and the metaverse – are used to describe technologies that fall under the same broad category. Further complicating this categorisation is the absence of a clear definition of what makes a technology ‘immersive’. Some interviewees emphasised that immersion depends on the user’s perception of the environment, while others argued it is determined by a device’s technical capabilities and its ability to realistically simulate user interactions with an environment.
  • We identify four types of immersive technologies: augmented reality (AR), mixed reality (MR), virtual reality (VR) and immersive virtual worlds (IVWs) – the latter two of which are closely related. Some terms, such as AR and VR, were consistently defined by interviewees, whereas others, such as MR and IVWs, were much more contested.
  • Despite their differences, immersive technologies often share many hardware and software elements such as cameras, motion sensors, visual display devices and rendering engines. However, this can vary significantly depending on the use case.
  • Depending on their sophistication and use of wearables, immersive technologies can collect significantly more data than traditional devices such as laptops or smartphones, including physiological data, environmental data, positional data and user profile data.
  • Interviewees noted that some of this data is essential to the functioning of these systems, particularly for features such as the movements of avatars, yet they questioned the necessity of companies collecting and storing this data. The large amount of data these devices can collect, alongside the increasing amount of data required to use them, makes this element of immersive technologies a priority concern. This issue is exacerbated by the lack of transparency in data collection and processing by many companies, including how data collected by immersive technologies may be used for other purposes. These elements point to potential regulatory gaps.
  • The data collected through immersive technology products can consequently be used for user profiling and personalised content delivery. Retained data can also in some cases be utilised for purposes outside its direct intended use, such as training AI models. The retention and reuse of data raises significant questions about data protection, transparency, user autonomy and privacy.

This explainer will be followed by two other publications from the Ada Lovelace Institute – a discussion paper and an impact assessment report. The discussion paper will explore social factors related to the development and adoption of immersive technologies, and provide insight into the timeline and current state of play of immersive technology products, including key players, industries and use cases.

The impact assessment report will provide an in-depth evaluation of the impacts (risks and benefits) of these technologies, drawing on interviews, a literature review and deliberative workshops. This report will ground discussions around the impacts of these technologies through two use cases – immersive social worlds and the use of augmented reality technologies by warehouse employees. This final report will include policy recommendations for how UK policymakers and regulators might consider regulating these systems.

Introduction

Many people use technology to communicate, learn, work, purchase products and entertain themselves. The world we live in is one where our daily interactions are increasingly mediated by technologies that curate, recommend and synthesise information. While most of this mediation happens through digital devices like a smartphone or laptop, developers of one type of technology – immersive technology – are seeking to further blur the boundaries between what happens in the physical world and digital world.

Immersive technology is a broad umbrella term for interactive technologies like virtual reality (VR), augmented reality (AR), mixed reality (MR) and immersive virtual worlds (IVWs). While each of these technologies is different in crucial ways, they share a common aim to produce an immersive and interactive experience for a user.

Technologies covered in this explainer

 

Virtual reality (VR): a computer-generated 3D virtual environment that a user interacts with through devices (typically a headset) and that reduces their awareness of the physical world.

 

Augmented reality (AR): the use of devices (such as smartphones or smart glasses) to ‘augment’ a user’s perception and interaction with the physical world (e.g. using smart glasses to display information to a warehouse worker about which aisle a shipment is in).

 

Mixed reality (MR): an immersive technology that combines both VR and AR in the same device and allows users to interact with parts of both the virtual and physical worlds.

 

Immersive virtual worlds (IVWs)/metaverses: a virtual environment where users can interact with other users, virtual objects and the virtual environment.

Large tech companies like Meta and Apple have made significant investments in these technologies, promising that immersive technology will ‘revolutionise’ how we work, learn, socialise and entertain ourselves.[2] But, just as with the introduction of other technologies into our lives, this comes with challenges and opportunities that require careful consideration from policymakers, technologists and society.

For many people, immersive technologies are most associated with gaming and immersive virtual worlds, sometimes referred to as ‘the metaverse’. The rebranding of the company Facebook to Meta in 2021 (bringing together its apps and technologies under one brand) catapulted the term ‘metaverse’ from science fiction novels to everyday vocabulary. During Meta’s first two years, private-sector investment in immersive virtual worlds surged.[3]

Despite this initial investment, immersive virtual worlds did not achieve widespread adoption. In 2023, several companies like Microsoft, Disney and Walmart shut down or downsized their metaverse teams and products. Even Meta officials such as Chief Technology Officer Andrew Bosworth admitted that senior Meta leadership were spending more time on AI than the metaverse.[4] For the most part, developers and investors have instead turned their attention to foundation models and generative AI, which many believe promise greater potential for adoption for a wider set of business and consumer use cases.[5] However, beyond immersive virtual worlds, other immersive technologies have continued to be developed and integrated in business and consumer contexts, as exemplified by the launch of Apple Vision Pro in February 2024 and Meta’s announcement of the Orion AR glasses in September 2024.[6] Despite the relative waning of initial interest, there remains a significant amount of investment.

The risks of immersive technologies may arise in many ways – from the way these technologies function to social factors such as the dynamics among those developing these technologies, where these technologies are adopted and who they interact with. The severity and likelihood of the risks of these technologies will depend on: how they are designed and operated, what purpose they are used for, what contexts they are used in and what kind of people will be affected.[7]

This explainer seeks to address the first of these points – how are these technologies designed and operated? These contexts are important to understand, especially for policymakers and regulators, as even a cursory review shows these technologies raise concerns across a wide range of regulatory areas.

One area is data protection. At a technical level, immersive technologies operate through a process of data creation and processing that often involves specialised hardware products that collect a significantly higher volume of data than devices such as smartphones and laptops, including a diversity of sensitive data.[8] This creates significant privacy and data-protection risks that developers must navigate,[9] which can be exacerbated by a lack of transparency about data collection and processing in immersive technology products and how data may be used for other purposes.[10]

Another area is algorithmic harms, such as AI-enabled discrimination. Immersive technologies use a range of AI algorithms to process data, which may fall under existing guidance or regulation on AI from sectoral regulators in the UK. Depending on their use case and context, some immersive technologies may fall under the remit of the EU AI Act. Examples of these risks include algorithms within immersive technology products that have been found to perpetuate bias and discrimination,[11] such as algorithms used within VR applications to score individuals. Another example is user profiling algorithms that provide personalised content, which can raise concerns around user autonomy.[12]

Regulators will also need to consider what kinds of users may be affected and the contexts and functions for which these technologies may be deployed. Immersive technologies are currently being used for consumer and business purposes in a variety of high-impact and safety sectors including healthcare[13] and education.[14] Some of these uses are marketed towards particularly vulnerable groups like children or people experiencing health issues.[15] The high-impact nature of these contexts and the vulnerable status of some of the key users of these technologies may exacerbate the severity of potential harms. This can include the use of headsets leading to physical accidents,[16] which can be particularly dangerous in healthcare settings, or online harms such as harassment and abuse, which can be targeted towards children.[17]

Consumer products such as online games or immersive virtual worlds have significant overlap with social media, and fall under the remit of the Online Safety Act 2023 and consumer protection legislation. Other products may fall under the remit of existing sectoral regulation. Health or wellbeing devices, for example, may fall under medical device regulation.

One problem for the governance of immersive technologies is there is no common vocabulary or understanding of what these technologies are, how they work, what kinds of data they collect and what kinds of technical components they use. Through a shared understanding of how these technologies operate, policymakers and regulators can better judge how immersive technologies may be used in different sectors and what kinds of issues they may raise.

Glossary

Algorithm: a set of instructions for how a computational system can execute a task or solve a problem by mapping inputs to outputs.

Augmented reality (AR): the use of devices (such as smartphones or smart glasses) to ‘augment’ a user’s perception and interaction with the physical world (e.g. using smart glasses to display information to a warehouse worker about which aisle a shipment is in).

Avatar: an active virtual representation of a user in a virtual world, where the user can control the avatar through devices, sensors and controllers to move around the virtual world.

Blockchain: a type of database or digital ‘ledger’ that stores data in a distributed way.

CAVE systems: Cave automatic virtual environment (CAVE) systems are spaces using screens and projections in a room-like environment to create an immersive experience. Users can interact with the virtual environment through 3D glasses while being tracked by room sensors.[18]

Cloud computing: a method of using computer resources (such as data processing or storage) hosted remotely through the internet.

Cross-reality: the use of applications and devices that allow users to switch between interacting in the physical and virtual worlds.

Cryptocurrency: a virtual or digital currency typically run through a decentralised cryptographic system.

Digital identity: a digital representation of information about a person or organisation that allows them to prove who they are.

Digital twins: a digital model that simulates a physical object or system.

Electromyography (EMG) sensors: sensors which detect the electrical activity in muscle tissue to understand intended motion. These have been tested in wrist wearables to allow for better control in virtual environments.[19]

Extended reality (XR): a blanket term used to describe the group of emerging technologies including virtual reality (VR), augmented reality (AR), mixed reality (MR) and immersive virtual worlds (IVWs).

Facial recognition technology (FRT): the use of technology to detect a face and identify a person based on an existing database in real time by analysing faces on live video, as in live facial recognition (LFR). See the Ada Lovelace Institute’s ‘Countermeasures’ report for more detail.[20]

General-purpose AI (GPAI): an emerging type of AI capable of a range of general tasks (such as text synthesis, image manipulation and audio generation). Notable examples are OpenAI’s GPT-3 and GPT-4 models which underpin ChatGPT and many other applications via OpenAI’s application programming interface (API).

GPAI can work across many complex tasks and domains and can exhibit unpredictable and contradictory behaviour when prompted by human users. It can also be built ‘on top of’ to develop applications for many different purposes. It contrasts with ‘narrow AI’, which focuses on a specific or limited task, such as predictive text or image recognition.

GPAI systems are sometimes referred to as ‘foundation models’. See the Ada Lovelace Institute’s ‘What is a foundation model?’ explainer for more detail.[21]

Generative AI (GenAI): refers to AI systems usually but not always built ‘on top of’ general-purpose AI models that can generate new content based on user inputs or prompts. This includes generating new content such as images, video, text or audio.

Some GenAI applications are built using general-purpose AI, for example OpenAI’s DALL-E image generator, but it is important to note that not all generative AI is general-purpose AI: it can also be designed as narrow AI for specific purposes.

Haptic technologies: devices that react to, detect or simulate touch for a user. Examples include touchscreens, game controllers that vibrate, haptic gloves and haptic vests.

Headset or head-mounted display: devices worn on the head that display virtual content as well as collect data from the user (such as tracking head position or eye-tracking). Examples include VR headsets or smart glasses.

Holography: a technique that uses light to produce a three-dimensional image.

Immersive technologies: a term describing a group of technologies that aim to create an immersive experience. This typically includes emerging technologies such as virtual reality (VR), augmented reality (AR), mixed reality (MR) and immersive virtual worlds (IVWs).

Immersive virtual worlds (IVWs)/metaverses: a virtual environment where users can interact with other users, virtual objects and the virtual environment.

Mixed reality (MR): an immersive technology that combines both VR and AR in the same device and mixes parts of the virtual and physical worlds.

Motion sensors: devices which track how a user’s body is moving. For example, motion sensor tags worn on wrists.

Neurotechnologies: devices which interact with a person’s nervous system. For example, neural implants and EEG (electroencephalography) headsets.

NFTs (non-fungible tokens): a unique digital identifier recorded on a blockchain to certify ownership of a digital asset, which can be traded.

Operating systems: a system that manages the hardware and software resources of a computational system.

Pass-through: a feature on some headsets where cameras on the front of a device record the physical world and display it simultaneously to a user so it appears as if they are looking through the screen.

Platform: in this explainer, ‘platform’ is used to describe an online system providing the environment and tools for developing and managing immersive technology applications, and for users to interact with these applications.

Rendering: a software feature that updates a visual scene so it feels like a user is interacting with the environment in real time and makes movement in a virtual world feel more natural.

Scene/object generation: software which generates virtual visual representations of environments and objects for users to interact with.

Scene/object recognition: software which recognises a user’s environment and detects and identifies objects a user can see.

Smart glasses: head-mounted devices, normally worn like glasses, which use sensors and user inputs to perform other functions for a user, such as recording information. These can be AR devices, but not all smart glasses have a digital AR layer (for example, Ray-Ban Meta smart glasses).

Sound/speech recognition: software which analyses and recognises sound or speech, including a user’s voice.

Sound/speech synthesis: software which generates audio and speech content.

Spatial computing: used to describe a new type of human-computer interaction in which a computer reacts in real time to a user’s actions and environment. It is sometimes used as a blanket term instead of immersive technologies and XR, but it is not used as frequently.

Virtual reality (VR): a computer-generated 3D virtual environment that a user interacts with through devices (typically a headset) and that reduces a user’s awareness of the physical world.

Virtual world: shared and simulated digital spaces inhabited, shaped and engaged with by avatars.[22]

Virtuality–reality spectrum: a spectrum for describing the degree of immersion of different immersive technologies (see Figure 1).

Wearable technologies: devices and sensors worn by a user, which can be used to create an immersive experience.

What are immersive technologies?

Immersive technologies are a group of emerging technologies that all share a common aim: to create an experience for users that mediates their perception of their physical environment.

For the purposes of this paper, we focus on four types of immersive technologies: augmented reality (AR), mixed reality (MR), virtual reality (VR) and immersive virtual worlds (IVWs). These technologies differ in a variety of ways but share broad similarities in how they work and what components they include. They will typically use:

  • A hardware interface (such as headsets and tablets) that includes input devices (sensors) to collect data from a user and/or the physical environment, and output devices (such as visual displays) to deliver content to users.
  • Computing power, such as from smartphones, headsets or computing infrastructure.
  • Software, such as real-time processing algorithms to generate content that creates an immersive experience for a user.

Broadly speaking, immersive technologies all use input devices to collect data, software to process the data and generate content, and output devices to deliver content to users. Different kinds of immersive technologies rely on different kinds of hardware and software. These technologies can also capture different kinds of data, like biometric data and environmental data.
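The input–software–output pipeline described above can be sketched as one tick of a processing loop: sensors feed in data, software turns it into a scene update and an output device delivers the result. The sketch below is purely illustrative – the class and function names are our own invention, and real devices expose far richer (and proprietary) APIs:

```python
from dataclasses import dataclass

# Hypothetical sensor reading, standing in for the kinds of data points
# immersive hardware collects (head and hand movements, environmental scans).
@dataclass
class SensorFrame:
    head_position: tuple   # (x, y, z) from headset motion sensors
    hand_position: tuple   # (x, y, z) from controllers or hand tracking
    room_depth_map: list   # simplified environmental scan, distances in metres

def process(frame: SensorFrame) -> dict:
    """Software layer: turn raw sensor data into a scene update."""
    return {
        "camera_pose": frame.head_position,
        "avatar_hand": frame.hand_position,
        # Flag objects closer than 1 metre so the system can warn the user.
        "obstacles": [d for d in frame.room_depth_map if d < 1.0],
    }

def render(scene: dict) -> str:
    """Output layer: stand-in for the visual display delivering content."""
    return (f"camera at {scene['camera_pose']}, "
            f"{len(scene['obstacles'])} obstacle(s) flagged")

# One tick of the loop: input device -> software -> output device.
frame = SensorFrame(head_position=(0.0, 1.7, 0.0),
                    hand_position=(0.2, 1.2, 0.4),
                    room_depth_map=[0.8, 2.5, 3.1])
print(render(process(frame)))
```

Even in this toy form, the loop makes the data-protection point concrete: simply keeping the experience running requires a continuous stream of positional and environmental data about the user and their surroundings.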

Why is it difficult to define immersive technologies?

One challenge with defining immersive technologies is that companies and researchers use different terms to describe the same broad grouping of technologies. These include umbrella terms such as extended reality (XR) and spatial computing.

Table 1: Different terms used interchangeably with immersive technologies

Term | Definition | Who uses it
Spatial computing | Used to describe a new type of human-computer interaction in which a computer reacts in real time to a user’s actions and environment.[23] | Apple
Extended reality (XR) | A blanket term used to describe augmented reality (AR), virtual reality (VR), mixed reality (MR) and everything in between.[24] | Qualcomm
Metaverse | A three-dimensional online environment in which users represented by avatars interact with each other in virtual spaces decoupled from the real physical world.[25] | Meta
Virtual world | Shared and simulated digital spaces inhabited, shaped and engaged with by avatars.[26] | PlayStation/Sony

Many interviewees criticised the lack of standardised language, which can lead to confusion and frustration:[27] ‘The language in this area keeps changing all the time. It was immersive, then it was VR or XR, now Apple is using spatial computing […] it’s always a pain and the language changes very often.’[28] This can mislead people to think these terms refer to different technologies, when they are often used to refer to the same set of technologies.

It can also be challenging to define what constitutes an immersive technology, as there are disagreements in the academic literature and among developers about what qualifies a technology as ‘immersive’.

Some practitioners focus on the impact that a technology or set of technologies has on a user’s perception of their environment. In this view, technologies are immersive if their use results in an experience of feeling present in an environment other than one’s physical environment.[29] This could be through headsets that affect what a user perceives or the creation of immersive spaces (such as CAVE systems).[30]

For several interviewees,[31] user experience is key for defining whether a technology is immersive – even while recognising that this approach can be vague. One interviewee stated: ‘Immersive is a bit of a tricky word […] generally the aspiration is to have an experience where you might be putting people into “flow states” where they’re immersively, emotionally involved in something.’[32] Some participants compared this to reading a book, which can be an immersive experience for a reader.[33]

However, other interviewees disagreed with this comparison and suggested that while immersive technologies can trick your body into feeling something that is not there, an immersive book will not create the same sensation.[34] This disagreement highlights how ‘immersion’ is a contested concept with different meanings,[35] making it difficult to decide whether a technology qualifies based on user experience.

Other practitioners instead focus on the technical features of a technology when qualifying it as immersive. For example, some interviewees focused on real-time processing as a defining feature of immersive technologies, in which data is processed and immersive content is rendered at the same time as a person is using a technology.[36]

Some interviewees[37] also discussed what makes technologies more or less immersive in terms of how well they are able to realistically simulate the user interacting with an environment in the physical world.[38] For example, someone with this perspective may say that watching a film using a VR headset is more immersive than watching with 3D glasses, because the former is better able to simulate the experience of a user being in a scene of a film.[39]

The interviewees did not come to a clear consensus on how to define the unique features of immersive technologies. However, two important themes emerged. First, immersive technologies strive to create an immersive experience for a user, even if this is not always fully achieved. Second, the ability to do so depends on the various technical capabilities of a system, including the collection of user data through sensors and real-time processing, to create a seamless immersive user experience.

What are the different types of immersive technologies?

In this explainer, we focus on and distinguish between four types of immersive technologies:

  1. augmented reality (AR)
  2. mixed reality (MR)
  3. virtual reality (VR)
  4. immersive virtual worlds (IVWs).

We explore how each of these technologies work and their data lifecycle – what kinds of data they collect, how data is processed and how it can be reused and retained – as a basis for understanding the potential risks and impacts emerging from the way immersive technologies work.

The differences between these technologies were often described by interviewees[40] and in academic literature as based on a spectrum of how much the user loses awareness of the physical world, called the ‘virtuality–reality spectrum’ (Figure 1).[41]

One interviewee cautioned that language used when discussing immersive technologies should not assume that virtual experiences are not real, as this can diminish the real harms that occur in virtual worlds, such as harassment.[42] To reflect this, throughout this explainer we contrast the virtual world with the physical world, rather than describing the physical world as the ‘real world’.

Figure 1: The virtuality–reality spectrum

What is augmented reality (AR)?

In augmented reality (AR), a user changes or ‘augments’ their perception and interaction with the physical world through a device. On the virtuality–reality spectrum, AR is closest to physical environments as users retain more contact with the physical world than through the other technologies explored in this explainer. For example, if a user is wearing an AR headset they will still be able to directly see through to the physical world, or if using a tablet or phone their vision is not restricted to what they can see on the screen.

Reflecting on this connection to the physical world, several interviewees described AR as overlaying digital information or adding a ‘digital layer’ to the physical world.[43] Interviewees gave several examples of how this digital layer could be added through smartphone apps (such as Pokémon Go,[44] Google Lens, Instagram or Snapchat filters[45]) or through specific devices such as AR smart glasses.[46]

In terms of hardware, AR is less strongly associated with headsets than other immersive technologies, with some interviewees describing AR as largely reliant on devices such as tablets and smartphones.[47] Smart glasses can also be used for AR. Some smart glasses provide a visual overlay, such as the Viture Pro XR,[48] and others deliver an audio overlay, such as the Ray-Ban Meta smart glasses. The latter creates an immersive experience despite lacking a visual overlay, because it uses input devices and software to collect and process data, and output devices to deliver content that alters the user’s auditory experience of their physical environment. For example, people can use the glasses’ built-in camera (input device) and personal assistant (software) to ask questions about objects in their environment.[49]

Some interviewees noted that QR codes,[50] a kind of barcode that can typically be ‘read’ by a smartphone to connect to websites, are commonly used as a method for accessing AR experiences. For example, supermarkets or retailers such as Nike have used QR codes in stores to facilitate AR experiences where users can ‘try on’ clothes virtually on their phone.[51] This highlights the common use of AR as a tool for interacting with digital content and accessing information about the physical world, with participants pointing to other cases such as the use of QR codes to access AR virtual guides for navigating cities.[52]

Figure 2: Augmented reality

What is mixed reality (MR)?

Mixed reality (MR) is an immersive technology that allows users to interact with elements of the virtual and physical worlds.[53] On the virtuality–reality spectrum, MR sits between AR and VR and mixes virtual content with connection to the physical world.

MR allows the user to interact with both the physical and virtual worlds at the same time through the use of headsets. This is done by combining aspects of both VR[54] and AR in the same device. While the hardware used for MR will typically be the same as a VR headset, MR differs from VR because it allows the user to still interact with the physical world. This is achieved with a feature called pass-through, where cameras on the front of a device record the physical environment and display this in real time in the headset so it appears as if the screen is transparent. However, unlike AR, the user is not directly interacting with the physical world.

Disagreements over MR

The term mixed reality (MR) does not clearly signify a specific type of technology or user experience for everyone, with some arguing we do not need the term at all.[55] Previous research has shown that some experts interpret MR as a catch-all term for any type of experience that exists between VR and AR, while others believe we will not make distinctions between different types of immersive technology in the future.[56]

One interviewee described this coalescence as ‘cross-reality’, where ‘applications can move across the gradient of immersion’ and users switch between interacting in the physical and virtual worlds.[57]

There is also a discrepancy between the terminology used by companies to describe their immersive technology products and how interviewees referred to their devices.

For example, the Apple Vision Pro headset would qualify as an example of MR under our definition, while Apple uses the term ‘spatial computing’. Unlike traditional computing, where a user inputs information and interacts through a digital interface, spatial computing collects and processes information about a user and their environments to perform computing tasks.

However, spatial computing is not a commonly used term in other parts of the tech industry. Most interviewees did not use the term and, when they did use it, it was often in reference to Apple products. One interviewee described this terminology choice as a way for Apple to avoid ‘all the baggage that comes with virtual reality’, explaining how ‘it’s just a matter of Apple trying to name things more for branding purposes […] at the end of the day, the Apple Vision Pro is a virtual reality headset with mixed reality pass-through’.[58]

Similarly, some devices which are typically referred to as ‘mixed reality headsets’ could also fall under AR. For example, Microsoft markets the HoloLens as an MR device. However, as the HoloLens overlays holographic images onto the physical world, it would fit into what is more typically considered AR.

Figure 3: Mixed reality

What is virtual reality (VR)?

In virtual reality (VR), a single user is immersed in a computer-generated 3D environment, which substantially reduces their awareness of the physical world. VR sits at the end of the virtuality–reality spectrum, as a user is furthest away from the physical environment and immersed in a virtual environment. This is typically achieved through head-mounted displays (often called headsets), which provide audio and visual information to make a user’s experience more immersive.

VR environments provide single-user experiences, such as engaging in VR education or therapy applications. These environments can closely mimic physical settings or create cinematic, artistic or fantastical experiences. Several interviewees, for example, described VR as a virtual space where users could theoretically undertake everyday tasks such as watching television, going to work or shopping.[59] Other VR environments make no attempt to mimic real life, such as Isness, which aims to provide therapeutic services through psychedelic experiences.[60]

While some VR headsets, such as the PlayStation VR2 and Pico 4, include features such as pass-through, these are typically used for convenience rather than as part of the immersive experience: for example, allowing a user to quickly look through the headset to pick up controllers while gaming. In these cases, this would not qualify as MR because the user cannot interact with elements of the physical and virtual worlds at the same time.

Figure 4: Virtual reality

What is an immersive virtual world (IVW)?

While VR technologies provide single-user experiences, an immersive virtual world (IVW) or metaverse is a type of VR that provides multi-user experiences and facilitates a more social virtual experience. In these virtual environments, users interact with other users, virtual objects and the virtual environment through avatars. An avatar is an active virtual representation of an individual in a virtual world, which can be controlled by the user through devices, sensors and controllers. Avatars can be cartoon-like, as in video games, or photorealistic.

Where IVWs fall on the virtuality–reality spectrum depends on the devices used. While VR headsets can be used to access IVWs, as per our definition this is not the only way to access them, and the immersiveness of an IVW also depends on the device used. For example, accessing an IVW through a desktop computer provides a much less immersive experience than a see-through MR headset, which in turn offers less immersion than a closed-back VR headset. Some IVWs can be accessed through all three devices (VR and MR headsets and desktop computers). VRChat, for example, offers different levels of immersion depending on the device through which it is accessed.

Immersive virtual worlds can vary in scale, from a small virtual environment for a specific community to a larger virtual environment open to anyone on the internet. One interviewee argued that immersive virtual worlds do not need to be interconnected: ‘A lot of people say Roblox isn’t a real metaverse, it’s not interoperable. What I would say is I think about it not just as the technology but I also think about the user experience and what are the users doing.’[61] This challenges the idea that IVWs need to be interoperable (meaning they can be accessed through adjacent IVWs) or of a certain scale, and instead includes virtual worlds that are smaller in scope and not connected to other worlds. IVWs discussed by interviewees included Meta’s Horizon Worlds,[62] Roblox,[63] VRChat[64] and Rec Room.[65]

Disagreements over IVWs

Interviewees used a range of terms to describe IVWs, such as immersive environments, virtual worlds, immersive spaces and immersive reality. However, ‘metaverse’ was used the most frequently, mirroring the term most people are likely to associate with IVWs.

There are mixed views on whether IVWs can be accessed through hardware such as laptops or require VR headsets to count as immersive. Some interviewees suggested IVWs could be accessed in many ways, such as by a mobile device or a video game, rather than just through a VR headset.[66] As one participant put it, these inputs are a kind of ‘means to an end’ for IVWs, with the end being immersive social experiences.[67] Other interviewees disagreed that IVWs that do not require VR headsets would qualify as immersive: ‘[There are] platforms that people call proto-metaverses like Roblox and Minecraft [that] you can access on a computer or tablet. But I wouldn’t necessarily call those immersive.’[68]

Figure 5: Immersive virtual worlds

How do immersive technologies work?

Every kind of immersive technology relies on a common set of components and features. It is critical that policymakers understand these components, how they work and how they can enable certain kinds of risks. A clear picture of how immersive technologies work can help with assessing what aspects are covered by existing law, what gaps exist in regulatory powers and what risks may require new controls and methods of governance.

Immersive technologies function through an iterative process of data creation and processing that enables the production of immersive content. This process relies on user interactions with hardware and sensors to create and collect data. This data is then used as inputs which are processed by software to produce immersive content as outputs. These outputs are then delivered through output devices.
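The iterative process described above can be expressed as a simple input, process, output loop. The sketch below is schematic: sensor names, values and the processing function are invented for illustration and do not reflect any vendor's API.

```python
# Schematic sketch of the immersive-technology processing loop:
# sensors create input data, software processes it, and output
# devices deliver immersive content. All names are illustrative.

def run_frame(sensors, process, outputs):
    """One iteration: collect inputs, process them, deliver to outputs."""
    inputs = {name: read() for name, read in sensors.items()}
    content = process(inputs)                       # software stage
    return {device: content for device in outputs}  # delivery stage

# Hypothetical sensors returning fixed poses for the example
sensors = {"head_pose": lambda: (0.0, 1.6, 0.0),
           "hand_pose": lambda: (0.2, 1.1, -0.3)}
rendered = run_frame(sensors,
                     process=lambda data: f"scene@{data['head_pose']}",
                     outputs=["display", "audio"])
```

In a real system this loop repeats many times per second, with each iteration feeding fresh sensor data into the software stage.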

What hardware is used?

Immersive technologies rely on a range of hardware products. This can include hardware that is designed for a wide variety of purposes, such as smartphones, as well as hardware that is custom developed to create immersive experiences, like VR headsets.

Augmented reality (AR) systems, for example, frequently draw on hardware such as smartphones or tablets. However, since smartphones and tablets are not attached to the body, there are limitations on what or how much data can be collected when compared to wearable technologies.

Figure 6: Traditional hardware

More immersive experiences like virtual reality (VR) or mixed reality (MR) will often rely on wearable technologies. These allow for more data to be collected from users’ bodies (through tracking head position or eye movements, for example), which is then analysed and used to render new images, audio or spatial mappings of an environment. For example, users can access Meta’s Horizon Worlds through their mobile device, web browser or VR headset.[69] However, only the VR headset can collect data on a user’s head and eye movements, which increases the degree of immersion experienced through its enclosed display.

Figure 7: Head-mounted displays: AR glasses

Head-mounted displays or headsets are the most common example of wearable technologies designed for immersive experiences and to display virtual content. Examples of head-mounted displays include VR headsets, MR headsets and smart glasses. Another common example of wearable technology is the controller, which users operate to interact with immersive experiences.

Figure 8: Immersive technology hardware

All immersive technology hardware contains input and output devices. Input devices collect data for processing through the product’s software, and output devices deliver immersive content, such as video, audio and haptic outputs, to a user. Different configurations and types of input and output devices may be used for different immersive technologies, and some components are more commonly found across some immersive technologies than others.

Cameras, for example, are found across immersive technologies, while some forms of haptic devices (which react to, detect or simulate touch for a user) such as gloves and vests may not be used in hardware that is lower on the virtuality–reality spectrum, such as smartphone-based AR systems.

Variations between components may also be found across the same type of hardware. In addition to having twice as many cameras as the Meta Quest 3, the Apple Vision Pro headset has eye-tracking capabilities, whereas the Meta Quest 3 does not.[70] Equally, some input and output devices may be sold as additional products that a user can add to enhance an immersive experience. For example, while most VR headsets will have motion sensors built into them, a user could also purchase additional devices like sensor tags worn on wrists and limbs, or exoskeletons and treadmills used to control movement and increase the degree of immersion within a virtual environment.

What data is created and collected?

Immersive technologies that use wearable devices significantly increase the amount of data that can be collected from users and their environments. This includes[71] but is not limited to:

  • Physiological data (for example, head movement and heart rate)
  • Environmental data (for example, object recognition and sound tracking)
  • Positional data (for example, a user’s position and movement)
  • User profile data (for example, demographic data and usage data).
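The four categories above might be modelled as a simple record schema. This is a sketch only: the field names are our own illustrative choices, not drawn from any real product's data model.

```python
# Sketch of the categories of data collected by immersive devices.
# Field names are illustrative, not taken from any real product schema.
from dataclasses import dataclass, field

@dataclass
class ImmersiveDataRecord:
    # Physiological data (e.g. head movement, heart rate)
    heart_rate_bpm: float = 0.0
    head_rotation: tuple = (0.0, 0.0, 0.0)
    # Environmental data (e.g. recognised objects, tracked sounds)
    recognised_objects: list = field(default_factory=list)
    # Positional data (a user's position and movement)
    position: tuple = (0.0, 0.0, 0.0)
    # User profile data (e.g. demographic and usage data)
    session_minutes: int = 0

record = ImmersiveDataRecord(heart_rate_bpm=72.0,
                             recognised_objects=["desk", "window"])
```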

Figure 9: Types of data collected by immersive technologies 

As with input and output devices, the specific forms of data collected by immersive technology products will depend on the product and use case. For example, Pokémon Go, an AR game, collects personal information about a user’s digital footprint, such as their email address, IP address and location,[72] but does not have the capacity to collect personal biometric data, such as hand-tracking data, which the Meta Quest 3, a VR headset, does. On the other hand, while both MR and VR headsets collect environmental data, MR devices like the Apple Vision Pro gather higher-fidelity spatial information about a user’s environment through LiDAR technology, enabling more precise environmental mapping than the time-of-flight sensors typically used in VR headsets such as the Meta Quest 3.[73]

What software is used?

Figure 10: Immersive technology software

Immersive technologies use software programs to process data collected by input devices, and to deliver immersive content to users through output devices. Some examples of the different kinds of software that immersive technologies can use include:

  • Operating systems
    • At a foundational level, immersive technologies rely on operating systems that power hardware devices like smartphones, tablets or computers. These include Microsoft Windows and Apple’s visionOS, which support all other software programs, including immersive technology platforms.
  • Platforms
    • Immersive technology platforms provide environments and tools for developers to develop and manage applications, and for users to access applications.
  • Applications
    • Applications are services offered within platforms. Examples include games, content creation tools and social media. At times, platforms can also serve as applications through which users can interact with other avatars and access different games and functionalities, such as Meta’s Horizon Worlds.[74]
  • Rendering engines
    • Immersive technologies rely on rendering systems to deliver content, which may take the form of a virtual overlay in AR systems, or mixed or virtual environments in MR or VR systems.[75] These systems use real-time rendering engines as key software, enabling input data to be processed and outputs to be generated as users interact with the technology.[76] Real-time rendering software allows immersive content to be updated at a speed high enough to be perceived by users as a continual stream of content responding to their actions. As a user’s experience is rendered in real time, it closely replicates physical experiences and helps foster a ‘flow state’ for users that increases the degree of immersion of the experience.
  • Algorithms
    • Immersive technology software also uses algorithms for processing data and rendering immersive content. Algorithms act as instructions for how data should be used by immersive technology software. Often, the algorithms used by immersive technology software are machine-learning algorithms – it is here that immersive technologies converge with AI.
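The real-time constraint on rendering engines described above can be illustrated with a frame-budget sketch. The 90 Hz refresh rate and the per-stage timings are illustrative assumptions; actual targets vary by device.

```python
# Sketch of the real-time rendering constraint: for content to be
# perceived as a continual stream, each frame must be produced within
# a fixed time budget. 90 Hz is used here purely for illustration.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000 / REFRESH_HZ   # ~11.1 ms available per frame

def within_budget(stage_times_ms):
    """Check whether the per-frame pipeline stages fit in the budget."""
    return sum(stage_times_ms.values()) <= FRAME_BUDGET_MS

# Hypothetical timings for one frame of the pipeline
frame = {"sensor_read": 1.0, "simulate": 3.0, "render": 6.0}
ok = within_budget(frame)   # 10.0 ms fits within the ~11.1 ms budget
```

When a frame misses its budget, the stream of content stutters and the sense of immersion (and the user's 'flow state') breaks down, which is why real-time engines are central to these systems.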

For example, the Meta Quest 3 headset runs on the operating system Meta Horizon OS,[77] a MR operating system built on Google’s open-source Android OS.[78] In turn, players can interact with each other through the Horizon Worlds platform, which is both an application and a platform through which users can access other applications, such as Gizmos.[79] Horizon Worlds and other platforms for the Meta Quest 3 can be developed using the Unity and Unreal Engine rendering engines.[80] Among the many algorithms used by the Meta Quest 3 are computer vision algorithms for its Inside-Out Body Tracking (IOBT) system, which allows the system to map out upper-body movements, heightening the immersion felt by the user.[81]

What types of AI algorithms are used?

The convergence between immersive technologies and AI was seen as particularly important in our interviews, with over half of the interviewees discussing it.[82] While AI systems themselves are not immersive technologies, they can often play a role in supporting the development and use of immersive technologies. For example, interviewees discussed how AI can be used to support immersive technology through content generation,[83] for object recognition,[84] to personalise immersive technology applications[85] and for speech-to-text translation into different languages to make immersive experiences more accessible.[86]

Many forms of convergence discussed by interviewees preceded the current surge of generative AI and large language models (LLMs). Some examples of these machine-learning algorithms include ones that aim to recognise information from physical environments:

  • Scene and object recognition algorithms are key to AR and MR devices which use data from the physical world to recognise a user’s environment and identify objects within it.
  • Sound and speech recognition algorithms analyse users’ voices and other audio to recognise speech and environmental sounds. This can be important for voice commands or for communication with other users.
  • Facial recognition algorithms detect faces in an environment and identify them based on an existing database. They have not been incorporated into mainstream commercial products due to privacy concerns, but they have been incorporated into immersive technology products used by police forces around the world.[87]
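The recognition step common to these algorithms can be illustrated with a toy sketch: features observed in the physical environment are matched against a catalogue of known objects. The catalogue and feature sets below are invented for the example; real systems use trained machine-learning models rather than fixed feature lists.

```python
# Toy sketch of object recognition: match features observed in the
# physical environment against a catalogue of known objects.
# Catalogue contents and feature sets are invented for illustration.

CATALOGUE = {
    "chair": {"legs", "seat", "backrest"},
    "table": {"legs", "flat_top"},
}

def recognise(observed_features):
    """Return catalogue objects whose features are all observed."""
    return [name for name, feats in CATALOGUE.items()
            if feats <= observed_features]   # subset test

found = recognise({"legs", "seat", "backrest", "flat_top"})
```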

Other examples include profiling algorithms, which create data about users by categorising them into psychological, behavioural and demographic groups, also referred to as psychographic characteristics.[88] Data created by profiling algorithms can be added to a user’s profile data and can be used as an input for other algorithms, such as content generation algorithms. Profiling algorithms include:[89]

  • Emotion recognition algorithms which make inferences about a user’s supposed emotional states and emotive responses to content. This is based on data including physiological data such as facial expressions, heart rate and pupil dilation.
  • User preference recognition algorithms which make inferences about a user’s supposed preferences. This is based on data including physiological data and usage data, such as time spent engaging with specific content.
  • Gender recognition algorithms which make inferences about a user’s supposed gender. This is based on data including behavioural data.
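As a purely hypothetical illustration of how a profiling algorithm turns physiological signals into an inferred category, the sketch below uses invented thresholds and labels; real emotion recognition systems use trained machine-learning models, not fixed rules.

```python
# Hypothetical sketch of an emotion-inference profiling rule.
# Thresholds and labels are invented for illustration; real systems
# rely on trained machine-learning models rather than fixed rules.

def infer_arousal(heart_rate_bpm, pupil_dilation_mm):
    """Classify a user's supposed arousal level from two signals."""
    score = 0
    if heart_rate_bpm > 100:      # elevated heart rate
        score += 1
    if pupil_dilation_mm > 5.0:   # dilated pupils
        score += 1
    return ["calm", "engaged", "highly_aroused"][score]

label = infer_arousal(heart_rate_bpm=110, pupil_dilation_mm=4.0)
```

Inferences like this can then be written back into a user's profile data and fed into other algorithms, such as content generation, which is what makes profiling a privacy concern.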

While some forms of AI used within immersive technologies precede recent developments in generative AI, the uptake of generative AI systems was discussed as particularly useful for rendering graphic and audio content for immersive experiences. Developments in this area could streamline the content creation process for immersive experiences.[90] Algorithms that aim to render immersive content include:

  • Scene generation, object generation and spatial mapping algorithms create virtual representations of environments and objects for users to interact with when using immersive technologies. Interviewees discussed the use of these algorithms for real-time rendering as an area with significant potential.[91] One interviewee, for example, emphasised how the ‘convergence between real-time graphics computing and AI’ holds potential for people to generate immersive content more quickly.[92]
  • Sound and speech synthesis algorithms create the auditory aspects of a virtual environment, making it more immersive. Some interviewees highlighted the use of generative AI for delivering content in the form of avatars[93] or through personal assistants integrated within smart glasses,[94] which may provide a ‘different way [of] thinking about human computer interactions’,[95] where users ‘talk to things and they do it for you’.[96]

How is data managed?

How immersive technology products handle data may vary. Here, we present a model illustrating how data is managed to operationalise immersive technologies. This ‘data lifecycle’ model, presented in Figure 11, provides an overview of the stages of how data is managed, from its point of collection to its point of decommission. This consists of:

  1. data creation and collection
  2. data use
  3. data retention.
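The three stages above can be sketched as a simple state progression from a data item's point of collection to its point of decommission. This is a schematic only, not any product's implementation.

```python
# Schematic sketch of the immersive-technology data lifecycle:
# creation/collection -> use -> retention -> decommission.
# Stage names follow the model in the text; the class is illustrative.

STAGES = ["created", "in_use", "retained", "decommissioned"]

class DataItem:
    def __init__(self, name):
        self.name = name
        self.stage = "created"            # 1. data creation and collection

    def advance(self):
        """Move the item to the next lifecycle stage."""
        i = STAGES.index(self.stage)
        if i < len(STAGES) - 1:
            self.stage = STAGES[i + 1]
        return self.stage

item = DataItem("head_pose_log")
item.advance()                             # 2. data use
final = item.advance()                     # 3. data retention
```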

Figure 11: The data lifecycle of immersive technologies

Conclusion

Defining immersive technologies is complicated by the lack of consistent vocabulary in the field and the lack of a clear definition of what makes technology ‘immersive’. Despite this, we identify four types of immersive technologies: augmented reality (AR), mixed reality (MR), virtual reality (VR) and immersive virtual worlds (IVWs).

The differences between these technologies can be gauged through their position on the virtuality–reality spectrum, which places them according to how much the user loses awareness of the physical world. These technologies often share many hardware and software elements, such as cameras, motion sensors, visual display devices and rendering engines. Depending on their sophistication, these devices can collect significantly more data than devices such as laptops or smartphones. The convergence between immersive technologies and AI stands out as a particularly important element, with several interviewees noting that AI algorithms, such as profiling algorithms and scene and object recognition algorithms, play a crucial role in the development of immersive technologies.

This explainer aims to provide the public, alongside policymakers and regulators, with a comprehensive understanding of immersive technologies by examining their contested terminology, functionalities and technological architecture. Through this, we strive to facilitate discussions in policy and the development of nuanced approaches to their governance and regulation that address both the opportunities and challenges they pose.

Methodology

This explainer is the first of three outputs based on research conducted for the Ada Lovelace Institute project: ‘Return to reality: An exploration of immersive technologies’. The explainer answers the following research question:

  • What are the essential terms and concepts that policymakers and regulators need to understand to address the impacts of immersive technologies?

To explore this question, we used two research methods:

  • Desk research and a literature review
  • Expert interviews

Our literature review was conducted between November 2023 and May 2024. An initial review of annotated articles resulted in a set of cluster themes. We then identified priority areas and gaps from our initial review, which we supplemented by searching for additional relevant papers and articles.

The project team conducted 26 interviews with experts working with immersive technologies, including developers, investors, academics and practitioners. Interview questions covered the following categories relevant to our project’s research questions: the timeline of immersive technologies, the product landscape, technical components, and the impacts of immersive technologies such as risks and benefits.

Acknowledgements

This report was authored by Cami Rincon, Mahi Hardalupas, Hannah Claus and Jorge Perez. The authors would like to give a special thanks to Emmie Hine for their comments and substantive contributions.

This work was made possible by a grant from the Minderoo Foundation as part of their XR30 Fund programme.

Appendix

What other technologies are used?

Many interviewees referred to technologies that immersive technologies converge with, use or rely on, even if they would not count them as immersive technologies themselves.

  • Blockchain:[97] a type of database or digital ‘ledger’ that stores data in a distributed way. Given the decentralised nature of this technology, several immersive virtual worlds have implemented blockchain to support content protection, transaction security and content hosting.[98]
  • Cloud computing:[99] a method of using computer resources (such as data processing or storage) hosted remotely through the internet, such as Amazon Web Services.[100] Cloud computing is increasingly being implemented in immersive technologies to allow users to stream otherwise graphically demanding XR experiences from less computationally capable devices.[101]
  • Cryptocurrency:[102] a virtual or digital currency typically run through a decentralised cryptographic system. In some virtual worlds, users may use cryptocurrencies to trade and sell digital assets.[103]
  • Digital twins:[104] a digital model that simulates a physical object or system. Digital twins are often used in the manufacturing sector, where, alongside immersive technologies, they can be used to virtually test changes to manufacturing parts. For example, BMW has utilised digital twins of car parts to virtually test changes prior to their physical implementation.[105]
  • Holography:[106] a technique that uses light to produce a three-dimensional image. Holograms are often used for AR experiences such as Microsoft’s HoloLens[107] and, more recently, Meta’s Orion glasses.[108]
  • Neurotechnologies:[109] devices which interact with a person’s nervous system, such as neural implants and electromyography (EMG) headsets. These devices can be used to enhance immersive experiences: Meta’s EMG wristband, for example, reads muscle signals to enable virtual control.[110]
  • NFTs:[111] a unique digital identifier recorded on a blockchain to certify ownership of a digital asset, which can be traded. Within immersive technology platforms, these may be used to provide and secure ownership of digital assets, such as virtual real estate.[112]

Footnotes

[1] Anthony Marshall and Christian Bieck, ‘Metaverse: The Post–Hype Future’ (2024) 52 Strategy and Leadership 17, 21.

[2] ‘Introducing Apple Vision Pro: Apple’s First Spatial Computer’ (Apple Newsroom) <https://www.apple.com/uk/newsroom/2023/06/introducing-apple-vision-pro/> accessed 16 December 2024.

[3] McKinsey & Company, ‘Value Creation in the Metaverse’ (2022) <https://www.mckinsey.com/capabilities/growth-marketing-and-sales/our-insights/value-creation-in-the-metaverse> accessed 20 February 2024.

[4] Ed Zitron, ‘RIP Metaverse, We Hardly Knew Ye’ (Business Insider) <https://www.businessinsider.com/metaverse-dead-obituary-facebook-mark-zuckerberg-tech-fad-ai-chatgpt-2023-5> accessed 2 August 2024.

[5] ‘Venture Capital Funding for Metaverse Dries Up’ (S&P Global, 27 February 2024) <https://www.spglobal.com/marketintelligence/en/news-insights/latest-news-headlines/venture-capital-funding-for-metaverse-dries-up-80408207> accessed 4 September 2024.

[6] ‘Introducing Orion, Our First True Augmented Reality Glasses’ (Meta, 25 September 2024) <https://about.fb.com/news/2024/09/introducing-orion-our-first-true-augmented-reality-glasses> accessed 3 October 2024.

[7] ‘Human Rights, Democracy, and the Rule of Law Impact Assessment for AI Systems (HUDERIA)’ (The Alan Turing Institute) <https://www.turing.ac.uk/research/research-projects/human-rights-democracy-and-rule-law-impact-assessment-ai-systems-huderia> accessed 13 December 2024.

[8] P15

[9] ‘UNICRI: United Nations Interregional Crime and Justice Research Institute’ <https://unicri.it/Publication/Gaming-and-the%20Metaverse> accessed 13 December 2024.

[10] P2, P9, P12, P15, P17, P22

[11] Marcus Carter and Ben Egliston, ‘What Are the Risks of Virtual Reality Data? Learning Analytics, Algorithmic Bias and a Fantasy of Perfect Data’ (2021) New Media & Society 25 <https://journals.sagepub.com/doi/full/10.1177/14614448211012794> accessed 10 May 2021.

[12] P6, P7, P11, P15, P19, P23, P24

[13] ‘Virtual Reality Technology to Treat Agoraphobia Approved for Use in the NHS’ (Oxford Health NHS Foundation Trust, 15 November 2023) <https://www.oxfordhealth.nhs.uk/news/virtual-reality-technology-to-treat-agoraphobia-approved-for-use-in-the-nhs/> accessed 5 September 2024.

[14] ‘VR for Education – The Future of Education’ (Immersion VR) <https://immersionvr.co.uk/about-360vr/vr-for-education/> accessed 13 December 2024.

[15] ‘How the Metaverse Can Transform Education’ (Meta, 12 April 2023) <https://about.fb.com/news/2023/04/how-the-metaverse-can-transform-education/> accessed 16 October 2024.

[16] Carlos Bermejo Fernandez and Pan Hui, ‘Life, the Metaverse and Everything: An Overview of Privacy, Ethics, and Governance in Metaverse’ (arXiv, 24 March 2022) <http://arxiv.org/abs/2204.01480> accessed 25 October 2023.

[17] ‘UNICRI: United Nations Interregional Crime and Justice Research Institute’ <https://unicri.it/Publication/Gaming-and-the%20Metaverse> accessed 13 December 2024.

[18] Siddhesh Manjrekar and others, ‘CAVE: An Emerging Immersive Technology – A Review’ (UKSim-AMSS 16th International Conference on Computer Modelling and Simulation, Cambridge UK, 2014) <http://ieeexplore.ieee.org/document/7046051/> accessed 6 September 2024.

[19] ‘Inside Facebook Reality Labs: Wrist-Based Interaction for the next Computing Platform’ (Tech at Meta, 18 March 2021) <https://tech.facebook.com/reality-labs/2021/3/inside-facebook-reality-labs-wrist-based-interaction-for-the-next-computing-platform/> accessed 16 February 2024.

[20] ‘Countermeasures: The Need for New Legislation to Govern Biometric Technologies in the UK’ (Ada Lovelace Institute, 29 June 2022) <https://www.adalovelaceinstitute.org/report/countermeasures-biometric-technologies/> accessed 21 March 2023.

[21] Elliot Jones, ‘What Is a Foundation Model?’ (Ada Lovelace Institute, 17 July 2023) <https://www.adalovelaceinstitute.org/resource/foundation-models-explainer/> accessed 1 August 2023.

[22] Carina Girvan, ‘What Is a Virtual World? Definition and Classification’ (2018) 66 Educational Technology Research and Development.

[23] Simon Greenwold, ‘Spatial Computing’ (2003) <https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=085e3525e97e3aa9955ee7e864f953ed344b2230> accessed 30 January 2024.

[24] ‘Extended Reality XR’ <https://www.qualcomm.com/research/extended-reality> accessed 11 December 2024.

[25] ‘Defining the Metaverse: A Systematic Literature Review’ (February 2023) <https://ieeexplore.ieee.org/document/10035386> accessed 11 December 2024.

[26] Carina Girvan, ‘What Is a Virtual World? Definition and Classification’ (2018) 66 Educational Technology Research and Development.

[27] P8, P9, P12, P13, P19, P21, P23, P26

[28] P23

[29] Sarvesh Agrawal, ‘Defining Immersion: Literature Review and Implications for Research on Audiovisual Experiences’ (2020) 68 J. Audio Eng. Soc. Other examples in Ayoung Suh and Jane Prophet, ‘The State of Immersive Technology Research: A Literature Analysis’ (2018) 86 Computers in Human Behavior 77.

[30] Siddhesh Manjrekar and others, ‘CAVE: An Emerging Immersive Technology – A Review’ (UKSim-AMSS 16th International Conference on Computer Modelling and Simulation, Cambridge UK, 2014) <http://ieeexplore.ieee.org/document/7046051/> accessed 6 September 2024.

[31] P4, P7, P10, P18, P23

[32] P23

[33] P4, P7, P23

[34] P7, P24

[35] Magdalena Balcerak Jackson and Brendan Balcerak Jackson, ‘Immersive Experience and Virtual Reality’ (2024) 37 Philosophy & Technology 19.

[36] P13, P23

[37] P4, P24

[38] Mel Slater, ‘Immersion and the Illusion of Presence in Virtual Reality’ (2018) 109 British Journal of Psychology 431.

[39] P4

[40] P4, P12, P14, P19, P24, P26

[41] Paul Milgram and others, ‘Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum’ (1994) 2351 Telemanipulator and Telepresence Technologies.

[42] P19

[43] P9, P13, P14, P24

[44] P5, P20, P21

[45] P10, P14, P21

[46] P5, P7, P13, P19, P20, P24

[47] P5, P14

[48] ‘VITURE Pro XR: Experience Next-Gen Clarity’ <https://pro.viture.com/> accessed 11 December 2024.

[49] ‘Ask Meta AI about What You See on Ray-Ban Meta Smart Glasses’ <https://www.meta.com/en-gb/help/smart-glasses/articles/voice-controls/ask-meta-ai-about-what-you-see-ray-ban-meta-smart-glasses/> accessed 11 December 2024.

[50] P14, P20, P26

[51] Laura Mullan, ‘Nike Unveils New Augmented Reality Technology to Improve Shoe Sizing’ (Technology Magazine, 17 May 2020) <https://technologymagazine.com/data-and-data-analytics/nike-unveils-new-augmented-reality-technology-improve-shoe-sizing> accessed 6 January 2025.

[52] P20

[53] This corresponds with how some interviewees described MR as mixing elements of the virtual and physical world (P5, P24, P26).

[54] See glossary definition of virtual reality (VR).

[55] Patrick Grady, ‘There’s No Such Thing as “Mixed Reality”’ (Metaverse EU, 9 July 2024) <https://metaverseeu.substack.com/p/theres-no-such-thing-as-mixed-reality> accessed 9 July 2024.

[56] Maximilian Speicher, Brian D Hall and Michael Nebeling, ‘What Is Mixed Reality?’ (Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow UK, 2019) <https://dl.acm.org/doi/10.1145/3290605.3300767> accessed 18 January 2024.

[57] P4

[58] P12

[59] P7, P13, P20, P26

[60] Rachel Freire, ‘ISNESS’ <http://www.rachelfreire.com/isness> accessed 12 December 2024.

[61] P25

[62] P12, P19

[63] P7, P8, P10, P19, P20, P25

[64] P17, P19

[65] P10, P19, P25

[66] P3, P10, P14, P21, P23

[67] P10

[68] P19

[69] ‘Meta Horizon Worlds’ <https://horizon.meta.com/> accessed 25 November 2024.

[70] ‘Apple Vision Pro’ <https://www.apple.com/uk/apple-vision-pro/> accessed 21 November 2024; ‘Meta Quest 3: New Mixed Reality VR Headset’ <https://www.meta.com/gb/quest/quest-3/> accessed 21 November 2024.

[71] Mostly drawn from ‘The IEEE Global Initiative on Ethics of Extended Reality (XR) Report – Extended Reality (XR) and the Erosion of Anonymity and Privacy’.

[72] ‘The Council Spotlight: All the Data Pokémon Go Is Collecting from Your Phone’ (The Council of Insurance Agents & Brokers) <https://www.ciab.com/resources/spotlight-data-pokemon-go-collecting-phone/> accessed 21 November 2024.

[73] ‘Apple Vision Pro and Meta Quest Non-Destructive Teardown’ <https://www.lumafield.com/article/apple-vision-pro-meta-quest-pro-3-non-destructive-teardown> accessed 22 November 2024.

[74] ‘Meta Horizon Worlds’ <https://horizon.meta.com/> accessed 25 November 2024.

[75] Mandeep Handa, Gagandeep Aul and Shelja Bajaj, ‘Immersive Technology – Uses, Challenges and Opportunities’ (2012) International Journal of Computing & Business Research <http://researchmanuscripts.com/isociety2012/12.pdf>.

[76] P13, P23

[77] ‘Introducing Our Open Mixed Reality Ecosystem’ (Meta, 22 April 2024) <https://about.fb.com/news/2024/04/introducing-our-open-mixed-reality-ecosystem/> accessed 25 November 2024.

[78] ‘Android Open Source Project’ <https://source.android.com/> accessed 25 November 2024.

[79] ‘Oculus’ <https://www.oculus.com/horizon-worlds/learn/intro-to-gizmos/?locale=en_GB> accessed 11 December 2024.

[80] ‘Before You Begin | Meta Horizon OS Developers’ <https://developers.meta.com/horizon/documentation/unity/unity-before-you-begin/> accessed 25 November 2024; ‘Unreal Engine | Meta Horizon OS Developers’ <https://developers.meta.com/horizon/documentation/unreal/unreal-engine/> accessed 25 November 2024.

[81] ‘How AI Is Powering Meta’s Technologies Today + in the Future’ <https://www.meta.com/blog/quest/ai-powered-technologies-quest-3-pro-ray-ban-meta-smart-glasses> accessed 25 November 2024.

[82] P1, P2, P5, P7, P8, P9, P12, P13, P14, P15, P20, P22, P23, P24, P25, P26

[83] P1, P7, P22, P23, P25, P26

[84] P5, P9, P15

[85] P20, P24

[86] P8

[87] James Vincent, ‘Facial Recognition Smart Glasses Could Make Public Surveillance Discreet and Ubiquitous’ (The Verge, 10 June 2019) <https://www.theverge.com/2019/6/10/18659660/facial-recognition-smart-glasses-sunglasses-surveillance-vuzix-nntc-uae> accessed 16 February 2024.

[88] P12

[89] Pier Paolo Tricomi and others, ‘You Can’t Hide Behind Your Headset: User Profiling in Augmented and Virtual Reality’ (2023) 11 IEEE Access 9859.

[90] Redefine Marketing Group, ‘Generative AI in Immersive Learning: Enhancing VR’ (Strivr, 14 June 2023) <https://www.strivr.com/blog/role-genai-immersive-learning-vr-training-experiences/> accessed 16 February 2024.

[91] P1, P5, P7, P8, P9, P11, P13, P15, P20, P22, P23, P25, P26

[92] P13

[93] P11, P13, P20, P23, P24

[94] P9, P12, P23, P24

[95] P23

[96] P13

[97] P1, P4, P8, P19, P25, P26

[98] LCX Team, ‘Role of Blockchain Technology in Extended Reality (XR)’ (LCX, 29 March 2024) <https://www.lcx.com/role-of-blockchain-technology-in-extended-realityxr/> accessed 26 November 2024.

[99] P15

[100] ‘Cloud Computing Services – Amazon Web Services’ <https://aws.amazon.com/> accessed 25 November 2024.

[101] ‘Immersive Stream for XR Overview’ <https://cloud.google.com/immersive-stream/xr/docs/concept> accessed 26 November 2024.

[102] P8, P19, P25, P26

[103] ‘Welcome to Decentraland’ <https://decentraland.org/> accessed 25 November 2024.

[104] P1, P5, P7, P8, P9, P20, P23, P24

[105] ‘This Is How DIGITAL the BMW iFACTORY Is’ (BMW, 2022) <https://www.bmwgroup.com/en/news/general/2022/bmw-ifactory-digital.html> accessed 25 November 2024.

[106] P7

[107] ‘HoloLens 2 – Overview, Features, and Specs’ <https://www.microsoft.com/en-gb/hololens/hardware> accessed 25 November 2024.

[108] ‘Introducing Orion, Our First True Augmented Reality Glasses’ (Meta, 25 September 2024) <https://about.fb.com/news/2024/09/introducing-orion-our-first-true-augmented-reality-glasses/> accessed 26 November 2024.

[109] P9, P12, P21, P25

[110] ‘Surface EMG: An Exciting New Form of HCI That Considers Everyone’ (Meta, 25 September 2024) <https://www.meta.com/blog/quest/surface-emg-wristband-electromyography-human-computer-interaction-hci> accessed 25 November 2024.

[111] P14, P19, P25, P26

[112] ‘Virtual Land NFTs: The Complete Guide’ (OpenSea) <https://opensea.io/learn/nft/what-are-virtual-land-nfts> accessed 25 November 2024.


Image credit: alvarez