The Differences Between Augmented Reality & Virtual Reality
From the humble telescope to advanced night vision optics, enhancing vision through technology has long proved a motivator for ingenuity. Thanks to the rise and subsequent miniaturization of computers, this effect is now primarily accomplished digitally.
Through the use of headsets, smartphone screens, and other viewing platforms, technology that alters one’s perception of the world is rapidly becoming mainstream. While there are myriad terms and divisions among this tech, two primary types stand out: augmented reality and virtual reality.
At its simplest, augmented reality (AR) enhances or alters one’s view of the actual world by overlaying computer-generated sensory data onto the actual environment. Because AR blends the real world with simulated elements, it is sometimes referred to as “mixed reality.” AR systems utilize sound, visuals, and even GPS elements to enhance the user’s relationship with their surroundings. Photo apps that overlay digital elements onto user images, such as Instagram and Snapchat, are a widespread, albeit simplistic, form of AR.
AR has several commercial applications. For example, some automobile manufacturing plants have issued AR eye-wear to workers so that they can keep tabs on other points of the assembly line without leaving their station. AR is also favored by industries that hinge on customer relations. Customer service associates in the healthcare industry can access data regarding a patient’s past visits mid-conversation without having to look away to their desktop.
AR isn’t limited to headsets. Several car models feature windshields onto which data like GPS directions, upcoming detours, and other useful messages can be projected without obstructing the driver’s view of the road.
AR and Gaming
While AR has a number of commercial and industrial applications, AR games are perhaps the most widespread form of the technology. 2016’s Pokémon Go used the player’s smartphone camera and screen to overlay images of interactive monsters onto the actual environment. It also incorporated the phone’s GPS to tie virtual events and locations to real-world places.
The more ambitious LyteShot uses both the player’s phone and AR eyewear to replicate first-person-shooter-style gameplay, overlaying player health, ammunition, and other game features onto the real world. Because the player takes direct action instead of controlling an avatar, the result is a hybrid somewhere between a console or PC videogame and laser tag.
Unlike augmented reality, which uses the user’s view of the real world as a base, virtual reality (VR) creates an entirely simulated environment. Because of this, the most immersive and functional virtual reality systems are primarily limited to headsets that encompass the user’s entire field of view, making VR somewhat more limited in terms of device application than augmented reality. That said, smartphones are also capable of creating simulations that fall under the VR umbrella.
While VR has typically been seen as the end-goal of gaming and leisure, its applications aren’t limited to entertainment. Real estate agents have implemented virtual tours of properties via VR headsets, while the U.S. military uses the tech to provide an immersive, realistic, and safe way for specialists to practice I.E.D. disarmament. Likewise, virtual environments are used as a risk-free space for surgeons to practice techniques on simulated patients.
The 6 Degrees of Freedom
Within the VR category there are two primary subdivisions: three DOF and six DOF, with DOF referring to “degrees of freedom.” Essentially, this refers to the number of independent ways the user can move within the virtual environment. More complex and realistic simulations incorporate all six degrees. The first three are upward/downward, left/right, and forward/backward. These are commonly referred to as “translational movements” and require the VR device to track the user’s position in physical space, typically via an external camera or sensors.
The other three degrees of freedom encompass the “rotational movements”: roll, pitch, and yaw. While the terms are vague at first glance, they merely refer to the user’s field of view based on the position of their head as they tilt and turn their necks to view the virtual environment. Because this type of simulation is simpler, it can be replicated on more limited hardware, such as a smartphone. Products like Google Daydream enhance the experience by enabling the user to slide their smartphone into a headset viewer, making the experience hands-free.
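The split between the two subdivisions can be made concrete in a few lines of code. This is a minimal sketch, not part of any real VR SDK: the `Pose6DOF` class and `is_three_dof` function are illustrative names, showing how the three translational and three rotational degrees of freedom together describe a headset’s pose, and how a 3-DOF device differs simply by never reporting translational movement.

```python
from dataclasses import dataclass
import math

@dataclass
class Pose6DOF:
    """A head pose expressed as all six degrees of freedom."""
    # Translational degrees of freedom (require positional tracking)
    x: float = 0.0  # left/right
    y: float = 0.0  # upward/downward
    z: float = 0.0  # forward/backward
    # Rotational degrees of freedom (trackable by orientation sensors alone)
    roll: float = 0.0   # tilting the head side to side
    pitch: float = 0.0  # nodding up and down
    yaw: float = 0.0    # turning left and right

def is_three_dof(pose_delta: Pose6DOF) -> bool:
    """A 3-DOF device (e.g. a phone in a headset viewer) reports only
    rotational changes; its translational components stay at zero."""
    return pose_delta.x == pose_delta.y == pose_delta.z == 0.0

# Turning the head 30 degrees with no positional movement: within 3-DOF capability
print(is_three_dof(Pose6DOF(yaw=math.radians(30))))  # True
# Stepping half a meter forward: requires full 6-DOF positional tracking
print(is_three_dof(Pose6DOF(z=0.5, pitch=0.1)))      # False
```

The design point the sketch illustrates is that a 6-DOF system is a strict superset of a 3-DOF one: the rotational fields are shared, and the extra hardware exists solely to fill in the translational fields.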
Examples of VR platforms that incorporate all six DOF include the HTC Vive and Oculus Rift, which use fully encompassing goggle headsets and external cameras to create three-dimensional environments ripe for exploration and interaction.