Gamifying reality: How AR and VR will combine to transform experience
It's the dawn of a new age. AI, VR, and robotics are creating the future that science-fiction writers have dreamed about.
New and exciting realities are now just a few screens away. The wildest dreams of fiction writers are slowly seeping into our current day and age. Many people are familiar with virtual reality: you put on some kind of headset and you're whisked into an all-encompassing world of sound and sight. VR's closest cousin, augmented reality, comes in a few different forms, from overlaid blocks of text and information to cartoonish images and games that let you interact with the world around you. Both of these types of tech have earned their names, but what about when you combine the two of them?
The border between these digital worlds is already beginning to break down. Mixed reality is the intersection of AR and VR. Right now the biggest player in the mixed reality space is Microsoft, which is leading the way with its HoloLens headset. In order to learn more about MR, we need to look a little deeper at both augmented and virtual realities.
A quick primer on different digital realities
So much is happening in the world of digital realities that it can become puzzling to try and draw a distinction between VR, AR, and MR. But each one of these realities can be quickly explained:
Virtual reality (VR) immerses a user in a digital environment like a video game.
Augmented reality (AR) places digital objects over a real-world view.
Mixed reality (MR) overlays and anchors virtual things in a real-world environment.
For VR, a computer generates the virtual environment that users then explore and interact with. Special hand controllers help to enhance and integrate the body into the entire virtual experience. An ideal virtual world completely blocks out the outside visual view and, with noise-canceling headphones, outside sound as well.
In augmented reality, users interact with the real world while virtual content is added to the screen. Think of Pokemon Go, the video game that quickly went viral, or Snapchat features that add digital avatars to the world around you. Most current AR is experienced through smartphones. There has been a mixed reaction to AR glasses, and no clear leader in that space yet – especially after Google's failed Google Glass experiment.
You can also access virtual worlds through 360-degree video, which is considered another form of VR. If, for example, you wear a Google Cardboard, you'll be able to view any 360-degree video through the headset.
You must wear a specialized VR headset to experience any kind of virtual reality. Most headsets are connected to a computer or gaming console. The Oculus Rift, HTC Vive, and PlayStation VR are some of the more advanced and popular devices in the space. More affordable options like the Google Cardboard are standalone viewers that work in tandem with a smartphone.
Mixed reality on the scene
The most recent development in reality technologies has given us a few forms of mixed reality. One type of MR is the ability to not only overlay objects on the real world, but interact with them as well. This is a kind of advanced type of AR. Another interesting form of MR takes its cue from a completely immersed virtual environment where the real world is blocked out. At first, it sounds like just plain virtual reality. But in this instance, the virtual environment that you see is tethered to and overlaps the real world environment. Here’s an example of how this works.
Mixed reality fuses objects layered onto the real world with an immersive digital world, allowing you to do things not possible in a strictly AR or VR environment. The cutting-edge paradigm shift into MR has been made possible by the Microsoft HoloLens, a headset that, as the name suggests, allows its users to overlay holograms from virtual worlds on top of regular old reality (the headsets look like space-age Oakley sunglasses). Essentially, it creates the feeling of being present within a virtual environment.
This type of intersection between the real and virtual gives us an entirely new space that we can interact and innovate inside of. We’ll be unearthing a whole new expanse of possibilities as the technology grows.
New mediums of experience
If we're to take a page from Marshall McLuhan, the mid-20th-century media theorist, our new mediums of technology will begin to radically alter our perceptions of ourselves and reality regardless of the content. A famous McLuhan quote puts it simply:
“We become what we behold. We shape our tools and then our tools shape us.”
Virtual and mixed realities will be no different and will completely change our way of doing things and viewing our world. Look no further than actually trying to explain and differentiate between these realities. It will become more difficult over the years as these once-novel technologies become completely integrated into our lives. No one thinks much about having a supercomputer in their pocket anymore. It's become a normal mode of existence. AR, VR, and the junction point of mixed reality are the next logical step.
Reality is almost becoming gamified. One day, surgeons should be able to overlay x-ray or ultrasound images on a patient while they operate. Designers and artists will be able to collaborate with one another from miles away and project an imagined idea into a real-life space. Drones traversing the sky will instantly relay quantifiable information about the world while they fly. There's no end in sight to what's possible.
Seeing the world from another person's point of view will become a visual activity you can seamlessly participate in. There is no limit to the medium.
A future of possibility
Inventors and artists are the ones who tend to lead the way when it comes to future technology. Our ability to transform the world and our lives is limited only by our imagination. With mixed reality, we're given a blank canvas over the rich and vast natural environment. It's almost as if the internet has found a new conduit, or rather a physical manifestation of itself, and divorced itself from the computer screen. This very well could be the beginning of a seismic shift in our shared technological realities.
A Mercury-bound spacecraft's noisy flyby of our home planet.
- There is no sound in space, but if there were, this is what it might sound like passing by Earth.
- A spacecraft bound for Mercury recorded data while swinging around our planet, and that data was converted into sound.
- Yes, in space no one can hear you scream, but this is still some chill stuff.
First off, let's be clear what we mean by "hear" here. (Hear, hear!)
Sound, as we know it, requires air. What our ears capture is actually oscillating waves of fluctuating air pressure. Cilia, tiny hair-like fibers in our inner ears, respond to these fluctuations by firing off corresponding neural signals, at different pitches, to our brains. This is what we perceive as sound.
All of which is to say, sound requires air, and space is notoriously devoid of it. So, in terms of human-perceivable sound, it's silent out there. Nonetheless, there can be cyclical events in space — such as oscillating values in streams of captured data — that can be mapped to pitches, and thus made audible.
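To make the idea of sonification concrete, here is a minimal sketch of how a stream of sensor readings can be mapped to pitches and rendered as audio. This is a hypothetical illustration using only Python's standard library, not the actual pipeline INAF used; the frequency range and note duration are arbitrary choices.

```python
import math
import struct
import wave

def sonify(samples, out_path="sonified.wav", rate=44100, note_dur=0.05,
           low_hz=220.0, high_hz=880.0):
    """Map each data sample to a pitch between low_hz and high_hz,
    then write the result as a mono 16-bit WAV file."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0  # avoid dividing by zero for constant data
    frames = bytearray()
    for s in samples:
        # Linearly rescale the sample value into the audible pitch range.
        freq = low_hz + (s - lo) / span * (high_hz - low_hz)
        for i in range(int(rate * note_dur)):
            amp = int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / rate))
            frames += struct.pack("<h", amp)
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# Example: sonify a slow oscillation, standing in for a vibration reading.
data = [math.sin(t / 10) for t in range(100)]
sonify(data)
```

A rising reading becomes a rising tone, which is essentially what lets you "hear" the spacecraft's vibrations in the clips.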
Image source: European Space Agency
The European Space Agency's BepiColombo spacecraft took off from Kourou, French Guiana, on October 20, 2019, on its way to Mercury. To reduce its speed for the proper trajectory to Mercury, BepiColombo executed a "gravity-assist flyby," slinging itself around the Earth before leaving home. Over the course of its 34-minute flyby, its two data recorders captured five data sets that Italy's National Institute for Astrophysics (INAF) enhanced and converted into sound waves.
Into and out of Earth's shadow
In April, BepiColombo began its closest approach to Earth, ranging from 256,393 kilometers (159,315 miles) to 129,488 kilometers (80,460 miles) away. The audio above starts as BepiColombo begins to sneak into the Earth's shadow facing away from the sun.
The data was captured by BepiColombo's Italian Spring Accelerometer (ISA) instrument. Says Carmelo Magnafico of the ISA team, "When the spacecraft enters the shadow and the force of the Sun disappears, we can hear a slight vibration. The solar panels, previously flexed by the Sun, then find a new balance. Upon exiting the shadow, we can hear the effect again."
In addition to making for some cool sounds, the phenomenon allowed the ISA team to confirm just how sensitive their instrument is. "This is an extraordinary situation," says Carmelo. "Since we started the cruise, we have only been in direct sunshine, so we did not have the possibility to check effectively whether our instrument is measuring the variations of the force of the sunlight."
When the craft arrives at Mercury, the ISA will be tasked with studying the planet's gravity.
The second clip is derived from data captured by BepiColombo's MPO-MAG magnetometer, AKA MERMAG, as the craft traveled through Earth's magnetosphere, the area surrounding the planet that's determined by its magnetic field.
BepiColombo eventually entered the hellish magnetosheath, the region battered by cosmic plasma from the sun, before the craft passed through the relatively peaceful magnetopause, the boundary that marks the transition into the magnetosphere proper, the region governed by Earth's own magnetic field.
MERMAG will map Mercury's magnetosphere, as well as the magnetic state of the planet's interior. As a secondary objective, it will assess the interaction of the solar wind, Mercury's magnetic field, and the planet, analyzing the dynamics of the magnetosphere and its interaction with Mercury.
Recording session over, BepiColombo is now slipping through space silently with its arrival at Mercury planned for 2025.
Research suggests that aging affects a brain circuit critical for learning and decision-making.
As people age, they often lose their motivation to learn new things or engage in everyday activities. In a study of mice, MIT neuroscientists have now identified a brain circuit that is critical for maintaining this kind of motivation.
Researchers develop the first objective tool for assessing the onset of cognitive decline through the measurement of white spots in the brain.
- MRI brain scans may show white spots that scientists believe are linked to cognitive decline.
- Experts have had no objective means of counting and measuring these lesions.
- A new tool counts white spots and also cleverly measures their volumes.
White spots and educated guesses

The white spots, or "hyperintensities," are brain lesions: fluid-filled holes in the brain believed to have been left behind by the breakdown of blood vessels that previously nourished brain cells.

Prior to the new research, the quantity of white spots was assessed using an imprecise three-point scale indicating ascending likelihoods of dementia: a minimal number of spots was considered level 1, a medium number level 2, and a great number of them level 3.
How the new measurements were derived
Credit: sfam_photo/Shutterstock

The team of researchers from NYU Langone's Center for Cognitive Neurology (https://med.nyu.edu/departments-institutes/neurology/divisions-centers/center-cognitive-neurology) and Alzheimer's Disease Research Center (https://med.nyu.edu/departments-institutes/neurology/divisions-centers/center-cognitive-neurology/alzheimers-disease-research-center) was led by Jingyun "Josh" Chen (https://med.nyu.edu/faculty/jingyun-chen). They analyzed 72 MRI scans from a national database of older people, taken as part of the Alzheimer's Disease Neuroimaging Initiative (ADNI, http://adni.loni.usc.edu). The scans were mostly of white people over age 70, with roughly equal numbers of men and women. Some had normal brain function, some showed moderate cognitive decline, and some had severe dementia.

Without knowing each individual's diagnosis, the researchers analyzed the white spots in their scans. While the team counted each scan's lesions, the innovation they introduced was a 3D measurement of each lesion's fluid volume, derived by measuring the lesion's extent from opposite sides of the brain. Measurements of 0 milliliters (mL) were assessed for areas without white spots, with other white spots measured as containing as much as 60 mL of fluid. Chen's team predicted that volumes over 100 mL could signify severe dementia.

"Amounts of white matter lesions above the normal range should serve as an early warning sign for patients and physicians," Chen told NYU Langone Health NewsHub (https://nyulangone.org/news/white-matter-lesion-mapping-tool-identifies-early-signs-dementia).

When the team compared the likely diagnoses derived from their calculations against the individuals' medical records, they found that their predictions were correct about 7 out of 10 times.

The researchers compiled their formulas into an online tool that's available to physicians for free via GitHub (https://github.com/jingyunc/wmhs). The researchers plan to further refine and test it using an additional 1,495 brain scans representing a more diverse group of individuals from the ADNI database.
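To illustrate how a measured volume might translate into a coarse risk flag, here is a minimal sketch. The band boundaries below are hypothetical, loosely adapted from the figures reported above; they are not the tool's actual formulas and certainly not clinical guidance.

```python
def lesion_risk(volume_ml: float) -> str:
    """Classify total white-matter lesion volume (in mL) into a coarse,
    purely illustrative risk band. The 60 mL and 100 mL cutoffs echo the
    numbers mentioned in the article, not validated thresholds."""
    if volume_ml < 0:
        raise ValueError("volume cannot be negative")
    if volume_ml == 0:
        return "no lesions detected"
    if volume_ml <= 60:
        return "within the range observed in the study"
    if volume_ml <= 100:
        return "elevated"
    return "high: predicted to signify severe dementia"

print(lesion_risk(30))
print(lesion_risk(120))
```

The point of a continuous volume measurement is exactly that such cutoffs become possible at all, replacing the old "few / medium / many spots" judgment call with a number a physician can track over time.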