Design by: Swati Sheenum
“Everything’s science fiction until someone makes it science fact.”
It’s astounding how fast the world is changing.
Nearly five decades ago, American television writers Chuck Menville and Len Janson envisioned something stupendously innovative – something that would plant the seeds for immersive virtual reality technologies.
To borrow some Avengers lingo, quite simply, ‘there was an idea.’ To create a room that could be something much more. A place that could simulate literally anything. The writers envisioned the creation of entirely new, interactive environments, all within the space of this one room. A person in this setting wouldn’t be able to distinguish between the real and the holographic – every touch would feel real, and every sight would be deceiving.
Thus, Star Trek gave us the Holodeck, and the world loved what it saw.
The Holodeck has appeared many times since then, with the Star Trek showrunners using it for a variety of purposes. Some simulations recreate undertakings in science, logistics and law; others serve a more leisurely purpose – allowing Captain Jean-Luc Picard, for example, to take on the persona of one of his boyhood heroes. Whatever the purpose, all the simulations bear a striking resemblance to reality, with interactive characters and in-depth storylines.
Is this the real life?
Technology, of course, has grown by leaps and bounds in the decades since that time. Fascinatingly enough, shows like Star Trek probably played a pivotal role in these developments, and pretty soon, we might even have a real-life Holodeck!
Though the idea has been around for a good amount of time, the term ‘Virtual Reality’ was coined only in 1987, and by 1991, games and arcade machines that delivered immersive experiences using VR goggles were available to the general public. In 2010, the prototype for the first Oculus Rift headset was developed, and today, VR is taking the world by storm.
An advertisement from the 1950s describing the ‘Sensorama’
The basic idea behind VR headsets isn’t all that complicated. The user mounts the headset on their head, essentially wearing a ‘screen’ that tracks head movements. Continuous video data is fed to the headset, and lenses placed between the eyes and the screen focus and reshape the image for each eye to create a stereoscopic 3D effect.
Head-tracking systems ensure that the field of view changes as the user moves their head – the sensors involved range from magnetometers to microscopic electromechanical gyroscopes. Headphones with ‘multi-dimensional’ audio (familiar from many YouTube videos, it works by cleverly adjusting the loudness and timing of sound between the two ears to simulate direction) enhance the whole experience, and soon enough the user is hooked.
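The stereoscopic trick described above can be sketched in a few lines of code. This is a minimal illustration, not any headset’s real rendering pipeline: it assumes a flat 2-D world and places two virtual ‘eye’ cameras a pupil’s distance apart, rotated to match the head’s current yaw as reported by a tracking sensor.

```python
import math

def eye_positions(head_pos, yaw_radians, ipd=0.063):
    """Offset each virtual camera by half the interpupillary
    distance (IPD, roughly 6.3 cm on average), perpendicular to
    the viewing direction, so each eye receives a slightly
    different image -- the basis of the stereoscopic 3D effect."""
    # Unit vector pointing to the viewer's right, given head yaw.
    rx = math.cos(yaw_radians)
    ry = -math.sin(yaw_radians)
    half = ipd / 2
    x, y = head_pos
    left = (x - rx * half, y - ry * half)
    right = (x + rx * half, y + ry * half)
    return left, right

# Head at the origin, looking straight ahead:
left, right = eye_positions((0.0, 0.0), 0.0)
print(left, right)  # two cameras about 6.3 cm apart
```

Each frame, the head tracker supplies a fresh position and yaw, the two camera positions are recomputed, and the scene is rendered once per eye.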
Many VR systems also use motion-tracking technologies. These vary a great deal, but here’s one example:
A small number of ‘base stations’ are placed at the corners of a room, and these continuously sweep the room with lasers. Sensors on the headset and on the controllers in the user’s hands record the precise moment each sweep reaches them, and from this timing the system calculates their positions. In this way, it can track the position of the user’s head as well as their hands and change the scenery accordingly. The controllers can be put to further use, too – they are often programmed with synchronised vibrations to add to the experience.
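However the timing signals are obtained, the geometric core of such tracking is the same: once the distance to each of several fixed stations is known, the tracked object’s position is the point consistent with all of them. Here is a hedged 2-D sketch of that trilateration step, with made-up station coordinates; real systems work in 3-D and fuse many more measurements.

```python
def trilaterate(stations, distances):
    """Solve for the 2-D position consistent with measured
    distances to three fixed base stations. Subtracting pairs
    of circle equations (x-xi)^2 + (y-yi)^2 = di^2 cancels the
    squared unknowns and leaves a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = stations
    d1, d2, d3 = distances
    # Linearised equations: A*x + B*y = C and D*x + E*y = F.
    A, B = 2 * (x2 - x1), 2 * (y2 - y1)
    C = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    D, E = 2 * (x3 - x2), 2 * (y3 - y2)
    F = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = A * E - B * D  # nonzero when stations aren't collinear
    return ((C * E - B * F) / det, (A * F - C * D) / det)

# A headset at (1, 2), measured from stations at three corners:
stations = [(0, 0), (4, 0), (0, 4)]
dists = [5**0.5, 13**0.5, 5**0.5]
print(trilaterate(stations, dists))  # (1.0, 2.0)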
Finally, an infrared sensor inside the headset monitors eye movement to deduce exactly where you are looking. Imagine scanning a scene just by moving your eyeballs – with VR, you can, which is genuinely exciting.
However, some researchers are still not satisfied. After all, Star Trek’s revolutionary Holodeck is still a long way ahead. Having engaging conversations with holographic characters within the depths of a simulated world requires significant developments in Artificial Intelligence, and we’re not quite there yet. The sensation of touch isn’t all that real either. There is no way a pair of controllers can simulate the sensory perception of the human fingers on actual contact.
So they continue working.
CAVE2 (short for ‘Cave Automatic Virtual Environment’) is a ‘hybrid reality’ system designed by the Electronic Visualisation Laboratory (EVL) at the University of Illinois at Chicago. The lab is a little over 40 years old and is one of the oldest in the country working on computer graphics and interaction.
Theirs is an interesting setup – CAVE2 is essentially a cylindrical room lined with liquid crystal displays. The LCDs are arranged in 18 columns of four panels each, and each column is driven by a separate computer in a cluster.
Speakers are similarly spread throughout the room, positioned so that sound can seem to come from any direction. The traditionally bulky VR headset is replaced by a lightweight pair of glasses with tiny markers attached; cameras placed around the room continuously flood it with infrared light and detect these markers. The imagery then shifts according to the detected movements and – voilà! – the world moves to meet your perspective.
This, in itself, is awe-inspiring. Yet, probably the most mind-boggling of all is an idea that aims for haptic feedback – a kind of tactile feedback where one gets a sensation of force even when there’s nothing really there.
The idea is to use sound – more specifically, ultrasound, which is too high-pitched to hear. It turns out that if these waves are made strong enough, you can feel the vibrations they cause in the air, and those vibrations can be shaped in just the right way to make thin air feel like a solid object!
In 2008, researchers from the University of Tokyo used an array of ultrasonic transducers whose waves interfered in such a way that a particular point in the air felt like a solid surface. Of course, one could push through this ‘solid’ region with enough pressure, but the idea is promising. Scientists have since simulated a wide variety of surfaces in this way, and the possibilities seem limitless.
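The key to focusing those waves is phase: if each transducer is driven with an offset that exactly compensates for its distance to the target point, all the waves arrive there in step and reinforce one another. The sketch below illustrates that idea in 2-D with invented array dimensions; it is not the Tokyo group’s actual design.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
FREQ = 40_000.0         # 40 kHz, a typical ultrasonic frequency

def phase_delays(transducers, focus):
    """For each transducer position, compute the drive-phase
    offset (radians, modulo one cycle) that cancels out its
    travel-distance difference, so every wave arrives at the
    focal point in phase and the vibrations add up there."""
    wavelength = SPEED_OF_SOUND / FREQ  # about 8.6 mm
    delays = []
    for tx, ty in transducers:
        dist = math.hypot(focus[0] - tx, focus[1] - ty)
        delays.append((2 * math.pi * dist / wavelength) % (2 * math.pi))
    return delays

# A row of 11 emitters, 1 cm apart, focusing 20 cm above centre:
array = [(x / 100, 0.0) for x in range(-5, 6)]
print(phase_delays(array, (0.0, 0.2)))
```

By symmetry, emitters equally far from the focus get identical offsets; sweeping the focal point around (by recomputing the offsets) is what lets such arrays ‘draw’ different shapes in mid-air.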
The applications of VR are innumerable. The researchers at CAVE2 have designed simulations that take you through the depths of the neural connections in the brain, transport you to the rugged terrain of Mars, and even recreate actual coral reefs based on data collected by deep-sea divers.
As it stands, VR may seem a bit daunting, but the impact it could have in enhancing life is worth noting. Imagine getting to explore the ruins of the Colosseum in your history class as a sort of guided tour, or watching recreations of chemical reactions in a Chemistry class that just might be going over your head.
Captain Jean-Luc Picard once said, “There is a way out of every box, a solution to every puzzle; it’s just a matter of finding it.” We as humans have been remarkably adept at doing just that, and if we keep up our efforts, the Holodeck isn’t really that far away.
Until then, of course, the work will continue. After all, it is the lot of man to strive no matter how content he is.
Back to the Future is a series that looks at pop culture’s futuristic predictions about science and technology, and tries to break down the real-life facts behind them. Comments and suggestions are always welcome, you can send them to us at [email protected]
Series by: Amrita Mahesh