Virtuality Manifesto

Update (02/27/08): It’s been over a year since I wrote this article, and I just stumbled upon the “Virtuality Continuum,” described in a research paper authored in 1994, which strikes me as fairly close to what I imagined in the “Barthian Virtuality Gradient”. A brief reading of the paper leads me to believe that the authors, anchored to “augmented reality” as their reference point, used the axis of the VC to describe the ratio of “real” content to “virtual” content: the real world is 100% real content, a typical computer game is nearly 100% virtual content, an augmented reality game is perhaps half of each, and an MMO like World of Warcraft is very nearly all virtual content, aside from that which evolved from the players themselves and could thus be considered real. The BVG, on the other hand, deals with the placement of the window between real data and virtual data; while less linear, I think it captures an entirely different and much more interesting relationship.

I use the term virtuality to differentiate it from “virtual reality”, which implies a reality that is somehow fake and separate from “actual reality”. I envision a broader spectrum, with the goal of simply tweaking reality: meshing real, physical things with a virtual environment, bringing virtual objects into a real environment, or simply playing with virtual objects in a virtual world. The gradient between what is real and what is virtual is what defines virtuality.

For now, I use the model above, creatively named the “Barthian Virtuality Gradient”. For the most part, it conveys my feelings on the matter fairly well. Chances are it’s been thought of before, or even more likely superseded, but I put it out into the open regardless.

Realities begin at the bottom: absolute reality. Using a wargame as an example, this would be actual war, where getting shot would suck. From here, we can begin to virtualize certain aspects, or essences, of things; tanks now fire “virtual” bullets, carried by infrared light. You’re still driving a real tank, but a certain portion of it now has a virtual essence (this is quite similar to the MILES laser-tag system the US Army uses). Past this point the gradient isn’t so simple. Working backwards from a true neural representation of a completely virtual environment, we first reach sensory simulation, in which we simply reproduce the sensory inputs we would normally receive. This is where the more traditional technologies fit, such as head-mounted displays, data gloves with haptic feedback, and direct body tracking: since we can’t yet interface directly with the nerves, we throw the inputs as close to the brain as we can and attempt to block out what we consider to be reality. Working back even further, we pull back the virtual reality and leave room for more actual reality, as with CAVEs. I call this environmental virtuality, in which we modify the environment with more distinct windows into the virtual world. And thus we find ourselves back at the tank, looking out of a virtual periscope and virtual windows. It may be a real tank, or simply a mock-up of one, sitting inside an essential object and interacting with its own virtual environment. The two often mingle at this point, and with good reason.
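The rungs I just walked through can be sketched as a simple ordered scale. This is only an illustrative sketch: the names and numeric ordering are my own shorthand for this post, and the gradient itself isn’t claimed to be linear.

```python
from enum import IntEnum

class VirtualityLevel(IntEnum):
    """Rungs of the Barthian Virtuality Gradient, bottom to top.

    The numeric values are illustrative only; they encode the ordering
    described in the text, not any claim that the steps are evenly spaced.
    """
    ABSOLUTE_REALITY = 0  # actual war: real tanks, real bullets
    ESSENTIAL = 1         # real tank firing virtual (infrared) bullets
    ENVIRONMENTAL = 2     # distinct windows into the virtual world (CAVEs)
    SENSORY = 3           # HMDs, haptic gloves, direct body tracking
    NEURAL = 4            # a true neural representation (hypothetical)

def window_position(level: VirtualityLevel) -> str:
    """Describe where the real/virtual window sits at a given rung."""
    descriptions = {
        VirtualityLevel.ABSOLUTE_REALITY: "no window; everything is real",
        VirtualityLevel.ESSENTIAL: "virtual essences attached to real objects",
        VirtualityLevel.ENVIRONMENTAL: "windows placed into a real environment",
        VirtualityLevel.SENSORY: "window pushed up against the senses",
        VirtualityLevel.NEURAL: "window pushed past the senses to the nerves",
    }
    return descriptions[level]

# The essential and environmental rungs sit adjacent on the scale,
# which is where the two kinds of technology mingle.
assert VirtualityLevel.ESSENTIAL + 1 == VirtualityLevel.ENVIRONMENTAL
```

The point of the ordering is only that each rung moves the window between real and virtual data one step closer to the observer.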

Where environmental and essential technologies meet is where we need to focus, at least at our current level of technology. Too many people equate virtual reality with sensory virtuality, which is both expensive and overly complicated; that is why I try not to use the term.