After spending 9 months designing the virtual environment of the Shayd VR Installation, I want to expand and start doing a series of smaller “vignettes” that show off similar narratively-charged spaces designed specifically for virtual reality hardware. The most unique thing about Shayd is of course the VR hardware, and as a player (or user, or viewer, or whatever noun applies here), you can revel in the sensory feeling of presence within the virtual environment. The second you put on the head-mounted display, the peripheral vision and head tracking immediately provide the illusion of being entirely present on an alien planet. And the great thing about building Shayd in Unity Pro is that the graphics are pretty nice! It’s far superior to most other VR experiences to date, because few VR setups use an industry-standard game engine. Once real game engines such as Unity, Source, Unreal, or CryENGINE become the standard for VR, the quality of the content is going to escalate. I hope to continue this trend with my own work.
In the first scene, players find themselves in the Cave:
The Cave is the ideal starting point of the Shayd experience, where the user crosses over from the physical world to the virtual one almost seamlessly. Both the physical cave installation and the virtual cave are spatially identical, so the transition is smooth and it really immerses you in the mood, atmosphere, and narrative of the story. You really feel like you are a space traveller who is marooned on an alien planet.
Atmosphere and Mood in Shayd
I like to think that, at its most basic foundation, good level design provokes both a distinct atmosphere (physical sensation) and mood (emotional sensation). For instance, the atmosphere of the Cave in Shayd evokes physical feelings of cold and dampness, and the mood evokes emotional feelings of magic and awe. The mood is especially apparent with the opening soundtrack by our composer Jeremy Tisser:
Mood and atmosphere are always independent of each other, too. Jeremy could have easily written a survival horror soundtrack that evoked a mood of pounding heartbeats and chilling anticipation, while the atmosphere remained exactly the same, with water drips and cold wind. I could have arranged brown and red lighting to promote an atmosphere of warmth, fire, and dryness, while the mood still continued to embody a sense of magical discovery. Sound effects, music, lighting, textures, geometry, and countless other aspects work together to mold the perfect mood and the perfect atmosphere that set the stage for the narrative. And on top of all that, if a space can provide actual narrative exposition simply through observation of the environment, then the mise-en-scene is really doing its job. In Shayd we placed prehistoric cave paintings on the walls that, if closely examined, revealed much of the back story of the planet and illustrated the alien species you were about to encounter. Because Shayd was a fantasy experience, it was fun to examine the paintings – yet had it been a psychological thriller, those cave paintings would have been more of a foreboding omen than childlike drawings, probably along the lines of Prometheus. Just for kicks, here’s the difference:
How does a Head-Mounted Display enhance Level Design?
The atmosphere and mood of the Cave space were especially enhanced by the HMD. The screenshots above (top) of the virtual Cave environment can’t really do justice to the subjective experience of the real Shayd installation. While those screenshots represent what a first-person POV looks like on a conventional computer screen, the actual visual output sent to the head-mounted display has a barrel-warping technique applied. Due to the nature of the optics within the HMD, which presents side-by-side stereoscopic images through aspherical lenses, our lead engineer Chao Huang had to warp the side-by-side image to match the lenses, thus enhancing the illusion of seeing the world through human eyeballs. Had the image not been pre-warped, the world would have appeared unnaturally skewed. Pictured below (top-left) is what barrel warping looks like on a monitor:
The picture on the bottom-right is of the interior of the HMD, so you can get an idea of what the lenses feel like. This pre-warping technique was also used very recently by John Carmack (Id Software) to make the upcoming Doom 3: BFG Edition compatible with head-mounted displays, namely the new Oculus RIFT recently developed by Palmer Luckey. Palmer also developed the PR4 head-mount we used on Shayd – an early predecessor to the RIFT – so I expect the process to be very similar. Now that communities of developers are working hard to make the RIFT compatible with other games besides Doom 3, I expect that in the very near future all first-person shooters are going to have barrel-warping options in their graphics settings. I blabbed a lot about this already in my last post on first-person shooters, but the community at the Meant to Be Seen forums has already pre-warped Mirror’s Edge and Skyrim, and I’m sure there are many more to follow! This makes VR no longer just an installation experience, but soon to be compatible with commercial videogames as well.
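For the curious, here’s a rough sketch in Python of what this kind of radial pre-warp does. The function names and the distortion coefficients `k1` and `k2` are illustrative placeholders, not the actual values Chao used for Shayd (those depend on the specific lenses):

```python
def barrel_prewarp(x, y, k1=0.22, k2=0.24):
    # Radial distortion model: scale each normalized coordinate
    # (centered at 0,0) by (1 + k1*r^2 + k2*r^4). Sample points are
    # pushed outward, so the lens's pincushion effect cancels the
    # warp and the world looks straight again through the eyepiece.
    # k1/k2 are made-up example coefficients, tuned per lens in practice.
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale


def prewarp_side_by_side(u, v, k1=0.22, k2=0.24):
    # u, v in [0,1] across the full side-by-side frame.
    # Each eye is warped about the center of its own half of the screen.
    eye_center_u = 0.25 if u < 0.5 else 0.75
    # Normalize each half-screen to the [-1, 1] range.
    x, y = (u - eye_center_u) * 4.0, (v - 0.5) * 2.0
    xd, yd = barrel_prewarp(x, y, k1, k2)
    return eye_center_u + xd / 4.0, 0.5 + yd / 2.0
```

In a real renderer this would run per-pixel in a fragment shader rather than in Python, with the warped coordinates used to sample the flat rendered frame; the math is the same.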
Narrative Architecture and the Medium of Virtual Reality
Virtual Reality is a medium that laser-focuses on spatial storytelling, or mise-en-scene, even more so than films and videogames. Henry Jenkins calls it narrative architecture, which he outlines in his essay Game Design as Narrative Architecture:
“[Narrative architecture] creates the preconditions for an immersive narrative experience in at least one of four ways: spatial stories can evoke pre-existing narrative associations; they can provide a staging ground where narrative events are enacted; they may embed narrative information within their mise-en-scene; or they provide resources for emergent narratives.”
Although Jenkins is applying this concept to videogames, I don’t think game designers really focus on narrative architecture on the whole as much as they do gameplay. And filmmakers also have their own priorities. Generally speaking, movies are focused on camera movements within a space more than the space itself, and videogame levels tend to serve more as a backdrop that gives game mechanics some kind of context. What I mean to say is that, while spaces in films and games can be immaculate and compelling, they are not the primary focus of the respective mediums they serve.
Production designers for film and level designers for games both create incredible spaces that craft the story and immerse the viewer / player, but at the end of the day the universes they bring to life are forced to serve something else, such as a camera or a game controller. I’m really excited to be working in VR, because VR is a medium that serves nothing else, where the primary force of the narrative experience is the space itself. All in all, I think the concept of narrative architecture is much more useful to the field of virtual reality than it is to games or films, simply because VR is a medium that is inherently focused on space.
Narratively Charged Vignettes
I’m going to start experimenting with the concept of narrative architecture in VR by designing a series of maps in the Unity Engine or Source Engine that exemplify some really cool aspect of the medium. Initially I want to pull inspiration from moments in film that I found to be particularly charged with mood and atmosphere in a spatial sense. One that I really love is the New York Bar from Lost in Translation, and the view of Tokyo it provides in the context of that film:
Another big inspiration for me is the scene in Melancholia in which the planet Melancholia rises above the horizon of the ocean, silently sucking oxygen from the Earth’s atmosphere in its wake:
Music of course plays a vital role in these scenes and is just as important as the visuals. Girls by Death in Vegas is a huge part of the mood in Lost in Translation:
Just as Wagner’s Tristan und Isolde: III. Prelude is a defining melody in Melancholia:
One of my favorite spaces of all time is the Sarang space station for the film Moon. The VFX supervisor, Gavin Rothery, outlines the evolution of the set in his blog post Designing Sarang: Robotic Space-House of the Future. It’s an absolutely gorgeous environment, first developed in Maya and then brought to life for the film:
However, whether out of pure exhaustion or disinterest, Gavin says that he wasn’t that upset when his beautiful space station was reduced to a heap of wood and metal at the end of the shoot. That just goes to show the amount of respect that the film medium has for spatial storytelling. I could be wrong (and probably am), but my assumption is that production designers are accustomed to playing a supporting role as opposed to being the primary creative force, and so naturally when the film shoot is over there is little reason to hold on to the set. But just imagine the kind of VR experience that could be made from it! Here’s a CG fly-through of the set in Maya:
It would be very possible to import the exact same Maya assets into a game engine such as Unity Pro or Source, and turn it into a VR experience. Like Shayd, where you play an intergalactic space traveller who lands on an alien planet, I could easily see a “MOON: VR” where the user plays as astronaut Sam Bell coming to the end of his three-year contract on the Sarang base. It would be quite a different telling of the story, to be sure, but different mediums excel in different ways, and a virtual reality experience of this nature would be absolutely surreal.