My name is Mathieu Gollain,
and one of my tasks at COMMUDE is to look into the current technologies and techniques involved in virtual reality (VR), to see how they could serve our customers, and to prototype the ideas of the creative minds inside the company.
Coming from a systems and network engineering background, I’m not a professional game developer. However, having used an Oculus Rift DK2, followed the world of game development closely since 2002, and occasionally taken part in it, I have some knowledge of the tools involved, so this was an opportunity for both the company and me to explore the topic a bit further.
・History of virtual reality “VR”
・VR’s “sense of presence” experience
・Recommended equipment necessary for experiencing VR
・COMMUDE’s VR project
・About the equipment I tried this time
History of virtual reality “VR”
VR has long been a dream that was constantly pushed back because the hardware was not ready to deliver an enjoyable enough experience… until somebody proved that the time was right, thanks to the miniaturization of LCD screens made possible by smartphone research.
This somebody was Palmer Luckey, who founded Oculus and launched a successful crowdfunding campaign in 2012.
Oculus was later acquired by Facebook for $2 billion.
The first public model from Oculus, named Rift DK1 (for Development Kit 1), was proof that the general public wouldn’t have to wait long to experience VR, but it still wasn’t very satisfying: the screen resolution was low, the latency was too high, the head’s position wasn’t tracked in space, and there were no dedicated controllers.
The second model, the DK2, added positional head tracking and improved both the screen and the latency, but was still not consumer-ready.
When the DKs were released to enthusiasts and game developers, everybody had to learn how to do VR right: what mistakes to avoid, what techniques to implement, and so on. So for a long time, while developers were still figuring things out, players felt sick very quickly inside VR.
This “motion sickness” happens when there is a mismatch between how the brain and the inner ear experience reality and how the eyes view VR content: if the eyes see the world moving but the brain doesn’t feel it, this produces a phenomenon akin to seasickness, but in reverse (with seasickness, the boat moves, so the inner ear feels it, but the eyes see that the inside of the boat is static, which causes a perception dissonance).
Even a small latency can induce motion sickness, because the brain needs everything to be correctly synchronized, so performance is far more important than image quality.
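To put a number on how tight this performance requirement is: current PC headsets such as the HTC Vive and Oculus Rift refresh at 90 Hz, while a typical desktop monitor runs at 60 Hz. A minimal sketch of the resulting per-frame time budget (the refresh rates are the published specs; everything else here is simple arithmetic for illustration):

```python
# Frame-time budget: how many milliseconds the engine has to render
# one frame before the headset misses its refresh and the image stutters.
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for name, hz in [("Desktop monitor, 60 Hz", 60), ("HTC Vive / Oculus Rift, 90 Hz", 90)]:
    print(f"{name}: {frame_budget_ms(hz):.1f} ms per frame")
# → Desktop monitor, 60 Hz: 16.7 ms per frame
# → HTC Vive / Oculus Rift, 90 Hz: 11.1 ms per frame
```

In other words, a VR title has roughly a third less time per frame than a 60 fps desktop game, and it has to render the scene twice (once per eye), which is why developers trade image quality for consistent performance.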
VR’s “sense of presence” experience
The main goal of VR is called “Presence”, which is what happens when you feel like you really are in the virtual world and temporarily forget that you’re still physically in the real world.
Presence was nearly impossible to reach with the DK1, rare but possible with the DK2, and the latest models (the Oculus Rift commercial release, HTC Vive, PlayStation VR, …) can quickly induce Presence with correctly developed games and experiences.
For now we’ll focus on VR on computers, but other hardware (mobile phones and consoles) could follow.
Recommended equipment necessary for experiencing VR
VR, for both development and play, requires recent hardware, which can be a costly investment for home users (107,784円 for the HTC Vive with two tracked controllers, $599 for the Oculus Rift with a remote, plus a VR-ready computer which could cost up to 150,000円).
For now, Apple computers are not supported (Apple is technologically too far behind, according to Oculus) and very few PC laptops, if any, are up to the task.
The latest versions of Windows (7 and higher) are therefore currently the primary targets.
Recent advances in graphics card technology (both in computing power and in drivers) make it possible to enjoy higher-quality content in better conditions, particularly with the latest generations (for NVIDIA, this is currently the GeForce GTX 10 series).
The previous generation (like the GTX 970) still offers a satisfying experience at a lower price, but if you’re planning to upgrade, the latest generation should be prioritized.
You should first look at the recommended hardware for the games you’re planning to play or the experiences you’re interested in: for instance the free Star Wars “Trials on Tatooine” experience recommends a GeForce GTX TITAN X (but it runs very well on a GTX 1080).
Depending on the model and the game or experience, VR can be experienced in two different modes: seated, or at room scale (where you can walk around the virtual space within a pre-configured, restricted area).
COMMUDE’s VR project
To begin our research, we wanted to start with a simple task: recreate one floor of our office to be able to view it in VR. We had two ways of doing it: full 3D reconstruction or photogrammetry.
Our attempts at photogrammetry produced meshes that were not accurate enough and would have had to be heavily altered to be usable, and the result would have been far too restrictive for us to add interaction with the world. But before really starting, we had to choose both the hardware and the software we’d use.
About the equipment I tried this time
We chose the HTC Vive because it allows for both use cases (the Oculus Rift currently targets seated gaming only), already supports tracked controllers (which is still not the case with the Oculus Rift at the moment), has quickly gained a huge following in gaming and corporate use, has started opening its tracking technology to other uses, and, more importantly, was available immediately (we bought it in a nearby store).
This is not to say that the main PC competitors, such as the Oculus Rift, are seriously lacking in comparison, but as the saying goes, “who can do more can do less”.
Although many people think a term containing “game” refers to something trivial or can only serve trivial goals, current game engines are flexible enough to also be used in architecture, design, motion pictures, various kinds of art, live performances, prototyping, and so on, and, more importantly for us, VR experiences.
Unity and Unreal Engine have become the primary choices among both amateurs and professionals for a few years now, with other engines for edge cases or complete in-house development.
Our choice ended up being Unreal Engine 4, for its cost (free to use, with lots of free assets from its developer and on the Internet), its ease of use and learning (lots of official and unofficial video tutorials, and the possibility to prototype quickly using graphical programming), its popularity, its extensive list of supported platforms, and the ability to access its full C++ source code for free and modify it if needed.
Middleware and complementary software:
Game engine editors don’t integrate everything needed to create a product.
For instance there is no 3D modeling tool, no texture creation tool and no development environment.
Our choice for 3D modeling was Blender, a free and open-source tool that allows for very advanced workflows. For a very long time it had the reputation of being hard to learn, but recent releases have become more user-friendly.
Another product we’re evaluating is the increasingly-used software Substance Designer, which lets us produce textures based on combinations of (more or less) simple instructions through a procedural workflow.
Materials contain more than just one texture: for instance, a simple white painted wall could have a flat “whitish” color as its main texture, but also a “noisy” normal map (a relief indicator) so that the game engine can render small details on it, giving it a more realistic look through shader calculations on the graphics chip.
More complex surfaces may have many more components: besides the base texture and the normal map, the typical PBR (Physically-Based Rendering) workflow of Unreal Engine 4 (and Unity 5) lets us use a layer to indicate which parts are metallic, another for the reflectivity, another for occluded sections, and so on; Substance Designer can speed up this process by a huge margin. We should keep in mind, though, that performance is key for VR, so we have to limit the rendering complexity.
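As a rough illustration of why the number of texture layers matters for performance, here is a back-of-the-envelope estimate of the GPU memory one material can consume. The channel list and the 2048×2048 resolution are illustrative assumptions, not our actual project settings, and real engines use mipmaps and compressed formats that change these numbers considerably, so treat this strictly as an upper bound:

```python
# Rough, uncompressed texture-memory estimate for one PBR material.
# Assumes 8-bit RGBA storage (4 bytes per texel), no mipmaps and no
# compression; real engines optimize all of these.
BYTES_PER_TEXEL = 4  # RGBA8

def texture_bytes(width: int, height: int) -> int:
    """Uncompressed size of one texture map, in bytes."""
    return width * height * BYTES_PER_TEXEL

# Illustrative PBR channel list (one map per channel).
channels = ["base color", "normal map", "metallic", "reflectivity", "occlusion"]
size = 2048
total = len(channels) * texture_bytes(size, size)
print(f"{len(channels)} maps at {size}x{size}: {total / 2**20:.0f} MiB uncompressed")
# → 5 maps at 2048x2048: 80 MiB uncompressed
```

Multiply that by the dozens of materials in a scene and it becomes clear why texture budgets, and the rendering cost of sampling all those layers every frame, have to be watched closely in VR.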
For now we don’t need a development environment, but on Windows Unreal Engine 4 is designed to integrate with Microsoft’s Visual Studio (the currently supported version is Visual Studio 2015).
To end up with a computer we could use for a long time we knew we would need a powerful graphics card (GeForce GTX 1080), a speedy processor for various intensive tasks such as light computation (Intel Core i7 6700K), and a lot of memory to be able to use multiple memory-hungry applications in parallel (32GB DDR4).
For storage, we bought a 250GB SSD for the OS and the software, and a 3TB HDD to store bigger or less frequently used data.
To support such hardware while minimizing heat and noise and keeping the case neat, we invested in a high-quality modular power supply, an all-in-one liquid cooling unit, and a computer case that allows cables to pass behind the motherboard tray. If needed, we can still add fans later.
To save time and money, we built the computer ourselves.
The resulting machine cost less than what a major electronics reseller was offering us, and we could choose higher-quality components.