A project developed in DH2413 Advanced Graphics and Interaction at KTH.
Dragon Trainer is a co-op VR experience in which you and a friend play as a dragon and its rider, exploring a vast landscape, flying through rings, and breathing fire at sheep. Mount our custom-built controller platform and steer the experience together. The dragon controls speed by flapping its wings (your arms) and shoots fire through its mouth; the rider controls pitch with their controller; and together you control roll by tilting left and right on the platform. By steering cooperatively, we provide an experience like no other, immersing the players in once-in-a-lifetime fun.
We had many goals for this project. We wanted to sell (in some sense) the illusion of flying, either by having the players' feet dangle or by tilting the platform they were on. We chose the latter because it worked for both players, and because mounting something tall on a moving board that someone sits on while wearing a VR headset could be quite dangerous.
What motivated us to create this experience was partly that it sounded silly, but also that it was unique and could become a fun, immersive co-op experience with interesting controls. We were also motivated by the chance to develop a VR experience: some of us had developed in VR before, while others had little to no experience. We all thought it sounded fun and were eager to try different technologies with it, e.g. procedural animation, procedural generation, fire shaders, boids, and much more.
We decided to use the VIVE headset because of the trackers provided and the tracking sensors we could set up in the room. Outside-in tracking meant the trackers did not need to be in the player's view, so we could, for instance, track the wings and flapping while the user looked forward.
The physical part of the project was partly inspired by the Birdly project (see Related Work below) and partly by our own imagination. In our previous project we had not experimented much with the interaction aspect, so this time we wanted to go wild. Players of Dragon Trainer use their whole bodies to interact with the gameplay; tilting, flapping, steering, and opening/closing their mouths makes the experience fun and at times a little challenging. The tilting board is not only there to make the experience challenging: we also hypothesized that physically tilting while tilting in VR could reduce the cybersickness that easily arises when your senses do not match what is happening in virtual reality.
The game world is effectively infinite because every part of it is procedurally generated: new areas are created as the player moves, and you never reach an edge because the terrain is computed on demand rather than stored. We chose procedural terrain generation because freedom is central to the experience, and reaching the end of the world breaks that feeling; since hand-crafting an infinite world is of course impossible, procedural generation was the clear choice. The terrain itself is generated by sampling Perlin noise, and everything else is placed randomly afterwards based on features of the ground. To hide the trick of creating terrain as the player moves, we also added distance fog, which blends the world into the color of the sky based on how far away something is.
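The core ideas above (octave-summed noise for terrain height, plus a linear distance-fog blend) can be sketched as follows. This is a minimal Python illustration, not our Unity code: the lattice hashing, octave count, and fog distances are all hypothetical, and we use simple value noise as a stand-in for Perlin noise.

```python
import math

def value_noise(x, z, seed=0):
    """Smoothly interpolated lattice noise in [0, 1] (stand-in for Perlin)."""
    def lattice(ix, iz):
        # Deterministic pseudo-random value per integer lattice point.
        h = (ix * 374761393 + iz * 668265263 + seed * 974711) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
        return (h & 0xFFFF) / 0xFFFF

    x0, z0 = math.floor(x), math.floor(z)
    tx, tz = x - x0, z - z0
    # Smoothstep fade so the interpolation has no visible grid creases.
    fx, fz = tx * tx * (3 - 2 * tx), tz * tz * (3 - 2 * tz)
    a = lattice(x0, z0) + fx * (lattice(x0 + 1, z0) - lattice(x0, z0))
    b = lattice(x0, z0 + 1) + fx * (lattice(x0 + 1, z0 + 1) - lattice(x0, z0 + 1))
    return a + fz * (b - a)

def terrain_height(x, z, octaves=4, base_freq=0.01, amplitude=40.0):
    """Fractal (octave-summed) noise: large hills plus finer surface detail."""
    height, freq, amp, norm = 0.0, base_freq, 1.0, 0.0
    for _ in range(octaves):
        height += amp * value_noise(x * freq, z * freq)
        norm += amp
        freq *= 2.0   # each octave doubles frequency...
        amp *= 0.5    # ...and halves amplitude
    return amplitude * height / norm

def fog_factor(distance, fog_start=300.0, fog_end=600.0):
    """0 = no fog, 1 = fully sky-colored; the blend hides chunk pop-in."""
    t = (distance - fog_start) / (fog_end - fog_start)
    return max(0.0, min(1.0, t))
```

Because the height is a pure function of position and seed, any chunk can be regenerated identically whenever the player returns to it, which is what makes unloading and reloading chunks invisible.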
The fire effect was created with the VFX Graph in Unity. At first we developed it based on a couple of papers, using compute shaders and a lot of code, but that quickly became too complicated for the effect we were after and the skill set we had. We therefore recreated it using Unity's VFX Graph and Shader Graph tools. This produced a nice effect without taking too much time, which freed us up to work on other things. Shader Graph was also used for the dissolve effect that is triggered when sheep are hit by the fire.
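A Shader Graph dissolve typically compares per-pixel noise against an animated threshold. A minimal CPU-side Python sketch of that idea (hypothetical values, not our actual shader, which runs per-pixel on the GPU):

```python
def dissolve_alpha(noise_value, threshold, edge_width=0.05):
    """Dissolve logic: pixels whose noise falls below the animated threshold
    are cut away; a thin band near the threshold forms a burning edge.
    Animating `threshold` from 0 to 1 over time dissolves the whole mesh."""
    if noise_value < threshold:
        return 0.0                      # fully dissolved (discarded pixel)
    if noise_value < threshold + edge_width:
        return 0.5                      # edge band (tinted/emissive in-shader)
    return 1.0                          # fully opaque
```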
Sheep herds follow a Boid system to recreate realistic flocking behavior when being hunted, regrouping, or avoiding water. It was important to optimize the algorithm to work under procedural terrain generation, where fairly large sheep herds are also spawned in and out of existence.
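The three classic Boid forces (separation, alignment, cohesion) can be sketched like this. It is a minimal 2D Python illustration rather than our Unity implementation, and the weights and radius are hypothetical tuning values:

```python
import math

class Boid:
    def __init__(self, x, z, vx=0.0, vz=1.0):
        self.pos = [x, z]
        self.vel = [vx, vz]

def boid_steering(boid, neighbors, sep_w=1.5, ali_w=1.0, coh_w=1.0, sep_radius=2.0):
    """Combine separation, alignment, and cohesion into one steering vector."""
    if not neighbors:
        return [0.0, 0.0]
    sep = [0.0, 0.0]; avg_vel = [0.0, 0.0]; center = [0.0, 0.0]
    for other in neighbors:
        dx = boid.pos[0] - other.pos[0]
        dz = boid.pos[1] - other.pos[1]
        dist = math.hypot(dx, dz)
        if 0 < dist < sep_radius:      # separation: push away from close flockmates
            sep[0] += dx / dist
            sep[1] += dz / dist
        avg_vel[0] += other.vel[0]     # alignment: match the average heading
        avg_vel[1] += other.vel[1]
        center[0] += other.pos[0]      # cohesion: steer toward the flock center
        center[1] += other.pos[1]
    n = len(neighbors)
    ali = [avg_vel[0] / n - boid.vel[0], avg_vel[1] / n - boid.vel[1]]
    coh = [center[0] / n - boid.pos[0], center[1] / n - boid.pos[1]]
    return [sep_w * sep[0] + ali_w * ali[0] + coh_w * coh[0],
            sep_w * sep[1] + ali_w * ali[1] + coh_w * coh[1]]
```

Fleeing from fire or avoiding water fits the same pattern: each is just another weighted force added to the sum, which is what made the behaviors easy to layer.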
Fig. Setup Diagram
Fig. Board Setup
Hover Broom is an AGI project from 2016 that was one of our first inspiration points. We wanted to create a similar experience, but with people's feet dangling to immerse them even more. In the end we steered away from this idea, but it was still a good starting point.
We looked into cybersickness and also discussed with our teachers in the course to find ways of mitigating it. For example, we looked into papers that occlude the peripheral vision during large movements in VR in order to reduce cybersickness. Even though we did not implement this particular solution, such papers helped us think of good solutions while developing the project.
When we were ideating for the project we had many ideas: "What if we hang someone from the ceiling? What if we have their feet dangling from high bar chairs? Can we make a platform tilt by laying a board on an inflatable mattress?" The last question describes almost the final product, and we got some of the inspiration from the Birdly project, which immerses its players in a flying VR experience on a moving platform.
When setting up the Boid forces acting on the sheep, an implementation guide was helpful. The guide itself is based on the standard Boids algorithm by Reynolds, author of the seminal Boids paper "Flocks, herds and schools: A distributed behavioral model".
One big lesson we learned during this project was that a lot of tweaking and testing goes into making a VR experience fun but not cybersickness-inducing. We put a lot of time into testing and tweaking the controls, making them realistic, not too fast, but still smooth and fun. We learned that cybersickness is often going to be a problem, but that the experience itself heavily affects it. There are also big differences in how sick users become depending on their prior experience with virtual reality.
Another lesson we learned during the project was that syncing and managing networked objects is a lot of work. We had problems related to high fps, weird connection issues, and a lot of fun bugs. Adding new functionality, such as boid behaviour for the sheep, was never trivial, since it had to be synced across three instances of the project (dragon, rider, and spectator).
Furthermore, a challenge we had was working together in Unity as a team. Even though we are used to using Git, there were problems when working in the same scene in Unity. It worked out in the end, but some manual labour was needed when we merged branches. If we were to do it again, we would probably instantiate every GameObject from scripts instead of adding them directly to the scene. We would also use Git LFS (Large File Storage) to handle the larger assets.
Procedural Animation: Worked with Unity's animation framework and achieved realistic procedural wing animations for the dragon.
Multiplayer System: Worked with Unity's networking system and achieved synchronized procedural animation and terrain generation.
VR Interaction System: Worked with Unity and SteamVR and used VIVE trackers for real-time dragon flight control and for mapping arm movement to the procedural wing animation.
Dragon Flight Control: Worked with a custom physics-based flight control system that allowed for realistic dragon flight.
I worked on the fire system using shaders. I first experimented with compute shaders but ended up using Shader Graph and VFX Graph to create the fire effect. I also created the OpenCV system that recognizes the player's mouth opening and closing and connected it to Unity via UDP. Besides that, I created sound effects, the UI, the fire collision with sheep, and the dissolve shader applied when sheep are hit by fire. Beyond the digital parts of the project, I also put a lot of time into experimenting with and building the physical parts, making the platform comfortable and easy to lie/sit on and steer.
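The OpenCV-to-Unity link can be sketched as a tiny UDP sender. This is a hypothetical illustration: the port, message format, and function names are made up, and the actual OpenCV landmark detection is replaced by a placeholder value.

```python
import json
import socket

# Port and message format are hypothetical, for illustration only.
UNITY_HOST, UNITY_PORT = "127.0.0.1", 5065

def send_mouth_state(sock, mouth_open, aspect_ratio):
    """Send one mouth-state update to Unity as a small JSON datagram."""
    msg = json.dumps({"open": mouth_open, "ratio": round(aspect_ratio, 3)})
    sock.sendto(msg.encode("utf-8"), (UNITY_HOST, UNITY_PORT))

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# In the real system, `mouth_open` would come from OpenCV facial landmarks
# (e.g. a mouth aspect ratio exceeding a tuned threshold).
send_mouth_state(sender, True, 0.62)
sender.close()
```

UDP suits this kind of per-frame state stream well: a lost datagram is simply replaced by the next frame's update, so no retransmission logic is needed on the Unity side.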
I first worked mostly on the Unity project setup, getting VR working in the project and establishing the general structure. However, my main responsibilities throughout the project were the procedural world generation and the flying physics. The world generation work mainly included chunk loading and unloading, terrain shape through Perlin noise, and general effects such as distance fog. For the dragon movement, I integrated an airplane controller and shaped it into realistic but controllable dragon movement.
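A simplified version of the kind of flight controller we tuned might look like the sketch below. The constants and structure are hypothetical, not our actual Unity component; the point is the shape of the model: flap impulses add thrust, drag bleeds speed off, and pitch/roll inputs are smoothed rather than applied raw.

```python
import math

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

class DragonFlight:
    """Minimal arcade flight model: flapping adds speed, smoothed pitch and
    roll make the dragon feel heavy but controllable, and banking turns."""
    def __init__(self):
        self.speed = 10.0          # forward speed (m/s)
        self.pitch = 0.0           # radians, nose up/down (rider's controller)
        self.roll = 0.0            # radians, from the board tilt

    def update(self, dt, flap_impulse, pitch_input, roll_input):
        # Flapping (wing-tracker motion) adds speed; drag bleeds it off.
        self.speed += 8.0 * flap_impulse * dt
        self.speed -= 0.4 * self.speed * dt          # linearized drag term
        self.speed = clamp(self.speed, 5.0, 40.0)    # never stall, never run away
        # Smooth inputs toward their targets instead of applying them raw,
        # which we found important for reducing cybersickness.
        target_pitch = clamp(pitch_input, -1.0, 1.0) * math.radians(35)
        target_roll = clamp(roll_input, -1.0, 1.0) * math.radians(45)
        self.pitch += (target_pitch - self.pitch) * min(1.0, 3.0 * dt)
        self.roll += (target_roll - self.roll) * min(1.0, 3.0 * dt)
        # Banking converts roll into a turn, airplane-controller style.
        yaw_rate = math.sin(self.roll) * 0.8
        return yaw_rate
```

Clamping the speed range and low-pass filtering the inputs were the kinds of knobs that consumed most of our tuning time: raw tracker input is far too jittery to drive a camera directly.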