Voyage: a collaborative mobile VR experience

Project Voyage explores collaborative mobile VR in a classroom setting. We networked 20 Google Pixel phones together and brought the whole class on a virtual field trip. I pitched this project with my friends at the Entertainment Technology Center.

Since not many people had done this before, there were a lot of unknowns going into this project. Through our experience we wanted to identify and solve as many problems as possible with putting multiple people into VR at the same time, specifically in a classroom setting. To do this we worked with students and teachers at the Cornell School to create an immersive experience with a group activity around the topic of the deciduous forest biome. The school kindly provided us with their devices and class time, and we were able to playtest our experience throughout the semester.

The students and the teacher can see and interact with each other in the virtual world. The teacher has full control of the experience and can monitor what the students are doing.

To learn more about our project, please visit our website: http://www.etc.cmu.edu/projects/voyage/


My Role: 3D Generalist, Technical Artist

  • Made most of the assets, including rigged animals and the environment

  • Two seasons (spring and autumn) with different texture sets
  • 3D software used: Maya (modeling, lightmap baking, rigging), Substance Painter & Photoshop (texturing)

  • Platform: Unity, Google Pixel, Google Daydream

  • Solved performance issues related to mobile VR

Personal Challenges

Maya – Substance Painter – Unity Art Pipeline

Before this project, I had always wondered about the best practices for an art pipeline between Maya and Unity. This time I had the luck to go through the pipeline end to end and learned a lot of dos and don'ts in the process.

I also learned Substance Painter and the PBR workflow to speed things up, though PBR was not used in this project.

A lot of mistakes were made. For example, I tried Maya's reference system so that I could assemble the models together while still having the ability to change them later. However, I lost materials several times, and it made things too complicated for a one-person workflow. Also, on import into Unity, the references were lost and the models were separated. Because of this we could only swap the model of the scene as a whole, but we had a lot of scripts on individual objects, which would be lost every time we changed the model. We figured out some ways to work around this (e.g. importing a group of pieces as a prefab and replacing the corresponding ones in the big assembly), but it was a real pain to keep the Maya scene and the Unity scene consistent.

The better workflow is to use Unity’s prefab system and assemble things in Unity.
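
As a rough illustration of how scripts can survive model swaps, here is a minimal sketch (the `ModelSlot` component and its fields are hypothetical, made up for this example, not what we actually shipped): keep gameplay scripts on a stable root object and instantiate the imported model as a child, so re-importing the FBX never destroys them.

```csharp
using UnityEngine;

// Hypothetical sketch: a stable root that holds the scripts, with the
// imported Maya model attached as a replaceable child.
public class ModelSlot : MonoBehaviour
{
    public GameObject modelPrefab;   // prefab created from the imported FBX
    GameObject instance;

    void Start()
    {
        Refresh();
    }

    public void Refresh()
    {
        if (instance != null) Destroy(instance);
        // Instantiate the latest version of the model under this root;
        // scripts, colliders, etc. live on the root and survive the swap.
        instance = Instantiate(modelPrefab, transform);
    }
}
```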

Working with Hardware Limitations – Low Poly

We were working with the Google Pixel 1, and its rendering power was lower than we had expected. The first thing we did was an art test. We found that the Pixel could only handle around 75K triangles in view (it could reach up to 100K, but that quickly heated up the device and drained the battery). It was also bad at handling alpha: textures with transparency both slowed it down and produced weird outlines. However, we had a forest to make, so everything had to be low poly.

Our art style evolved from left to right. We noticed that in VR, people usually see a tree from below, so we put detailed textures on the underside of the canopy. We also made leaves fall from the trees so the students could make the connection and identify each tree. From far away, the trees kept their low-poly look.

Then I made the animals accordingly. I studied animal anatomy to get the models and rigs right. Here are some examples; rigging the hawk on a low-poly model was a really hard task, and as you can see, it is still imperfect.

Working with Hardware Limitations – Baking Shadows and Reducing Draw Calls

To optimize performance on the Google Pixel, we used forward rendering with MSAA and real-time shadows disabled. As a consequence, we had to bake shadows if we wanted the scene to look more realistic.
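
For reference, here is a minimal sketch of that quality setup as a hypothetical helper component (the same settings can just as well be configured in Unity's Project Settings; the exact values below are illustrative):

```csharp
using UnityEngine;

// Hypothetical helper: apply mobile VR quality settings at startup.
public class MobileVRQuality : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.antiAliasing = 4;                // 4x MSAA pairs well with forward rendering
        QualitySettings.shadows = ShadowQuality.Disable; // no real-time shadows; bake them instead
        QualitySettings.pixelLightCount = 1;             // keep per-pixel lights to a minimum
    }
}
```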

Instead of using Unity's own lightmapping, I did several tests and decided to bake shadows and ambient occlusion in Maya and composite them together in Photoshop. This gave me more artistic control.

After several playtests, as our experience became more complete, we found that device heating became a big issue: the Pixel would lag if it heated up too much. The fix was to reduce draw calls, which are usually among the most expensive parts of a mobile application.

After some research, I found there are several ways to reduce draw calls:

  • Texture atlases, which was too late for us to adopt at that point
  • In code, do not make an instance of a material: use sharedMaterial instead (see the sketch after this list)
  • Mark objects as Static and use static/dynamic batching
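
As an illustration of the last two points, here is a hedged sketch (the `SeasonTint` component and its fields are made up for this example):

```csharp
using UnityEngine;

// Hypothetical example: tint all the trees for autumn without breaking batching.
public class SeasonTint : MonoBehaviour
{
    public Renderer[] trees;   // renderers that all share one tree material
    public Color autumnTint = new Color(0.9f, 0.55f, 0.2f);

    void Start()
    {
        foreach (var tree in trees)
        {
            // BAD: accessing .material silently clones the material,
            // so every tree gets its own instance and its own draw call.
            // tree.material.color = autumnTint;

            // GOOD: .sharedMaterial edits the single material asset that
            // all the trees reference, so they can still be batched.
            tree.sharedMaterial.color = autumnTint;
        }

        // Static batching can also be applied from code for objects that
        // never move (normally we just ticked "Static" in the Inspector).
        StaticBatchingUtility.Combine(gameObject);
    }
}
```

One caveat: editing sharedMaterial changes the material asset itself, so in the editor the change persists after you exit play mode.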

We used Daydream's Performance Hub to monitor heating and Unity's Frame Debugger to monitor draw calls. The draw call count dropped a lot after these modifications, and we were able to run our experience for 40 minutes, which is about the length of one class.