
Case study – Night Eye

Here comes my annual blog post 😛   I should write more posts, but I got too busy/lazy.

Recently I was invited to take part in this year's Christmas Experiments. Along with my colleague Clement and my friend Bertrand, we came up with the idea of using abstract lines to recreate the shapes of the animals in the forest.

You can check out the experiment here:

Also, if you happen to have an HTC VIVE, give it a try in the latest Chromium build; it's also a WebVR project.
In this project Clement took care of the line animation while I focused on the environment and the VR part. The following sections are the case study of my part.


Initial Designs

Here are some pictures of the initial designs with different colour themes:




Reflection Matrix

The idea started with my experiments with reflections. I've always wanted to understand how to create a proper reflection and had failed so many times. But then I found a really good tutorial on YouTube that walks through the process step by step. I highly recommend having a look if you are interested in implementing the reflection yourself. The tutorial is in Java, but it covers all the concepts and explains them clearly, plus the shaders don't change (too much).
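As a rough sketch of the idea (my own summary, not the tutorial's code): you render the scene a second time from a camera mirrored across the water plane, and use that second render as the reflection texture. Assuming the mirror sits at y = 0, the mirroring itself is just a sign flip on Y:

```javascript
// Reflection across the plane y = 0 (assumed water height), as a
// column-major 4x4 matrix, the layout WebGL expects. Multiplying the
// camera's world matrix by this gives the mirrored camera used for
// the reflection pass.
function reflectionMatrixY0() {
  return new Float32Array([
    1,  0, 0, 0,
    0, -1, 0, 0,
    0,  0, 1, 0,
    0,  0, 0, 1,
  ]);
}

// The same reflection applied to a single point [x, y, z].
function reflectPointY0(p) {
  return [p[0], -p[1], p[2]];
}
```

In practice you apply this reflection to the camera before computing the view matrix for the reflection pass, so everything above the water is drawn as if seen from below.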

The only problem I had following this tutorial is the clipping plane, which I don't think WebGL supports (please correct me if I am wrong). So I ended up just using discard to do the simplest clipping. I also found another really good presentation about rendering reflections in WebGL; it mentions other ways to clip, so you could have a look:
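The discard trick itself lives in the fragment shader of the reflection pass. A minimal sketch (the uniform and varying names here are illustrative, not from the project):

```javascript
// Instead of a hardware clip plane, throw away any fragment below the
// mirror plane during the reflection pass. (Sketch only; uClipY and
// vWorldPosition are names I made up for illustration.)
const reflectionFragmentShader = `
  precision highp float;
  varying vec3 vWorldPosition;
  uniform float uClipY; // height of the water plane

  void main() {
    // The simplest clipping: discard fragments under the mirror plane.
    if (vWorldPosition.y < uClipY) {
      discard;
    }
    gl_FragColor = vec4(1.0);
  }
`;
```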



In order to get the best positions and the right angles for the animals, we created a simple editor to place the animals and tweak the camera angles. It took a little extra time to build, but it saved us a lot of tweaking time; it's always easier when you can visualise your settings live. Once we had selected the positions and camera angles in the editor, we just exported a big JSON into the project and it was done.
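For illustration, the exported settings could look something like this (a hypothetical shape; the actual field names in the project may differ):

```javascript
// Hypothetical example of editor-exported scene settings: one entry
// per animal plus the camera for that step. Illustrative only.
const sceneSettings = {
  animals: [
    { name: 'deer', position: [1.2, 0, -3.5], rotation: 0.6 },
    { name: 'fox',  position: [-2.0, 0, -1.8], rotation: -1.1 },
  ],
  camera: { position: [0, 1.6, 0], target: [0, 1.0, -3.0] },
};
```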



In this project we wanted to try the latest WebVR API, which is really amazing! They have made it really simple to implement. The first step is to get the VRDisplay and set up the frame data holder:

navigator.getVRDisplays().then((displays) => {
    vrDisplay = displays[0];
});
frameData = new VRFrameData();

Then in the loop you can get the data with:

vrDisplay.getFrameData(frameData);

The rendering becomes really simple too: WebVR now returns the view matrix and the projection matrix of both eyes to you.

setEye(mDir) {
    this._projection = this._frameData[`${mDir}ProjectionMatrix`];
    this._matrix = this._frameData[`${mDir}ViewMatrix`];
}

You can just pass them into your shader and you are ready to go. No need to set up the eye separation, no need to calculate the projection matrix; it's as simple as that. And the code becomes really clean too: set the scissor, set the camera, render, and it's done.

const w2 = GL.width/2;

//	get VR data

//	left eye
scissor(0, 0, w2, GL.height);

//	right eye
scissor(w2, 0, w2, GL.height);
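The two scissor calls above split the canvas into side-by-side viewports, one per eye. The split can be sketched as a small pure function (the name is mine, for illustration):

```javascript
// Compute the left/right eye scissor rectangles for side-by-side
// stereo rendering: each eye gets half the canvas width.
function stereoScissors(width, height) {
  const w2 = width / 2;
  return {
    left:  { x: 0,  y: 0, width: w2, height },
    right: { x: w2, y: 0, width: w2, height },
  };
}
```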


The next step is to present to the VR headset, which they have made really simple too:

vrDisplay.requestPresent([{ source: canvas }])

Then at the end of your render call, add:

vrDisplay.submitFrame();

Then it’s on.

However, there is one more thing to do, but it's a simple one: you'll need to use vrDisplay.requestAnimationFrame instead of window.requestAnimationFrame in order to get the right frame rate.

The WebVR API is really awesome and easy to use. There are a couple of things to check, but I'm pretty sure you can group them all into one tool class. Here is a simple checklist for you:

  • Matrices : View matrix / Projection Matrix
  • Scissor for Stereo Rendering
  • VR frame rate
  • Present mode for VR
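As a sketch of what such a tool class could look like (the class and method names are hypothetical; it assumes the WebVR 1.1 surface used above: getFrameData, requestPresent, submitFrame and the display's own requestAnimationFrame):

```javascript
// A minimal helper grouping the checklist into one class. Sketch only.
class VRHelper {
  constructor(display, frameData) {
    this._display = display;      // a VRDisplay
    this._frameData = frameData;  // e.g. new VRFrameData()
  }

  // VR frame rate: schedule on the display, not on window.
  requestAnimationFrame(cb) {
    return this._display.requestAnimationFrame(cb);
  }

  // Matrices: refresh the pose, then read per-eye view/projection.
  update() {
    this._display.getFrameData(this._frameData);
  }

  getEyeMatrices(eye) { // eye is 'left' or 'right'
    return {
      view: this._frameData[`${eye}ViewMatrix`],
      projection: this._frameData[`${eye}ProjectionMatrix`],
    };
  }

  // Scissor for stereo rendering: left half or right half of the canvas.
  getScissor(eye, width, height) {
    const w2 = width / 2;
    return { x: eye === 'left' ? 0 : w2, y: 0, width: w2, height };
  }

  // Present mode: mirror the canvas into the headset.
  present(canvas) {
    return this._display.requestPresent([{ source: canvas }]);
  }

  submitFrame() {
    this._display.submitFrame();
  }
}
```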

And don't forget to check out the examples from https://webvr.info/; you'll find everything you need to get started there.


After rendering, the next step for us was to implement the controls. The interaction of our project is simple: press a button to go to the next step, and press another button to drag the snow particles with your hand. We are using the Gamepad API with WebVR, and it's really straightforward. Start with:

navigator.getGamepads();

to get your gamepads. You might get multiple gamepads, so do a check to get the one you want. After this, the position and orientation are in gamepad.pose, and the button states are in gamepad.buttons. These are everything you need to create the interactions.
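Putting it together, reading a controller each frame could look like this (a sketch; readController is my own illustrative helper, and the pose/buttons fields come from the Gamepad Extensions spec):

```javascript
// Pick the first VR controller from the Gamepad list and read its state.
// Slots in the list may be null, and non-VR pads have no pose data.
function readController(gamepads) {
  const pad = Array.from(gamepads).find((g) => g && g.pose);
  if (!pad) return null;
  return {
    position: pad.pose.position,        // [x, y, z] or null
    orientation: pad.pose.orientation,  // quaternion [x, y, z, w] or null
    pressed: pad.buttons.map((b) => b.pressed),
  };
}

// In the render loop you would call:
// const state = readController(navigator.getGamepads());
```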



It has been a lot of fun to work on this project with friends, and a good challenge too, learning and using the latest WebVR API. Again, as I mentioned, they've made the API so easy to use, and I recommend everyone give it a try. I am really surprised by it, and also by how little time it took me to convert my old projects to WebVR. If you are interested in the code, it's here: https://github.com/yiwenl/Christmas_Experiment_2016/

So that's it. I hope you enjoyed the read, and I wish you a merry Xmas and a happy new year!



P.S. Some behind-the-scenes from the commits 😀