I attended SVVR 2015 this week, which was lots of fun! I saw panels and talks, visited the expo, had lunch with the indie devs, and got my first demo of the eye-tracking VR headset FOVE, which I subsequently backed in their Kickstarter. Just like last year I took lots of photos (Karl Krantz actually gave me a free press pass!) and I’ve started uploading them to a new eVRdayVR Flickr account. It’ll take a while to get through them; processing photos is time-consuming as hell, but when I’m done I’ll post them to reddit.
Meanwhile, I got a VR demo of my 360 panorama capture Unity plugin working. You walk around Viking Village in the DK2, and at any time you can press P to capture a 360 panorama still. It fades to black, captures the screenshots, fades back in, and then continues processing on the CPU for about a minute (converting to an equirectangular projection) before saving/uploading. It’s fully asynchronous, so you can keep playing while it’s processing. You can try out the demo here.
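The capture flow above can be sketched as a simple Unity coroutine. This is a minimal, hypothetical outline, not the plugin’s actual code; `CapturePanorama` and `ScreenFade` are stand-in names for the plugin component and a fade helper:

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: press P, fade out, grab the raw screenshots,
// fade back in, and let reprojection continue asynchronously.
public class PanoramaTrigger : MonoBehaviour
{
    public CapturePanorama capturer; // stand-in for the plugin component
    public ScreenFade fade;          // stand-in fade-to-black helper

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.P) && !capturer.Capturing)
            StartCoroutine(CaptureRoutine());
    }

    IEnumerator CaptureRoutine()
    {
        yield return fade.FadeOut();    // fade to black
        capturer.CaptureScreens();      // capture the screenshots
        yield return fade.FadeIn();     // player resumes immediately
        capturer.BeginReprojectAsync(); // ~1 min of equirectangular conversion off the critical path
    }
}
```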
I had an issue where, as soon as the reprojection processing finished, the demo would stutter severely for a couple of seconds. Eventually I pinned this down to the Texture2D.Apply() call, which I was making after executing all the SetPixel() calls on my final destination texture containing the reprojected image. On such a large texture (8192×4096 in this demo) this is a very expensive call, taking multiple frames at the very least. I worked around the issue by writing my result image into a byte array instead of a Texture2D, then converting that directly into a C# Bitmap via Bitmap.LockBits() (I had to import the System.Drawing.dll assembly; see this post). I can then save the result out to an image file on another thread (EncodeToJPEG() only works on the main thread in Unity, but Bitmap.Save() can run anywhere). This has made the experience a lot smoother, with no detectable lag during processing, even in the Rift.
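The workaround looks roughly like this. A hedged sketch, not the plugin’s exact code: it assumes the byte array already holds the reprojected image as 32-bpp ARGB pixels, and the method and parameter names here are illustrative:

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;
using System.Threading;

// Sketch: skip Texture2D entirely. Bulk-copy the result bytes into a
// System.Drawing Bitmap via LockBits, then save on a background thread.
void SaveEquirect(byte[] pixels, int width, int height, string path)
{
    var bitmap = new Bitmap(width, height, PixelFormat.Format32bppArgb);
    var rect = new Rectangle(0, 0, width, height);
    BitmapData data = bitmap.LockBits(rect, ImageLockMode.WriteOnly, bitmap.PixelFormat);
    Marshal.Copy(pixels, 0, data.Scan0, pixels.Length); // one bulk copy, no per-pixel calls
    bitmap.UnlockBits(data);

    // Unlike EncodeToJPEG(), Bitmap.Save() has no main-thread requirement.
    new Thread(() =>
    {
        bitmap.Save(path, ImageFormat.Jpeg);
        bitmap.Dispose();
    }).Start();
}
```

The key point is that LockBits exposes the bitmap’s backing memory directly, so the whole 8192×4096 image lands in one Marshal.Copy instead of an expensive texture upload on the render thread.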
I also used my plugin, along with some basic replay functionality, to produce a short 360 video of some Unity content. I recorded the position of the player in each FixedUpdate() for replay, and used ParticleSystem.Simulate() to advance the torches and other effects one frame at a time. I used the GPU reprojection, which is more resource-intensive but about 30 times faster. Even so, it still took about an hour to complete processing for this simple 17-second video – partly because I was capturing everything at 8192×4096 and downscaling to 3840×1920, and partly because there’s still a lot of room for optimization. But it’s still super cool to be able to share a complete play environment on YouTube, and to replay the 360 video in VR in Virtual Desktop. I think this is the beginning of how Let’s Plays of actual full games will be done in VR.
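The record-and-replay idea can be sketched like this. Again a hypothetical outline under my assumptions (class and field names are made up), but FixedUpdate() recording and ParticleSystem.Simulate() stepping are the two pieces described above:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: log the player's pose every FixedUpdate during play, then during
// offline capture restore one recorded frame at a time and step the
// particle systems deterministically by one fixed tick.
public class ReplayRecorder : MonoBehaviour
{
    public Transform player;
    public ParticleSystem[] torches; // torches and similar effects

    readonly List<Vector3> positions = new List<Vector3>();
    readonly List<Quaternion> rotations = new List<Quaternion>();

    void FixedUpdate()
    {
        positions.Add(player.position);
        rotations.Add(player.rotation);
    }

    // Called once per captured frame during offline rendering.
    public void PlayFrame(int i)
    {
        player.position = positions[i];
        player.rotation = rotations[i];
        foreach (var ps in torches)
            ps.Simulate(Time.fixedDeltaTime, withChildren: true, restart: false);
    }
}
```

Stepping the simulation manually is what lets the hour-long capture stay frame-accurate: the torches advance exactly one fixed timestep per captured frame, regardless of how long each frame takes to reproject.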