Unity Asset Store package: 360 panorama capture


VRCHIVE is a website (currently in alpha) for sharing static 360-degree panoramas and viewing them in VR. I was asked by VRCHIVE to create a Unity script to capture monoscopic 360-degree screenshots in-game (in both traditional and VR games) and upload them to the VRCHIVE website.

In addition to (or instead of) uploading, it can save the panoramas to disk as image files in equirectangular and/or cubemap projections. It is designed to capture high-resolution panoramas (typically 8192×4096) without causing FPS drops or other issues that are problematic in VR applications, and it can be used both by developers and by end users who purchase their games.
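To illustrate the relationship between the two output projections the package can write, here is a small sketch (not the package's actual code, which is a Unity C# script) of the standard math that maps an equirectangular pixel to a view direction, and a view direction to a cube face and face-local UV. A reprojection pass would run this per output pixel and sample the captured cubemap faces.

```python
import math

def equirect_dir(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit direction vector.

    Longitude spans -pi..pi across the image width, latitude spans
    +pi/2..-pi/2 down the image height (pixel centers at +0.5).
    """
    lon = (u + 0.5) / width * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (v + 0.5) / height * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

def dir_to_cubemap(d):
    """Pick the cube face and face-local UV (0..1) for a direction.

    The dominant axis of the direction selects the face; the other two
    components, divided by the dominant magnitude, give the face UV.
    """
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face = '+x' if x > 0 else '-x'
        sc, tc, ma = (-z, -y, ax) if x > 0 else (z, -y, ax)
    elif ay >= az:
        face = '+y' if y > 0 else '-y'
        sc, tc, ma = (x, z, ay) if y > 0 else (x, -z, ay)
    else:
        face = '+z' if z > 0 else '-z'
        sc, tc, ma = (x, -y, az) if z > 0 else (-x, -y, az)
    return face, (sc / ma + 1.0) / 2.0, (tc / ma + 1.0) / 2.0
```

For example, the straight-ahead direction `(0, 0, 1)` lands at the center of the `+z` face. In Unity itself the capture side of this is typically done with `Camera.RenderToCubemap`; the face/UV conventions above follow the common OpenGL cubemap layout and may need flipping to match a particular engine.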

Install from the Unity Asset Store (follow the included README to set up)


Other features including stereoscopic capture and GPU-based fast reprojection are planned for the future. Please let me know any questions or suggestions you have, or if you encounter problems!


3 thoughts on “Unity Asset Store package: 360 panorama capture”

  1. Do you see a way to make a sort of interactive cinematic experience by combining a series of 360 videos together, if they were wrapped within some sort of lightweight app layer that could take touch input based upon gaze regions? I would imagine that file sizes might be an issue for a fully interactive type of narrative. But since you’re diving into making videos out of Unity scenes, I wonder if this approach would open the doors to making cinematic VR experiences a bit more interactive.


    1. I think this is a good idea. I’ve imagined an experience in which you have a set of waypoints and travel between them using short video segments, then have a looping video for each waypoint. It’s a lot of file size, but the fidelity could be very high (even live action) at very low GPU requirements. You can also use green screen techniques to create 360 videos with alpha layers, which get superimposed on top of other videos – with proper stereoscopy this can be quite convincing, and if you additionally infer depth buffers (to get an RGBD video) you can even handle things like occlusion correctly. These ideas would work just as well in traditional monitor-based games, but interactive movie games haven’t really been explored or expanded much since the 90s – here is a good video on the history:


      1. Awesome video, and a great overview of the history of interactive movie games.
        Yeah, I think there’s a lot of potential there for some interesting high-fidelity VR experiences that are a blended mix of agency and narrative. It’s interesting to consider dynamically swapping out the green screen alpha layers with dynamic content. There’s a lot of potential there, and the biggest concern would be file sizes. So perhaps going the lower-fidelity route would help with maximizing the file compression and allow for longer and richer experiences. The best part could be that it might open up types of experiences on mobile that would otherwise only be possible on a desktop machine – especially if you’re able to get the stereoscopy working correctly without positional tracking and within the existing yaw- and pitch-only rotation constraints of existing 360 video.

