Development notes, Reusable code

Compute shaders and Camera.Render in Unity: workaround for re-ordering

While working recently on the 1.3 release of my Panorama Capture script, I encountered a mysterious and difficult-to-reproduce issue on certain older GPUs, such as the NVIDIA GTX 675M: I made a sequence of calls to Camera.Render() and ComputeShader.Dispatch(), and Unity would blithely re-order them without regard for the read/write dependencies between them, resulting in very strange panorama images like this:

[Image: distorted panorama output]

The function of the compute shader was to take the result of the rendering and store it in a compute buffer, so that the RenderTexture could then be re-used. This is roughly what my code looked like:

renderTexture = new RenderTexture(width, height, 24); // depth buffer needed for camera rendering
ComputeBuffer computeBuffer = new ComputeBuffer(10 * width * height, 4); // one uint per pixel, 10 images
copyShader.SetTexture(kernelIdx, "source", renderTexture);
copyShader.SetBuffer(kernelIdx, "result", computeBuffer);
cam.targetTexture = renderTexture;
for (int i = 0; i < 10; i++) {
  // Set cam.transform.position/rotation based on i
  cam.Render();
  copyShader.SetInt("startIdx", i * width * height);
  copyShader.Dispatch(kernelIdx, (width  + threadsX - 1) / threadsX,
                                 (height + threadsY - 1) / threadsY, 1);
}
pixels = new uint[10 * width * height];
computeBuffer.GetData(pixels);

The goal is to render 10 images and copy them all into computeBuffer in order. But on some GPUs, the Render() and Dispatch() calls are done out-of-order – sometimes all the Render() calls are done before the Dispatch() calls, resulting in all 10 images in the computeBuffer being identical. Other times, the Dispatch() calls are done just one iteration early or late, shifting the resulting images up or down in the buffer, or resulting in duplication of certain images. I don’t know whether this is a Unity bug or a GPU memory model limitation, but I needed to find a workaround.
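
As an illustration of the kind of fix that is possible, here is a minimal sketch (not necessarily the workaround the released script ended up using) that forces a CPU/GPU synchronization point after each Dispatch(): ComputeBuffer.GetData() is a blocking call, so the GPU must finish the commands that write computeBuffer before the next iteration's Render() is even submitted. Reading the whole buffer every iteration is wasteful and is only meant to show the synchronization.

pixels = new uint[10 * width * height];
for (int i = 0; i < 10; i++) {
  // Set cam.transform.position/rotation based on i
  cam.Render();
  copyShader.SetInt("startIdx", i * width * height);
  copyShader.Dispatch(kernelIdx, (width  + threadsX - 1) / threadsX,
                                 (height + threadsY - 1) / threadsY, 1);
  // Blocking read-back: acts as a per-iteration sync point, at the cost of
  // a pipeline stall. The final GetData() call after the loop then becomes redundant.
  computeBuffer.GetData(pixels);
}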

Continue reading

Development notes, Reusable code

Unity Bug workaround: ReadPixels() and RenderTexture.antiAliasing > 1

This is a fairly nefarious bug in Unity that was reported at Issue ID 681089 ([Texture2D] Texture2D.ReadPixels() fails if RenderTexture has anti-aliasing set) and was causing some serious problems for my Panorama Capture plug-in, since it prevented me from enabling MSAA anti-aliasing. If I tried, it would cause my output renders to be solid black.
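
The full write-up is behind the link below; as a rough, hedged sketch of the sort of workaround that is possible (not necessarily the exact fix used in the plug-in), you can resolve the anti-aliased RenderTexture into a temporary non-anti-aliased one with Graphics.Blit() and call ReadPixels() on the resolved copy. Here msaaTex is just a stand-in name for whatever anti-aliased RenderTexture you rendered into:

// Resolve the MSAA texture into a temporary non-AA texture, then read from that.
RenderTexture resolved = RenderTexture.GetTemporary(msaaTex.width, msaaTex.height, 0);
Graphics.Blit(msaaTex, resolved); // resolves the MSAA samples during the copy
RenderTexture.active = resolved;
Texture2D tex = new Texture2D(msaaTex.width, msaaTex.height, TextureFormat.RGBA32, false);
tex.ReadPixels(new Rect(0, 0, msaaTex.width, msaaTex.height), 0, 0);
tex.Apply();
RenderTexture.active = null;
RenderTexture.ReleaseTemporary(resolved);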

Continue reading

Reusable code

Unity Asset Store package: 360 panorama capture


VRCHIVE is a website (currently in alpha) for sharing static 360-degree panoramas and viewing them in VR. I was asked by VRCHIVE to create a Unity script to capture monoscopic 360-degree screenshots in-game (in both traditional and VR games) and upload them to the VRCHIVE website.

In addition to or instead of uploading, it can save the panoramas to disk as image files with equirectangular and/or cubemap projections. It is designed to capture high-resolution panoramas (typically 8192×4096) without causing FPS drops or other issues that are problematic in VR applications, and can be used both by developers and by end-customers who purchase their games.
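
(As a side note on the equirectangular format: each pixel of, say, an 8192×4096 panorama corresponds to a direction on the sphere via longitude and latitude. The sketch below shows the general mapping for illustration only; it is not code from the plug-in, and axis conventions vary.)

// General equirectangular mapping: pixel (x, y) in a width x height panorama
// to a unit direction. Illustrative only -- not the plug-in's actual code.
static Vector3 EquirectangularToDirection(int x, int y, int width, int height)
{
    float longitude = ((x + 0.5f) / width) * 2.0f * Mathf.PI - Mathf.PI;  // -pi .. +pi
    float latitude  = Mathf.PI / 2.0f - ((y + 0.5f) / height) * Mathf.PI; // +pi/2 (top) .. -pi/2 (bottom)
    float cosLat = Mathf.Cos(latitude);
    return new Vector3(cosLat * Mathf.Sin(longitude),
                       Mathf.Sin(latitude),
                       cosLat * Mathf.Cos(longitude));
}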

Install from Unity Asset Store (follow included README to set up)

Continue reading

Reusable code

Unity 5.x package: fade screen in/out

This is a tiny package I whipped up for a friend who wanted to be able to fade the screen to black and then fade back in within a Unity 5.x application (this is particularly useful in VR, since tracking issues are invisible while the screen is faded).

Download: Mirror 1 | Mirror 2 | Mirror 3

Usage: Create an empty game object, assign the ScreenFader script to it, and adjust the parameters. Leave “Fade in” checked. At runtime, when you toggle the “Fade in” parameter, it will either fade out (when disabling it) or fade in (when enabling it). It can be toggled from scripts, from Playmaker, or via the editor.
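
For example, driving the fade from another script might look something like the sketch below. The field name used here (fadeIn) is an assumed stand-in for whatever public field backs the "Fade in" checkbox; check the ScreenFader script for the real name.

using UnityEngine;

public class FadeExample : MonoBehaviour
{
    public ScreenFader fader; // drag the game object that has the ScreenFader script here

    public void FadeOut() { fader.fadeIn = false; } // unchecking fades the screen to black
    public void FadeIn()  { fader.fadeIn = true;  } // re-checking fades back in
}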

Continue reading

Reusable code

Efficiently finding two vectors both orthogonal/normal to a given vector

[Diagram: a normal vector and two orthogonal vectors spanning its plane]

This is a code snippet I wrote earlier for some work with light field mesh parameterizations. I had a normal vector and wanted to find two orthogonal vectors spanning the plane that the vector is normal to, in order to project another vector into that plane. This is an underspecified problem: given one vector, there are many pairs of vectors that are orthogonal to it and to each other. This extra degree of freedom can be used to construct a numerically stable procedure that also uses fewer operations than computing a cross product; by branching on whichever component of the input has the largest magnitude, every division below is by a well-conditioned quantity.

In the C# code snippet below, n is the input vector and b1 and b2 are the output vectors.

if (Math.Abs(n.x) >= Math.Abs(n.y) &&
    Math.Abs(n.x) >= Math.Abs(n.z)) {
    b1.x = n.y / n.x; b1.y = -1.0; b1.z = 0.0;
    double d = n.x * n.x + n.y * n.y;
    b2.x = n.x * n.z / d; b2.y = n.y * n.z / d; b2.z = -1.0;
}
else if (Math.Abs(n.y) >= Math.Abs(n.x) &&
         Math.Abs(n.y) >= Math.Abs(n.z)) {
    b1.x = -1.0; b1.y = n.x / n.y; b1.z = 0.0;
    double d = n.x * n.x + n.y * n.y;
    b2.x = n.x * n.z / d; b2.y = n.y * n.z / d; b2.z = -1.0;
}
else { // Math.Abs(n.z) >= Math.Abs(n.x) &&
       // Math.Abs(n.z) >= Math.Abs(n.y)
    b1.x = -1.0; b1.y = 0.0; b1.z = n.x / n.z;
    double d = n.x * n.x + n.z * n.z;
    b2.x = n.x * n.y / d; b2.y = -1.0; b2.z = n.y * n.z / d;
}
// Optionally normalize b1 and b2 to unit length here
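
To sanity-check the snippet, verify that both outputs are orthogonal to n and to each other. A quick sketch (Vec3 here stands for whatever double-precision vector type the code above is written against):

static double Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static void CheckBasis(Vec3 n, Vec3 b1, Vec3 b2)
{
    // All three pairwise dot products should be ~0, up to rounding error.
    // The tolerance assumes n has components of roughly unit magnitude;
    // use a tolerance relative to |n| for very large or very small inputs.
    System.Diagnostics.Debug.Assert(Math.Abs(Dot(n, b1))  < 1e-12);
    System.Diagnostics.Debug.Assert(Math.Abs(Dot(n, b2))  < 1e-12);
    System.Diagnostics.Debug.Assert(Math.Abs(Dot(b1, b2)) < 1e-12);
}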

Continue reading