Chapter 17. OpenGL and 3D Rendering

Table of Contents

Storyboard 3D Rendering Model
3D Rendering Fundamentals
The Scene Graph and Transformations
Material Support
Animation and Variable Support
Mapping FBX Animation data into meaningful structures
Support for Animation Takes
Troubleshooting 3D Problems
Working with OpenGL Shaders, Transforms and Compressed Textures
3D Transforms and Custom Shaders
Custom Shader Support
Compressed Textures

Storyboard 3D Rendering Model

3D Rendering Fundamentals

At the most basic level, rendering of 3D content is accomplished by using matrix and vector mathematics to transform points and directions between various coordinate spaces.

Understanding a few of the underlying concepts will help a designer make informed decisions when configuring 3D Model render extensions in Storyboard Designer. Below we describe the coordinate spaces applicable to 3D rendering in Storyboard and how they relate to the properties of the 3D Model render extension.

World space is a three-dimensional space that serves as the basis for defining all the other coordinate spaces. The locations of the camera and model in the 3D Model render extension properties are coordinates in World space. It is important to note that each 3D Model render extension instance references its own model data and is effectively a 2D portal into a distinct three-dimensional world.

In Storyboard, we define the default position and orientation of the camera to be at the origin of World space, looking down the World space negative z-axis. There are two primary camera modes, which determine how the Camera parameters define View space (also called Camera space).

In “Orbit” mode, the Azimuth and Elevation parameters first rotate View space around the World space y-axis and x-axis (respectively). The camera X, Y, Z parameters then position the camera in this rotated space. Defining View space with transformations in this order gives a useful result: if we set only the Z position of the camera, Azimuth and Elevation spin the camera around the World space origin, with the camera always looking toward the origin.

In “Fly” mode, the camera X, Y, Z parameters define the position of the origin (0,0,0) of View space within World space. Azimuth and Elevation then rotate View space around the y-axis and x-axis (respectively) of View space. This allows the camera to freely look “away” from the World space origin in any direction.

You may notice that the descriptions of Azimuth and Elevation above are in terms of the y-axis and x-axis, and not the z-axis. In order to simplify rotations, Storyboard does not allow the camera to be “rolled” around the View space z-axis.
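
To make the distinction between the two modes concrete, the following sketch builds a view matrix for each mode from 4x4 homogeneous transforms. This is a minimal illustration only, assuming a column-vector convention and the rotation ordering described above; the function and parameter names are ours and are not part of the Storyboard API.

    import numpy as np

    def rot_x(deg):
        r = np.radians(deg); c, s = np.cos(r), np.sin(r)
        return np.array([[1, 0,  0, 0],
                         [0, c, -s, 0],
                         [0, s,  c, 0],
                         [0, 0,  0, 1]])

    def rot_y(deg):
        r = np.radians(deg); c, s = np.cos(r), np.sin(r)
        return np.array([[ c, 0, s, 0],
                         [ 0, 1, 0, 0],
                         [-s, 0, c, 0],
                         [ 0, 0, 0, 1]])

    def translate(x, y, z):
        m = np.eye(4); m[:3, 3] = [x, y, z]
        return m

    def orbit_view(azimuth, elevation, x, y, z):
        # Orbit: View space is first rotated about the World y- and x-axes,
        # then X, Y, Z place the camera within that rotated space.  The view
        # matrix is the inverse of the camera's world transform.
        camera_to_world = rot_y(azimuth) @ rot_x(elevation) @ translate(x, y, z)
        return np.linalg.inv(camera_to_world)

    def fly_view(azimuth, elevation, x, y, z):
        # Fly: X, Y, Z place the origin of View space in World space, then
        # Azimuth and Elevation rotate about View space's own axes.
        camera_to_world = translate(x, y, z) @ rot_y(azimuth) @ rot_x(elevation)
        return np.linalg.inv(camera_to_world)

    # With only a Z offset, Orbit mode spins the camera around the World space
    # origin while it keeps looking toward the origin.
    view = orbit_view(azimuth=30.0, elevation=10.0, x=0.0, y=0.0, z=5.0)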

The 3D Model render extension takes as a parameter a single model file per instance. Storyboard supports .obj and .fbx files as 3D model input. FBX file support is provided by a closed-source library maintained by Autodesk, which supports only a limited number of platforms and architectures. To help mitigate this limitation, and to provide an opportunity for offline optimization of model data, FBX files are converted on import to SSG (Storyboard Scene Graph) files.

The Scene Graph and Transformations

We support a hierarchical scene graph for defining a 3D scene. We define the Node to be the basic building block. Currently, a node may be one of:

  • Group

  • Mesh

  • Light

All nodes inherit the transform (coordinate space) of their parent.

Groups define a set of child nodes, and a coordinate space which all of those children inhabit.

The complete order of transformations within a Group node is the following (a code sketch of this composition follows the list):

  • Inherited transform from parent

  • Local (bind) transform from scene graph

  • Deformation transforms

    • Translation

    • Rotation (around the X-axis, followed by the Y-axis and finally the Z-axis)

    • Scaling
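
As a rough illustration of this ordering, the sketch below composes a Group node's transform from the pieces listed above, reading the list as a left-to-right matrix composition with column vectors (so a vertex is scaled first and the parent transform is applied last). The helper names and the exact matrix conventions are assumptions made for the example, not Storyboard's internal representation.

    import numpy as np

    def translate(x, y, z):
        m = np.eye(4); m[:3, 3] = [x, y, z]
        return m

    def scale(sx, sy, sz):
        return np.diag([sx, sy, sz, 1.0])

    def rot(axis, deg):
        # Rotation about a single principal axis, angle in degrees.
        r = np.radians(deg); c, s = np.cos(r), np.sin(r)
        i, j = {'x': (1, 2), 'y': (2, 0), 'z': (0, 1)}[axis]
        m = np.eye(4)
        m[i, i] = c; m[i, j] = -s; m[j, i] = s; m[j, j] = c
        return m

    def group_transform(parent, local_bind, t, r, s):
        # Inherited parent transform, then the local (bind) transform, then
        # the deformation transforms: translation, rotation (X, Y, Z) and
        # finally scaling.
        return (parent @ local_bind
                @ translate(*t)
                @ rot('x', r[0]) @ rot('y', r[1]) @ rot('z', r[2])
                @ scale(*s))

    # Example: a group translated up one unit and rotated 45 degrees about Y.
    world = group_transform(np.eye(4), np.eye(4),
                            t=(0, 1, 0), r=(0, 45, 0), s=(1, 1, 1))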

Meshes and Lights are leaf nodes.

Meshes define:

  • Geometry

  • Material information related to portions of the geometry.

Lights may be one of two types:

  • Directional, best used for modelling distant, constant light sources, such as the sun

  • Point (or omni-directional) lights, best used for lights that emanate from a position, such as a lamp
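
The structure described in this section can be pictured as a small set of node types. The sketch below is purely illustrative; the class and field names are assumptions made for the example and are not Storyboard's internal scene graph representation.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class Node:
        # Every node inherits the transform (coordinate space) of its parent.
        name: str

    @dataclass
    class Group(Node):
        # A Group defines a coordinate space that all of its children inhabit.
        children: List["Node"] = field(default_factory=list)

    @dataclass
    class Mesh(Node):
        # Leaf node: geometry plus material information for portions of it.
        vertices: List[Tuple[float, float, float]] = field(default_factory=list)
        material_name: str = ""

    @dataclass
    class Light(Node):
        # Leaf node: "directional" (e.g. the sun) or "point" (e.g. a lamp).
        kind: str = "directional"

    scene = Group("root", children=[
        Group("car", children=[Mesh("body"), Mesh("FrontDriversSideDoor")]),
        Light("sun", kind="directional"),
    ])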

Material Support

We support the following attributes for a material applied to a section of geometry:

  • Ambient color

  • Diffuse color

  • Specular color (and a specular exponent)

  • Emissive color

  • Alpha (transparency: 0.0 is completely transparent, 1.0 is completely opaque)

We also support a diffuse texture map, which is currently used as a texture source for both diffuse and ambient color.
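
A material carrying the attributes listed above might be represented as in the following sketch. The field names and default values are illustrative assumptions, not Storyboard's data model; note that the single diffuse map is used as the texture source for both the diffuse and ambient terms, as described above.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    Color = Tuple[float, float, float]    # RGB, each component in 0.0..1.0

    @dataclass
    class Material:
        ambient: Color = (0.2, 0.2, 0.2)
        diffuse: Color = (0.8, 0.8, 0.8)
        specular: Color = (0.0, 0.0, 0.0)
        specular_exponent: float = 1.0    # sharpness of the specular highlight
        emissive: Color = (0.0, 0.0, 0.0)
        alpha: float = 1.0                # 0.0 fully transparent, 1.0 fully opaque
        diffuse_map: Optional[str] = None # texture path; also used for ambient

    red_plastic = Material(diffuse=(0.8, 0.1, 0.1),
                           specular=(1.0, 1.0, 1.0),
                           specular_exponent=32.0)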

We store the following additional information, but do not currently support rendering it:

  • Reflectivity coefficient

  • Separate ambient map

  • Specular map

  • Emissive map

  • Bump map

  • Normal map

  • Reflection map (expected to take the form of plane, cube or spherical mapping of reflection information)

Animation and Variable Support

Information on what is possible with the FBX file format is included below, but the bottom line is that almost all 3D modelling DCC tools dispense with most of this structure and bake the movements down into a single take/layer. For simplicity, Storyboard defines a 3D scene animation to have:

  • n Animation Channels, containing:

    • n Animation Curves

Channels are defined as a node/transform pair, such as "FrontDriversSideDoor"/RX (x rotation). These map to rows in the animation timeline in Designer.

Curves are defined by key frames, each of which includes a time and a value for the transform. These map to the endpoints of Animation Steps in Designer.
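
In code, this simplified animation structure could be pictured as follows. The names are illustrative assumptions; the point is the nesting of channels (one per node/transform pair, i.e. one timeline row) and curves (lists of key frame time/value pairs).

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class KeyFrame:
        time: float    # key frame time, in seconds
        value: float   # value of the transform at that time

    @dataclass
    class Curve:
        # Key frames map to the endpoints of Animation Steps in Designer.
        keys: List[KeyFrame] = field(default_factory=list)

    @dataclass
    class Channel:
        # A node/transform pair, e.g. "FrontDriversSideDoor" / "RX";
        # each channel maps to one row in the Designer animation timeline.
        node: str
        transform: str
        curves: List[Curve] = field(default_factory=list)

    door_open = [
        Channel("FrontDriversSideDoor", "RX",
                curves=[Curve(keys=[KeyFrame(0.0, 0.0), KeyFrame(1.5, 65.0)])]),
    ]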

The Storyboard variables that are automatically associated with nodes in the 3D model are generalized as the following:

  • Rotation: RX, RY, RZ

  • Scale: SX, SY, SZ

  • Translation: TX, TY, TZ

  • Hidden state: hidden

Mapping FBX Animation data into meaningful structures

Animation data specified in an FBX file for a scene takes the following structure:

  • n Animation Takes, containing:

    • n Animation Layers, containing:

      • n Animation Channels, containing:

        • n Animation Curves

Animation Takes (also called Stacks internally by FBX, but apparently nowhere else) define discrete animations that you might want to play. These map quite easily to our concept of animation clips in Storyboard Designer. Unfortunately, support for defining Animation Takes in many DCC tools is somewhat limited; see the note below. You can think of a Take in the film sense: "Action! ... do stuff, do stuff, do stuff... Cut!".

Animation Layers define a set of curves that you may want to play in parallel with another layer, allowing you to modulate the motion defined in another layer. An example would be a sphere moving along a path (layer 1) while bouncing up and down (layer 2). Layers do not map directly to anything in Storyboard; we would likely just import multiple layers of animation motion into a single clip.

Even though Animation Layers have little meaning to us, they are important because they are the container for a set of channels/curves.

Animation channels define precisely what we are deforming. These map to rows in our animation timeline. An example here would be "FrontDriversSideDoor, X rotation".

Each channel, as mentioned above, has a set of Curves, which map to the endpoints of Animation Steps in Storyboard. The curves are defined using key frames, each with a time and a value.

In reality, most DCC tools (except MotionBuilder) require any use of layers to be baked down into a single layer, and as mentioned above (and expanded on below), multiple takes are not natively supported either.

Support for Animation Takes

While FBX files can have multiple animation takes embedded in a single file, two of the most popular DCC tools, Maya and 3ds Max, do not ship with the functionality to export Animation Take data. These tools have a single animation timeline, and export the animation data as a single take.

Artists desiring to specify multiple animations relating to a single model or scene have a few options, but all of them essentially defer the definition of this data until further down the asset pipeline.

The typical pipeline workflows are:

  1. Export each separate animation into a separate FBX file. This approach has a number of practical drawbacks.

  2. Export modelling data to Autodesk MotionBuilder (previously called FilmBox, the origin of the FBX format) or another equivalent tool and use it to define the desired takes. These will import cleanly into separate takes.

  3. Max and Maya have a paid plugin (fairly inexpensive; $9 USD on TurboSquid at the time of writing) that allows the artist to define multiple takes from the Maya and Max animation timeline. These are fairly simple tools, just defining a portion of the timeline to be each take, but they are sufficient for most purposes.

  4. Define all "takes" on a single timeline (with spacers between the desired takes) and export it as is. Use tools from the target middleware (Storyboard Designer, in our case), if they exist, to "slice" the animation into separate animations.

In order to support workflow 4, we would have to support the concept of slicing/splitting the incoming animations. As of Storyboard 4.2, this functionality is not supported.