******************************
Texture mapping PolyVox meshes
******************************

The PolyVox library is only concerned with operations on volume data (such as extracting a mesh from a volume) and deliberately avoids the issue of rendering any resulting polygon meshes. This means PolyVox is not tied to any particular graphics API or rendering engine, which makes it much easier to integrate PolyVox with existing technology, because in general a PolyVox mesh can be treated the same as any other mesh. However, the texturing of a PolyVox mesh is usually handled a little differently, so the purpose of this document is to provide some ideas about where to start with the process of texture mapping PolyVox meshes.

This document is aimed at readers in one of two positions:

1) You are trying to texture 'Minecraft-style' terrain with cubic blocks and a number of different materials.

2) You are trying to texture smooth terrain produced by the Marching Cubes (or similar) algorithm.

These are absolutely not the limits of PolyVox, and you can choose much more advanced rendering approaches if you wish. For example, in the past we have texture mapped a voxel Earth from a cube map and used an animated *procedural* texture (based on Perlin noise) for the magma at the centre of the Earth. However, if you are aiming for such advanced techniques then we assume you understand the basics in this document and have enough knowledge to expand the ideas yourself. But do feel free to drop by and ask questions on our forum.

Traditionally meshes are textured by providing a pair of UV texture coordinates for each vertex, and these UV coordinates determine which part of the texture map applies to each vertex. The process of texturing PolyVox meshes is more complex for a couple of reasons:

1) PolyVox does not provide UV coordinates for each vertex.

2) Voxel terrain (particularly Minecraft-style) often involves many more textures than the GPU can read at a time.

By reading this document you should learn how to work around the above problems.

Mapping textures to mesh geometry
=================================

The lack of UV coordinates means some creativity is required in order to apply texture maps to meshes. But before we get to that, we will first try to understand the rationale behind PolyVox not providing UV coordinates in the first place. This rationale is different for smooth voxel meshes vs cubic voxel meshes.

Rationale
---------

The problem with texturing smooth voxel meshes is simply that the geometry can get very complex, and it is not clear how the mapping between mesh geometry and a texture should be performed. In traditional heightmap-based terrain this relationship is obvious, as the texture map and heightmap simply line up directly. But for more complex shapes some form of 'UV unwrapping' is usually performed to define this relationship. This is usually done by an artist with the help of a 3D modelling package, and so is a semi-automatic process, but it is time consuming and driven by the artist's idea of what looks right for their particular scene. Even though fully automatic UV unwrapping is possible, it is usually prohibitively slow.

Even if such an unwrapping were possible in a reasonable timeframe, the next problem is that it would be invalidated as soon as the mesh changed. Enabling dynamic terrain manipulation is one of the appealing factors of voxel terrain, and if this use case were discarded then the user may as well just model their terrain in an existing 3D modelling package and texture it there. For these reasons we do not attempt to generate any UV coordinates for smooth voxel meshes.

The rationale in the cubic case is almost the opposite. For Minecraft-style terrain you want to simply line up an instance of a texture with each face of a cube, and generating the texture coordinates for this is very easy. In fact it's so easy that there's no point in doing it - the logic can instead be implemented in a shader, which in turn allows the size of each vertex to be much smaller.

Triplanar Texturing
-------------------

The most common approach to texture mapping smooth voxel terrain is to use *triplanar texturing*. We do not need to go into too much detail here as there are a number of other references available [references], but we will summarise the process.

The basic idea is to project a texture along all three main axes and blend between the three texture samples according to the surface normal. As an example, suppose that we wish to write a fragment shader to apply a single texture to our terrain, and that we have access to both the world space position of the fragment and also its normal. Note that your textures should be set to wrap, because the world space position will quickly go outside the bounds of 0.0-1.0. The world space position will need to have been passed through from earlier in the pipeline, while the normal can be computed using one of the approaches in the lighting (link) document. The shader code would then look something like this [footnote: the code is untested and is simplified compared to real world code. Hopefully it compiles, but if not it should still give you an idea of how it works]:

// Take the three texture samples
vec4 sampleX = texture2D(inputTexture, worldSpacePos.yz); // Project along x axis
vec4 sampleY = texture2D(inputTexture, worldSpacePos.xz); // Project along y axis
vec4 sampleZ = texture2D(inputTexture, worldSpacePos.xy); // Project along z axis

// Blend the samples according to the normal. Take the absolute value so that
// surfaces facing along negative axes are weighted correctly, and normalise
// so the weights sum to one.
vec3 blendWeights = abs(normal);
blendWeights /= (blendWeights.x + blendWeights.y + blendWeights.z);
vec4 blendedColour = sampleX * blendWeights.x + sampleY * blendWeights.y + sampleZ * blendWeights.z;

Note that this approach will lead to the texture repeating once every world unit, so in practice you may wish to scale the world space positions to make the texture appear the desired size. This technique can also be extended to work with normal mapping, though we won't go into the details here.

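The blend-weight arithmetic used above can be checked on the CPU. The following C sketch (the function name is our own, not part of PolyVox) computes normalised triplanar blend weights from a surface normal:

```c
#include <math.h>

/* Compute normalised triplanar blend weights from a surface normal.
   Taking the absolute value handles faces pointing along negative
   axes, and dividing by the sum makes the weights add up to one. */
void triplanarWeights(float nx, float ny, float nz, float weights[3])
{
    float ax = fabsf(nx), ay = fabsf(ny), az = fabsf(nz);
    float sum = ax + ay + az;
    weights[0] = ax / sum; /* weight for the x projection (yz plane) */
    weights[1] = ay / sum; /* weight for the y projection (xz plane) */
    weights[2] = az / sum; /* weight for the z projection (xy plane) */
}
```

A normal of (0, 1, 0) gives weights of (0, 1, 0), so only the y projection contributes, while a diagonal normal blends all three samples.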
This idea of triplanar texturing can also be applied to cubic meshes, though in some ways it can be considered even simpler. With cubic meshes the normal always points exactly along one of the main axes, so it is not necessary to sample the texture three times, nor to blend the results. Instead you can use conditional branching in the fragment shader to determine which pair of values from {x,y,z} should be used as the texture coordinates. Something like:

vec4 sample = vec4(0, 0, 0, 0); // We'll fill this in below

// Assume the normal is normalised.
if(abs(normal.x) > 0.9) // Pointing along the positive or negative x axis
{
    // Project onto the yz plane
    sample = texture2D(inputTexture, worldSpacePos.yz);
}

// Now similar logic for the other two axes.
.
.
.

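The axis selection can also be expressed outside of shader code. This C sketch (an illustration of the idea, not a PolyVox function) picks which coordinate pair to use as UVs from an axis-aligned normal:

```c
#include <math.h>

/* Given an axis-aligned face normal, return which world-space plane
   to project onto: 0 = yz plane (x-facing face), 1 = xz plane
   (y-facing face), 2 = xy plane (z-facing face). */
int projectionPlane(float nx, float ny, float nz)
{
    float ax = fabsf(nx), ay = fabsf(ny), az = fabsf(nz);
    if (ax > ay && ax > az) return 0;
    if (ay > az) return 1;
    return 2;
}
```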
You might also choose to sample a different texture for each of the axes, in order to apply a different texture to each face of your cubes. If so, you probably want to pack your different face textures together using an approach like those described later in this document for multiple material textures.

Using the material identifier
-----------------------------

So far we have assumed that only a single material is being used for the entire voxel world, but this is seldom the case. It is common to associate a particular material with each voxel so that it can represent (for example) rock, wood, sand or any other material as required. The usual approach is to store a simple integer identifier with each voxel, and then map this identifier to material properties within your application.

Both the CubicSurfaceExtractor and the MarchingCubesSurfaceExtractor understand the concept of a material being associated with a voxel, and they will take this into account when generating a mesh. Specifically, they will both copy the material identifier into the vertex data of the output mesh, so you can pass it through to your shaders and use it to affect the way the surface is rendered.

The following code snippet assumes that you have passed the material identifier to your shaders and that you can access it in the fragment shader (see the section 'Blending between materials' to understand the implications of the interpolation which occurs). It then chooses what colour to draw the polygon based on this identifier:

vec4 fragmentColour = vec4(1, 1, 1, 1); // We'll fill this in below

if(materialId < 0.5) // Avoid '==' when working with floats.
{
    fragmentColour = vec4(1, 0, 0, 1); // Draw material 0 as red.
}
else if(materialId < 1.5)
{
    fragmentColour = vec4(0, 1, 0, 1); // Draw material 1 as green.
}
else if(materialId < 2.5)
{
    fragmentColour = vec4(0, 0, 1, 1); // Draw material 2 as blue.
}

This is a very simple example, and such use of conditional branching within the shader may not be the best approach for your purposes, as it incurs some performance overhead and becomes unwieldy with a large number of materials. Other approaches include encoding a colour directly into the material identifier, or using the identifier as an index into a texture atlas or array.

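As a concrete illustration of encoding a colour directly into the identifier, the following C sketch packs 8 bits per channel into the low 24 bits of an integer id. The layout is an assumption of ours, not a PolyVox convention - any scheme works as long as the shader decodes it the same way:

```c
#include <stdint.h>

/* Pack an RGB colour into the low 24 bits of a material identifier.
   This particular layout is illustrative only. */
uint32_t packColourId(uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
}

/* Decode the colour again. The shader-side equivalent would use
   floor/mod arithmetic, or bitwise ops on newer hardware. */
void unpackColourId(uint32_t id, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)((id >> 16) & 0xFFu);
    *g = (uint8_t)((id >> 8) & 0xFFu);
    *b = (uint8_t)(id & 0xFFu);
}
```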
Note that PolyVox currently stores the material identifier for the vertex as a float, but this will probably change in the future to use the same type as is stored in the volume. It will then be up to you which type you pass to the GPU (older GPUs may not support integer values), but if you do use floats then watch out for precision issues and avoid equality comparisons.

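If the identifier does arrive as a float, rounding to the nearest integer is more robust than comparing with '=='. A minimal C sketch of the idea:

```c
/* Recover an integer material identifier from a float that may have
   picked up small precision errors on its way through the pipeline.
   Assumes identifiers are non-negative. */
int materialIdFromFloat(float materialId)
{
    return (int)(materialId + 0.5f);
}
```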
Blending between materials
--------------------------

An additional complication when working with smooth voxel terrain is that it is usually desirable to blend smoothly between adjacent voxels with different materials. This situation does not occur with cubic meshes because the texture is considered to be per-face instead of per-vertex, and PolyVox enforces this by ensuring that all the vertices of a given quad have the same material.

With a smooth mesh it is possible that each of the three vertices of a given triangle has a different material identifier (see figure below). If this is not explicitly handled then the graphics hardware will interpolate these material values across the face of the triangle. This is usually not desirable because if, for example, the three material identifiers are {1, 1, 8} then the interpolated values will pass through all identifiers between one and eight, which includes a number of completely unrelated materials. Fundamentally, the concept of interpolating between material identifiers does not make sense, because if we have 1='grass', 2='rock' and 3='sand' then it does not make sense to say rock is the average of grass and sand.

There are a couple of approaches we can adopt to combat this problem. One approach is to attach an alpha value to each vertex so that the corners of a triangle can optionally be faded out. If a triangle has the same material value at each vertex then we give it full alpha at each vertex and the triangle draws normally, but if it has a different material for each vertex then we duplicate the triangle three times (once for each material). Each new triangle should then use the same material at each vertex, this material being one of those from the original triangle. The alpha values of the vertices of the new triangles are set such that when the three triangles are drawn on top of each other with additive alpha blending, the desired smoothly shaded triangle results.

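The triangle duplication described above can be sketched as follows. This C code (the types and names are our own, for illustration) splits one multi-material triangle into three single-material triangles with the appropriate per-vertex alpha values:

```c
typedef struct {
    int material;   /* the single material used by this sub-triangle */
    float alpha[3]; /* per-vertex alpha values */
} SubTriangle;

/* Duplicate a triangle once per vertex material. Sub-triangle i uses
   the material of vertex i, with full alpha at that vertex and zero
   at the other two, so that additive blending of the three
   sub-triangles reproduces a smooth blend across the face. */
void splitTriangle(const int materials[3], SubTriangle out[3])
{
    for (int i = 0; i < 3; i++)
    {
        out[i].material = materials[i];
        for (int j = 0; j < 3; j++)
        {
            out[i].alpha[j] = (i == j) ? 1.0f : 0.0f;
        }
    }
}
```

Note that at every vertex the three alpha values sum to one, which is what makes the additive blend come out correctly.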
One drawback of this approach is that the mesh needs to be drawn with alpha blending enabled, which is both costly and also allows previously drawn geometry to show through. Therefore, before any alpha blended geometry is drawn, you also need to draw the triangle solidly in black (which in turn means one of your material identifiers needs to be reserved as solid black - we use material zero below). This whole process is rather difficult to explain, but hopefully this diagram of the inputs and outputs makes it clearer:

//NOTE - Actually this doesn't work. The black triangles need different blending so must be drawn separately. We should recommend first of all that the mesh is split into single/multi material parts, and then come on to the approaches for multimaterial handling. Also consider the issue of whether the rendering order of triangles is guaranteed... but does this matter if black is a separate pass? Also, what about drawing white triangles and then using multiplicative blending? No, this doesn't help because the white triangles would still need to be drawn separately, as multiplying by white is no better than adding black. Subtractive blending maybe? I don't think so... Custom blend process? Using alpha and colour separately? Check the options here.

//MaterialBlending diagram

//Add kers approach.

Both of these processes are a lot of work, and add costs such as increasing the amount of geometry being rendered, increasing the size of the vertex data, and adding the cost of alpha blending. However, it can be noted that most of the triangles in a given smooth mesh only use a single material, and so it may well be useful to split the input mesh into two parts. The first part would contain only those triangles which use a single material, whilst the second part would contain only those triangles which use two or three materials. This limits the amount of geometry to which we need to apply the complex operations described previously, but on the other hand it increases the batch count, as a single mesh is now rendered as two batches instead of one.

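Splitting the mesh along these lines amounts to a single pass over the index buffer. A C sketch of the idea (illustrative, not a PolyVox function):

```c
/* Partition a triangle list into single-material and multi-material
   groups, so the costly blending path is only applied where needed.
   'materials' holds one identifier per vertex; 'indices' holds three
   vertex indices per triangle. The output arrays receive triangle
   indices and must each be large enough for triangleCount entries. */
void partitionTriangles(const int *materials, const int *indices,
                        int triangleCount,
                        int *singleOut, int *singleCount,
                        int *multiOut, int *multiCount)
{
    *singleCount = 0;
    *multiCount = 0;
    for (int t = 0; t < triangleCount; t++)
    {
        int i0 = indices[t * 3], i1 = indices[t * 3 + 1], i2 = indices[t * 3 + 2];
        if (materials[i0] == materials[i1] && materials[i1] == materials[i2])
            singleOut[(*singleCount)++] = t;
        else
            multiOut[(*multiCount)++] = t;
    }
}
```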
Actual implementation of these material blending approaches is left as an exercise for the reader, though it is possible that in the future we will add some utility functions to PolyVox to assist with tasks such as splitting the mesh or adding the extra vertex attributes. Our test implementations have performed the mesh processing on the CPU before the mesh is uploaded to the graphics card, but there does seem to be a lot of potential for implementing these approaches in the geometry shader.

Storage of textures
===================

The other major challenge in texturing voxel based geometry is how we handle the large number of textures which such environments often require. As an example, a game like Minecraft has hundreds of different material types, each with its own texture. The traditional approach to mesh texturing is to bind textures to *texture units* on the GPU before rendering a batch, but even modern GPUs only allow between 16 and 64 textures to be bound at a time. In this section we discuss various solutions to overcoming this limitation.

Separate texture units
----------------------

Before we make things unnecessarily complicated, you should consider whether you actually need the hundreds of textures discussed earlier. If you only need a few textures then the simplest solution may indeed be to pass them in via different texture units. You can then choose between the textures using a series of if statements, or a switch statement if the material identifiers are integer values. Keep in mind that you may need to reserve some texture units for additional data such as normal maps or shadow maps.

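With integer identifiers the selection logic is just a switch. A C sketch of the mapping (the unit numbers and material names are made up; in a shader this would select between sampler uniforms instead):

```c
/* Map an integer material identifier to the index of the texture
   unit bound for it. The assignments here are purely illustrative. */
int textureUnitForMaterial(int materialId)
{
    switch (materialId)
    {
        case 0: return 0; /* e.g. grass */
        case 1: return 1; /* e.g. rock */
        case 2: return 2; /* e.g. sand */
        default: return 0; /* unknown materials fall back to unit 0 */
    }
}
```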
Splitting the mesh
------------------

Texture atlases
---------------

Volume slices
-------------

Texture arrays
--------------

Bindless rendering
------------------