Programming with DirectX : Shading and Surfaces - Implementing Texture Mapping (part 1) - 2D Texture Mapping Demo

1/21/2011 11:51:44 AM

2D Texture Mapping Demo

The demo starts by declaring a 2D vector in the vertex structure that will be used to hold the per-vertex texture coordinates. Also added to the global section is the ID3D10ShaderResourceView object, which will hold the texture, and an ID3D10EffectShaderResourceVariable object that will be used to allow the texture to be bound to a variable in the shader file. The remainder of the global section is comparable to the global section from the Transformations demo and is shown in Listing 1.

Listing 1. The Global Section from the Texture Mapping Demo

#pragma comment(lib, "d3d10.lib")
#pragma comment(lib, "d3dx10.lib")

#define WINDOW_NAME "Texture Mapping"
#define WINDOW_WIDTH 800
#define WINDOW_HEIGHT 600

// Global window handles.
HWND g_hwnd = NULL;

// Direct3D 10 objects.
ID3D10Device *g_d3dDevice = NULL;
IDXGISwapChain *g_swapChain = NULL;
ID3D10RenderTargetView *g_renderTargetView = NULL;

struct DX10Vertex
{
   D3DXVECTOR3 pos;
   D3DXVECTOR2 tex0;
};

ID3D10InputLayout *g_layout = NULL;
ID3D10Buffer *g_squareVB = NULL;
ID3D10ShaderResourceView *g_squareDecal = NULL;

ID3D10Effect *g_shader = NULL;
ID3D10EffectTechnique *g_textureMapTech = NULL;
ID3D10EffectShaderResourceVariable *g_decalEffectVar = NULL;
ID3D10EffectMatrixVariable *g_worldEffectVar = NULL;
ID3D10EffectMatrixVariable *g_viewEffectVar = NULL;
ID3D10EffectMatrixVariable *g_projEffectVar = NULL;

D3DXMATRIX g_worldMat, g_viewMat, g_projMat;

In the InitializeDemo() function we add code to load the texture from a file using the D3DX function D3DX10CreateShaderResourceViewFromFile(), which loads the texture and creates a shader resource view in one call. Access to the shader variable is obtained by calling GetVariableByName() and passing it the name of the texture variable in the shader file, which in this demo is decal. GetVariableByName() returns a base object on which we call AsShaderResource() to obtain a pointer of the correct object type. The first half of the InitializeDemo() function is shown in Listing 2, where only a few lines were added to support loading the texture image. Listing 3 shows the remainder of the InitializeDemo() function, where the surface was changed from a triangle to a square and each vertex now has a texture coordinate set attached to it.

Listing 2. The First Half of the InitializeDemo() Function
bool InitializeDemo()
{
   // Load the shader.
   DWORD shaderFlags = D3D10_SHADER_ENABLE_STRICTNESS;

#if defined( DEBUG ) || defined( _DEBUG )
   shaderFlags |= D3D10_SHADER_DEBUG;
#endif

   ID3D10Blob *errors = NULL;

   HRESULT hr = D3DX10CreateEffectFromFile(
      "TextureMapDemoEffects.fx", NULL, NULL, "fx_4_0",
      shaderFlags, 0, g_d3dDevice, NULL, NULL, &g_shader,
      &errors, NULL);

   if(errors != NULL)
   {
      MessageBox(NULL, (LPCSTR)errors->GetBufferPointer(),
         "Error in Shader!", MB_OK);
      errors->Release();
   }

   if(FAILED(hr))
      return false;

   g_textureMapTech = g_shader->GetTechniqueByName(
      "TextureMapping");

   g_worldEffectVar = g_shader->GetVariableByName(
      "World")->AsMatrix();

   g_viewEffectVar = g_shader->GetVariableByName(
      "View")->AsMatrix();

   g_projEffectVar = g_shader->GetVariableByName(
      "Projection")->AsMatrix();

   g_decalEffectVar = g_shader->GetVariableByName(
      "decal")->AsShaderResource();

   // Load the texture.
   hr = D3DX10CreateShaderResourceViewFromFile(g_d3dDevice,
      "", NULL, NULL, &g_squareDecal, NULL);

   if(FAILED(hr))
      return false;

Listing 3. The Second Half of the InitializeDemo() Function
bool InitializeDemo()
{
   ...

   // Create the geometry.
   D3D10_INPUT_ELEMENT_DESC layout[] =
   {
      { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,
        D3D10_INPUT_PER_VERTEX_DATA, 0 },
      { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, 0, 12,
        D3D10_INPUT_PER_VERTEX_DATA, 0 }
   };

   unsigned int numElements = sizeof(layout) / sizeof(layout[0]);
   D3D10_PASS_DESC passDesc;

   g_textureMapTech->GetPassByIndex(0)->GetDesc(&passDesc);

   hr = g_d3dDevice->CreateInputLayout(layout, numElements,
      passDesc.pIAInputSignature, passDesc.IAInputSignatureSize,
      &g_layout);

   if(FAILED(hr))
      return false;

   DX10Vertex vertices[] =
   {
      { D3DXVECTOR3( 0.5f,  0.5f, 1.5f), D3DXVECTOR2(1.0f, 0.0f) },
      { D3DXVECTOR3( 0.5f, -0.5f, 1.5f), D3DXVECTOR2(1.0f, 1.0f) },
      { D3DXVECTOR3(-0.5f, -0.5f, 1.5f), D3DXVECTOR2(0.0f, 1.0f) },
      { D3DXVECTOR3(-0.5f, -0.5f, 1.5f), D3DXVECTOR2(0.0f, 1.0f) },
      { D3DXVECTOR3(-0.5f,  0.5f, 1.5f), D3DXVECTOR2(0.0f, 0.0f) },
      { D3DXVECTOR3( 0.5f,  0.5f, 1.5f), D3DXVECTOR2(1.0f, 0.0f) }
   };

   // Create the vertex buffer.
   D3D10_BUFFER_DESC buffDesc;
   buffDesc.Usage = D3D10_USAGE_DEFAULT;
   buffDesc.ByteWidth = sizeof(DX10Vertex) * 6;
   buffDesc.BindFlags = D3D10_BIND_VERTEX_BUFFER;
   buffDesc.CPUAccessFlags = 0;
   buffDesc.MiscFlags = 0;

   D3D10_SUBRESOURCE_DATA resData;
   resData.pSysMem = vertices;

   hr = g_d3dDevice->CreateBuffer(&buffDesc, &resData, &g_squareVB);

   if(FAILED(hr))
      return false;

   // Set the shader matrix variables that won't change once here.
   D3DXMatrixIdentity(&g_worldMat);
   D3DXMatrixIdentity(&g_viewMat);

   g_viewEffectVar->SetMatrix((float*)&g_viewMat);
   g_projEffectVar->SetMatrix((float*)&g_projMat);

   return true;
}

In the RenderScene() function, one line of code was added to bind the texture object to the shader variable. This is done by calling SetResource() on the ID3D10EffectShaderResourceVariable object that is bound to the variable and passing it the shader resource view of the texture. The texture must be set before rendering occurs so that the shaders that use it have access to its image contents. The RenderScene() function is shown in Listing 4.

Listing 4. The RenderScene() Function from the Texture Mapping Demo
void RenderScene()
{
   float col[4] = { 0, 0, 0, 1 };

   g_d3dDevice->ClearRenderTargetView(g_renderTargetView, col);

   g_worldEffectVar->SetMatrix((float*)&g_worldMat);
   g_decalEffectVar->SetResource(g_squareDecal);

   unsigned int stride = sizeof(DX10Vertex);
   unsigned int offset = 0;

   g_d3dDevice->IASetInputLayout(g_layout);
   g_d3dDevice->IASetVertexBuffers(0, 1, &g_squareVB, &stride,
      &offset);
   g_d3dDevice->IASetPrimitiveTopology(
      D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

   D3D10_TECHNIQUE_DESC techDesc;
   g_textureMapTech->GetDesc(&techDesc);

   for(unsigned int i = 0; i < techDesc.Passes; i++)
   {
      g_textureMapTech->GetPassByIndex(i)->Apply(0);
      g_d3dDevice->Draw(6, 0);
   }

   g_swapChain->Present(0, 0);
}

In the Shutdown() function, to free the texture, we call the GetResource() function on the shader resource view to gain access to the ID3D10Texture2D object. With a pointer to this object, we can free its contents by calling its Release() function. Once the texture is released, we can release the shader resource view object as well. This is shown in Listing 5.

Listing 5. The Shutdown() Function from the Texture Mapping Demo
void Shutdown()
{
   if(g_d3dDevice) g_d3dDevice->ClearState();
   if(g_swapChain) g_swapChain->Release();
   if(g_renderTargetView) g_renderTargetView->Release();

   if(g_shader) g_shader->Release();
   if(g_layout) g_layout->Release();
   if(g_squareVB) g_squareVB->Release();

   if(g_squareDecal)
   {
      ID3D10Resource *pRes;
      g_squareDecal->GetResource(&pRes);
      pRes->Release();

      g_squareDecal->Release();
   }

   if(g_d3dDevice) g_d3dDevice->Release();
}

The last file to look at is the HLSL effect file for the Texture Mapping demo. In this file a 2D texture is defined in the global section so that the shaders can have access to it. This object has the HLSL data type Texture2D. A sampler state object of type SamplerState is used to sample a pixel from the texture. When creating a SamplerState object, you can specify the filtering mode, address mode (should it repeat, not repeat, etc.), border color, minimum and maximum level of detail in the image, and maximum anisotropy, which deals with filtering quality.

In the Texture Mapping demo we set the Filter state to MIN_MAG_MIP_LINEAR, which sets the min (surfaces that are minimized), mag (surfaces that are magnified), and mip map layers to linear. Linear interpolation applied to all three of these areas is known as trilinear filtering. The min filter is the filter used on the image when the image is drawn on a surface that is smaller than the original size of the image (i.e., the image is not drawn to scale). The mag filter is used when images are drawn on surfaces that are larger than the image’s original size and thus need to be magnified. Mip filtering occurs on the mip map levels. Remember, trilinear filtering is bilinear filtering with an additional interpolation taking place between the mip maps.

The other states that are set are AddressU and AddressV, both of which are set to Wrap. With Wrap, any texture coordinate above 1.0 or below 0.0 causes the texture to repeat across the surface.

The entire shader file used in the Texture Mapping demo is shown in Listing 6. The vertex shader passes the texture coordinates through to the output unchanged, because nothing needs to be done to prepare them for the pixel shader. In the pixel shader we call the Sample() function on the 2D texture object, passing it the sampler state and the vertex's texture coordinate. Keep in mind that the sampler state tells the graphics hardware what properties (e.g., filtering) you want applied to the texture when it is sampled. The result is a color value that we use as the output of the pixel shader. The final result is a textured surface, as shown in the screenshot in Figure 1.

Listing 6. The Texture Mapping Demo’s Shader File
Texture2D decal;

SamplerState DecalSampler
{
   Filter = MIN_MAG_MIP_LINEAR;
   AddressU = Wrap;
   AddressV = Wrap;
};

cbuffer cbChangesEveryFrame
{
   matrix World;
   matrix View;
};

cbuffer cbChangeOnResize
{
   matrix Projection;
};

struct VS_INPUT
{
   float4 Pos : POSITION;
   float2 Tex : TEXCOORD;
};

struct PS_INPUT
{
   float4 Pos : SV_POSITION;
   float2 Tex : TEXCOORD0;
};

PS_INPUT VS(VS_INPUT input)
{
   PS_INPUT output = (PS_INPUT)0;

   output.Pos = mul(input.Pos, World);
   output.Pos = mul(output.Pos, View);
   output.Pos = mul(output.Pos, Projection);

   output.Tex = input.Tex;

   return output;
}

float4 PS(PS_INPUT input) : SV_Target
{
   return decal.Sample(DecalSampler, input.Tex);
}

technique10 TextureMapping
{
   pass P0
   {
      SetVertexShader(CompileShader(vs_4_0, VS()));
      SetPixelShader(CompileShader(ps_4_0, PS()));
   }
}
Figure 1. Screenshot from the Texture Mapping demo.