Tutorials


These tutorials are targeted at people with a reasonable knowledge of web programming (the HTML and JavaScript languages) and a basic knowledge of 3D web programming (the WebGL API).


Tutorial 6: Textures 2D

This tutorial introduces the SpiderGL wrapper for WebGL textures, showing a usage example of our library's Texture2D class.

View Demo | Download Source

In this lesson we will show how to apply textures to an object in the scene using SpiderGL.
In particular, we will use the methods of the Texture2D class of the WebGL library component to instantiate and then handle a plain texture, manually providing the renderer with its various mip map levels.
The final aim of this tutorial is to render a textured object (a stretched square) and animate it, so that you can observe the smooth transition between the associated mip map ("multum in parvo" mapping) levels, each one colored in a different way.

Managing textures implies several changes to the source code compared with that of the previous tutorial. As usual, all the interesting new SpiderGL code is placed in the "CanvasHandler.prototype" definition, where in this lesson you can find the classic basic routines that initialize the stage, draw the scene, and catch the user interactions with the HTML5 canvas.
More specifically, the modifications affect the shaders, the buffers and the texture specification in the "onInitialize" block, the texture attribute pointer setup and the binding of the created texture in the "onDraw" routine, and a few key-handling lines in the event handler methods.

As always, let's start in order, beginning with the structures at the top of the code listing, i.e. with the shaders. The following code blocks, which include the vertex and fragment shaders in addition to the shader program setup, actually bring only a few changes compared with what we have seen so far.
These little modifications are the ones classically required to pass a texture to the shaders (and so to the graphics card), and no SpiderGL calls are involved in this section of the program.
Anyhow, let's have a look at what is new, starting with the vertex shader: compared with the previous example, the attribute and varying variables "aTextureCoord" and "vTextureCoord" have been added to manage the texture coordinates (which will be stored in an appropriate SpiderGL vertex buffer) applied at each vertex, and naturally the corresponding attribute and varying variables related to the vertex colors have been removed.

var vsSource = "\
  precision highp float;                                 \n\
                                                         \n\
  attribute vec3 aPosition;                              \n\
  attribute vec2 aTextureCoord;                          \n\
                                                         \n\
  uniform   mat4  uModelViewProjectionMatrix;            \n\
                                                         \n\
  varying vec2 vTextureCoord;                            \n\
                                                         \n\
  void main(void) {                                      \n\
    vTextureCoord = aTextureCoord;                       \n\
                                                         \n\
    gl_Position =                                         \n\
      uModelViewProjectionMatrix * vec4(aPosition, 1.0); \n\
  }                                                      \n\
";

Next, in the fragment shader, we of course find the same varying variable for the texture coordinates (again in place of the old color varying variable); in addition, there is a new uniform, the two-dimensional texture sampler "uSampler", which, as is known, is the GLSL type dedicated to textures and will be used as input to the texture lookup function "texture2D" called in the "main" function (the function that accesses the image data to compute the color of every fragment passed to the shader):

var fsSource = "\
  precision highp float;                                 \n\
                                                         \n\
  varying vec2 vTextureCoord;                            \n\
                                                         \n\
  uniform sampler2D uSampler;                            \n\
                                                         \n\
  void main(void) {                                      \n\
    gl_FragColor = texture2D(uSampler, vTextureCoord);   \n\
  }                                                      \n\
";

Finally, in the section below, about the SpiderGL shader program initialization, there are a few more small changes, related to the new attribute indices and to the new uniform default values; they need no further explanation and close the discussion about the shader modifications.

var vShader = new SglVertexShader(gl, {source: vsSource});
var fShader = new SglFragmentShader(gl, {source: fsSource});

this.shaderProgram = new SglProgram(gl, {
  autoLink   : true, 
  shaders    : [vShader, fShader],
  attributes : {
    aPosition     : 0, 
    aTextureCoord : 1 
  },
  uniforms   : {
    uModelViewProjectionMatrix : SglMat4.identity(),
    uSampler                   : 0 			
  }
});

You can now pass to the initialization of our object's buffers: we will draw a simple square in the by now classic way, i.e. using a SpiderGL vertex buffer (based on a JavaScript array) with the 3D spatial positions of the four square vertices, and a SpiderGL index buffer (based on another JavaScript array) with the sequence of vertices that will constitute the primitives (triangles, as usual) which in turn will form our square (a sketch of these two buffers is shown a little further below).
In addition to these structures, you can find in the source code a further vertex buffer used to store the texture coordinates, called "squareVertexTexCoordBuffer". This buffer, based on the JavaScript array "squareTextureCoords", holds pairs of two-dimensional coordinates that define how the texture is laid out on the previously created geometry.

var squareTextureCoords = [
  1.0, 1.0,
  0.0, 1.0,
  1.0, 0.0,
  0.0, 0.0
];

this.squareVertexTexCoordBuffer = new SglVertexBuffer(gl, {
    data  : new Float32Array(squareTextureCoords),
    usage : gl.STATIC_DRAW
});

There is nothing really new in this buffer initialization; just remember that, as is customary, texture coordinates are mapped onto a square of side 1, with the origin of the X and Y axes in the bottom-left corner.
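For reference, the position and index buffers mentioned above are carried over from the previous tutorials and are not repeated in this lesson. A minimal sketch of how they could look follows; the exact coordinate values, the "squareVertexPositionBuffer" name and the use of the "SglIndexBuffer" class with the same option style as "SglVertexBuffer" are assumptions, not code taken from the demo source.

var squareVertices = [   // hypothetical square corners (x, y, z)
  -1.0, -1.0,  0.0,
   1.0, -1.0,  0.0,
  -1.0,  1.0,  0.0,
   1.0,  1.0,  0.0
];

this.squareVertexPositionBuffer = new SglVertexBuffer(gl, {
    data  : new Float32Array(squareVertices),
    usage : gl.STATIC_DRAW
});

var squareIndices = [ 0, 1, 2,  2, 1, 3 ];   // two triangles, six indices (matching the later drawElements count)

this.squareVertexIndexBuffer = new SglIndexBuffer(gl, {
    data  : new Uint16Array(squareIndices),
    usage : gl.STATIC_DRAW
});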

Well, with this done we can jump to the core of this tutorial, right after the index buffer declaration: the actual creation and instantiation of the texture.
The first thing we meet there is the setting of a variable ("levels", used to store the number of mip map levels that we will associate with our texture).
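The declaration of this variable is not repeated in the lesson text; given the 256x256 base size used later on (obtained as "1 << levels"), it presumably boils down to a one-liner like the following (the exact line is an assumption):

var levels = 8; // the mip map pyramid will go from 256x256 (level 0) down to 1x1 (level 8)

Right after this setting we find the texture instantiation itself: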

this.texture = new SglTexture2D(gl, {
  width                     : null,
  height                    : null,
  data                      : null,  	
  border                    : 0, 
  type                      : gl.UNSIGNED_BYTE, 
  magFilter                 : gl.LINEAR, 
  minFilter                 : gl.LINEAR_MIPMAP_LINEAR, 
  wrapS                     : gl.CLAMP_TO_EDGE, 
  wrapT                     : gl.CLAMP_TO_EDGE, 
  autoMipmap                : false,
  generateMipmap            : false,
  flipYPolicy               : true, 
  flipY                     : true, 
  premultiplyAlphaPolicy    : true, 
  premultiplyAlpha          : false, 
  colorspaceConversionPolicy: false, 
  colorspaceConversion      : false 
});

The above code lines show how to instantiate a simple texture with SpiderGL, using the appropriate constructor of the Texture2D class of the WebGL component (the "SglTexture2D" used in the code).
This constructor, a wrapper for 2D WebGL texture objects, takes as input the usual WebGL rendering context and a long list of optional parameters and loading functions (whose operation will not be explained here for brevity, but which is treated in depth in the online documentation).
Let us focus instead on the parameters that really matter for this tutorial. As mentioned at the beginning of this lesson, we want to apply to an object a texture whose filtering works with mip map levels. As the name itself implies, this kind of filter, often used to avoid aliasing effects during texture minification, operates with different levels of detail associated with the texture. These levels are usually generated automatically (remember that the side dimensions of each level are halved at every generation step), but in our case, to produce a more visible effect in the passage from one mip map level to another, we will create each of them manually, giving a different color to every level. For this reason, in the previous code block the parameter that sets the texture minification filter ("minFilter") is fixed to the constant value reserved for the mip map filter ("gl.LINEAR_MIPMAP_LINEAR"), and the parameters for the automatic generation of the mip map pyramid ("autoMipmap" and "generateMipmap") are both set to "false".
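Just to visualize this halving rule, the following throwaway snippet (not part of the tutorial code) prints the whole chain of level sizes for a 256x256 base, showing that such a texture owns nine levels in total, numbered from 0 to 8:

// Illustration only: list the mip map level sizes for a 256x256 base texture.
var size = 256;
for (var level = 0; size >= 1; ++level) {
  console.log("level " + level + ": " + size + "x" + size);
  size = size / 2;
}
// Prints: level 0: 256x256, level 1: 128x128, ..., level 8: 1x1 (nine images).

Keep this count in mind: we will have to provide one image, with its own color, for each of these levels.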
What we need now, therefore, is to generate the desired mip map level images (from level 0 at 256x256 down to level 8 at 1x1) and attach them to the dummy texture just instantiated (if you look carefully at the constructor above, you can in fact note that its first three basic parameters are set to "null", so that an empty texture is allocated).
To generate these colored levels, the first thing to do is to create a JavaScript array containing the nine RGB triplets with the color value we chose for each texture level:

var levelColors = [
  [ 255,   0,   0 ], //red
  [   0, 255,   0 ], //green
  [   0,   0, 255 ], //blue
  [ 255, 255,   0 ], //yellow
  [   0, 255, 255 ], //cyan
  [ 255,   0, 255 ], //magenta
  [ 255, 255, 255 ], //white
  [   0,   0,   0 ], //black
  [ 100, 100, 100 ]  //grey
];

The next step is to use the values in this JavaScript array to pack the texture image pixel data of each level.
Before that, however, another basic operation is necessary, shown below, which configures how pixel data is unpacked from memory (specifying the alignment requirement for the start of each pixel row; since our rows are made of 3-byte RGB pixels, their length is not necessarily a multiple of 4, so a 1-byte alignment is the safe choice):

gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);

With this done, you can proceed further by creating a JavaScript object containing the characteristics of each texture level we are going to create. Exploiting the flexibility of the JavaScript language, this object groups some of the SpiderGL texture parameters seen earlier, needed to define the different levels (such as the texture format, set to RGB for this example):

var texOpts = {
  internalFormat: gl.RGB,
  format        : gl.RGB,
  level         : 0,
  width         : 1 << levels,
  height        : 1 << levels
};

In the above initialization of the "texOpts" object we start describing the first level of the mip map pyramid (as "level" equal to 0 suggests), with the side dimensions of the image set (using the left shift operator) to 256. These last parameters are then updated dynamically for each level in the "for" loop of the code block below, where, besides the level number and texture size specification, the actual texture data is packed (as an array of unsigned bytes filled with the color values previously defined):

for (var i = 0; i <= levels; ++i) {
  texOpts.data =
    new Uint8Array(texOpts.width * texOpts.height * 3);
  for (var j = 0; j < texOpts.data.length; j += 3) {
    texOpts.data[j + 0] = levelColors[i][0];
    texOpts.data[j + 1] = levelColors[i][1];
    texOpts.data[j + 2] = levelColors[i][2];
  }
  this.texture.setImage(texOpts);

  texOpts.level++;
  texOpts.width /= 2;
  texOpts.height /= 2;
}

Note that, to attach each newly created level image to the dummy texture instantiated at the beginning, we make use of the "setImage" method, provided by our library for each SpiderGL texture object (this function, as the name suggests, sets a texture image, taking as input the image data and the appropriate parameters already seen).
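As a side note (this is not part of the tutorial code), if you did not need each level to carry its own color, you could upload only the 256x256 base image and let WebGL derive the rest of the pyramid through the standard "gl.generateMipmap" call, along these lines (the "baseLevelOpts" object is hypothetical, a texOpts-like structure holding the level 0 data; presumably the "generateMipmap" constructor parameter seen earlier serves a similar purpose):

this.texture.setImage(baseLevelOpts); // upload only the 256x256 level 0 image
this.texture.bind();                  // bind the texture to the TEXTURE_2D target...
gl.generateMipmap(gl.TEXTURE_2D);     // ...and let WebGL build all the remaining levels
this.texture.unbind();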

Well, at this point the SpiderGL texture is instantiated and, with it, the various mip map levels have been created.
The initialization stage is thus complete (the last three variable assignments can be skipped, because they are unchanged compared with the previous tutorial), and we can move on to the event handler routines:

onKeyDown: function (key) {
  if ((key == "A") || (key == "Z")) {
      this.ui.animateRate = 60;
  }
},

onKeyUp: function (key) {
  if (!this.ui.isKeyDown("A") && !this.ui.isKeyDown("Z")) {
      this.ui.animateRate = 0;
  }
},

onAnimate: function (dt) {
  if (this.ui.isKeyDown("A")) this.angle -= 45.0 * dt;
  if (this.ui.isKeyDown("Z")) this.angle += 45.0 * dt;
  this.ui.postDrawEvent();
},

What you see in the above code lines is a smart way to manage key events in an animated context: as the code clearly shows, the movements of the scene are tied to the "A" and "Z" keys, which, when pressed, set the user interface field "animateRate" to 60 calls per second and, when released, turn it back to 0.
So, once one of these keys is pressed, the "onAnimate" function is executed (at the selected frequency), the value of the "angle" variable is changed, and finally the "onDraw" routine is called.
A subtle point should be noted in the "onKeyUp" function, namely its "if" statement. Its condition, written this way, handles the release of the two keys correctly and smoothly even when they are pressed simultaneously. This behaviour would not have been possible by coding the same routine in this more direct way:

onKeyUp: function (key) {
  if ((key == "A") || (key == "Z")) {
      this.ui.animateRate = 0;
  }
},

This simple precaution, which may seem trivial here, is very important in applications with user-driven animations, because it allows several keys to be pressed while the action is in progress (so, in a 3D game for example, you can start running forward, then turn a corner and shoot without stopping, instead of having to stop running every time you turn a corner, which would be extremely irritating).

OK, this little digression ends the event handler analysis, and we can move on to the last routine, the one dedicated to the rendering.
The whole "onDraw" routine is practically the same as in the previous lesson, with a few small changes that are worth reporting here.
For example, among the model-view-projection matrix transformations, it should be observed that the change of the "angle" variable, driven by the key events, this time influences the model rotation around the x axis, so as to obtain its characteristic tilting movement.
Moreover, jumping to the vertex buffer attribute pointer setup, it can be noted that the pointer relative to the color buffer (present in the last lesson) has naturally been removed, and the one relative to the SpiderGL texture coordinate buffer ("squareVertexTexCoordBuffer") has been inserted, with the relevant parameters ("index" and "size") set accordingly:

this.squareVertexTexCoordBuffer.vertexAttribPointer({
  index     : 1, 
  size      : 2, 
  glType    : gl.FLOAT, 
  normalized: false, 
  stride    : 0, 
  offset    : 0, 
  enable    : true 
});
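For completeness, the companion call for the position buffer, unchanged from the previous tutorial and therefore not repeated in the lesson text, presumably looks like the sketch below (the "squareVertexPositionBuffer" name is an assumption; "index" 0 and "size" 3 match the "aPosition" attribute declared in the shader program setup):

// Position attribute: three floats per vertex, bound to attribute index 0 (aPosition).
this.squareVertexPositionBuffer.vertexAttribPointer({
  index     : 0,
  size      : 3,
  glType    : gl.FLOAT,
  normalized: false,
  stride    : 0,
  offset    : 0,
  enable    : true
});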

Finally, it should be emphasized that, when using textures, before performing the final "drawElements" call you must bind the desired texture with the dedicated method that SpiderGL provides for its texture objects and, of course, as shown in the following code, unbind the same texture at the end of its use (note that the "uSampler" uniform was set to 0 during the program setup, consistently with the texture being bound to the default texture unit 0):

this.texture.bind();

this.squareVertexIndexBuffer.drawElements({
  glMode : gl.TRIANGLES, 
  count  : 6,
  glType : gl.UNSIGNED_SHORT, 
  offset : 0 
});

this.texture.unbind();

Well, with these last small explanations this tutorial is finished. Now that you have learned how to use the SpiderGL Texture2D class to instantiate a plain texture, you can run the demo and see the actual effect of the linear interpolation between the various colored levels prepared for the mip map filter.
The next tutorial will deal with textures again, this time showing how to operate with a cube map texture defined with our library.