To use paletted textures you must first load the texture into texture RAM with the following sequence of operations:


int myFormat;

// myFormat is a parameter for the loading function, determined by
// the size of ShadeTable (which must be a power of 2) as follows:

switch (ShadeTable.size) {
case 1:
case 2: myFormat = GL_COLOR_INDEX1_EXT; break;
case 4: myFormat = GL_COLOR_INDEX2_EXT; break;
case 8:
case 16: myFormat = GL_COLOR_INDEX4_EXT; break;
case 32:
case 64:
case 128:
case 256: myFormat = GL_COLOR_INDEX8_EXT; break;
case 512:
case 1024:
case 2048:
case 4096: myFormat = GL_COLOR_INDEX12_EXT; break;
default: myFormat = GL_COLOR_INDEX16_EXT;
}

glTexImage2D(
    GL_TEXTURE_2D,      // target texture
    0,                  // detail level
    myFormat,           // internal format
    IndexMap.sizex,     // texture width...
    IndexMap.sizey,     // ...and height (both powers of 2)
    0,                  // border
    GL_COLOR_INDEX,     // format of the pixel data
    GL_UNSIGNED_BYTE,   // or GL_UNSIGNED_SHORT, depending on whether
                        // IndexMap.data is of type (unsigned char)*
                        // or (unsigned short)*
    IndexMap.data       // the color-index data
);
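
Note that glTexImage2D loads into the currently bound texture object, so the call above assumes a texture has already been created and bound. A minimal sketch of that setup, assuming OpenGL 1.1 texture objects are available (the name myTexture is hypothetical, not part of the original listing):

GLuint myTexture;

glGenTextures(1, &myTexture);              // allocate a texture name
glBindTexture(GL_TEXTURE_2D, myTexture);   // make it the current 2D texture

// paletted lookups keep exact index values, so nearest filtering is typical
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);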

From that point on, it is sufficient to update just the texture's palette once before each frame: the hardware renderer performs the remapping for you. Note that glColorTableEXT is a function pointer that is filled in while detecting the hardware capabilities.
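
A minimal sketch of that detection, assuming a Windows/wgl context and the PFNGLCOLORTABLEEXTPROC typedef from <GL/glext.h> (under GLX one would use glXGetProcAddressARB instead); strstr comes from <string.h>:

PFNGLCOLORTABLEEXTPROC glColorTableEXT = NULL;

// query the extension string before fetching the entry point
if (strstr((const char *)glGetString(GL_EXTENSIONS),
           "GL_EXT_paletted_texture") != NULL) {
    glColorTableEXT =
        (PFNGLCOLORTABLEEXTPROC)wglGetProcAddress("glColorTableEXT");
}
// glColorTableEXT stays NULL if the extension is not supported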


// ShadeTable.data, of type float*, contains the Red Green Blue Alpha components of the shade table (ordered as RGBARGBARGBA...).
// For each i, ShadeTable.data[i] ranges in [0.0 .. 1.0].

// ShadeTable.size must be a power of 2.

(*glColorTableEXT)(
    GL_TEXTURE_2D,      // GLenum target
    GL_RGBA8,           // GLenum internalFormat
    ShadeTable.size,    // GLsizei width
    GL_RGBA,            // GLenum format
    GL_FLOAT,           // GLenum type
    ShadeTable.data     // const GLvoid * data
);
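
As a usage sketch, the per-frame update then reduces to refilling the shade table and re-uploading only the palette (buildShadeTableForFrame is a hypothetical helper, not part of the original code):

// Hypothetical per-frame palette animation: the index map uploaded
// with glTexImage2D stays untouched in texture RAM.
buildShadeTableForFrame(&ShadeTable, frameNumber);   // hypothetical helper

(*glColorTableEXT)(GL_TEXTURE_2D, GL_RGBA8, ShadeTable.size,
                   GL_RGBA, GL_FLOAT, ShadeTable.data);

// ... draw the frame: the hardware remaps indices through the new palette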