Found on SGI Octane with MXE graphics running IRIX 6.5.28f
When using SDL_GL_GetAttribute() to find out how many R/G/B bits were actually in use in 16-bit color mode in GLimp_SetMode(), I found 12/12/12 (i.e. true 36-bit color)! This really is a legal color mode on this machine (as reported by glxinfo). It was selected because the only RGB visual without alpha is the 36-bit one. This is true on all of the SGI systems that can run Quake3, so the problem applies to SGI/IRIX in general, not just this particular graphics board.
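For anyone who wants to reproduce the measurement outside the game, a minimal SDL 1.2 test along these lines shows the same thing (a hypothetical standalone program, not ioquake3 code):

/* Request 4 bits per channel the way GLimp_SetMode() does for 16-bit
   color, then ask SDL what the GLX layer actually picked.  On the SGI
   systems described here this prints 12/12/12. */
#include <stdio.h>
#include <SDL.h>

int main(void)
{
    int r = 0, g = 0, b = 0;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 4);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 4);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 4);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    if (SDL_SetVideoMode(640, 480, 16, SDL_OPENGL) == NULL) {
        SDL_Quit();
        return 1;
    }

    /* These report the sizes of the visual that was chosen,
       not the sizes that were requested. */
    SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &r);
    SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &g);
    SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE, &b);
    printf("chosen visual: R%d G%d B%d\n", r, g, b);

    SDL_Quit();
    return 0;
}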
The cause is in glXChooseVisual(), the underlying function SDL uses: when GLX_RED_SIZE (and likewise GREEN/BLUE) is some N > 0, it chooses the **LARGEST** available size that is >= N; only when N == 0 is the smallest size preferred. In the 16-bit case the request is R = 4, G = 4, B = 4, with A = 0 implicitly. The largest visual that satisfies those minimums is 12/12/12/0, which obviously isn't 16-bit color. The same thing happens with 24-bit color -- it selects the exact same visual.
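The same behavior can be seen at the GLX level with SDL out of the picture; here is a sketch of roughly what SDL's 16-bit request boils down to (plain Xlib/GLX, my own test code, not anything from SDL):

/* Minimum sizes of 4 and no alpha: per the glXChooseVisual() rules, a
   nonzero minimum prefers the LARGEST available component, so the deep
   12/12/12/0 visual wins. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    int attribs[] = {
        GLX_RGBA,
        GLX_RED_SIZE,   4,   /* "at least 4" -> largest available preferred */
        GLX_GREEN_SIZE, 4,
        GLX_BLUE_SIZE,  4,
        GLX_DOUBLEBUFFER,
        None
    };
    Display *dpy = XOpenDisplay(NULL);
    XVisualInfo *vi;
    int r, g, b, a;

    if (!dpy)
        return 1;

    vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) {
        XCloseDisplay(dpy);
        return 1;
    }

    glXGetConfig(dpy, vi, GLX_RED_SIZE,   &r);
    glXGetConfig(dpy, vi, GLX_GREEN_SIZE, &g);
    glXGetConfig(dpy, vi, GLX_BLUE_SIZE,  &b);
    glXGetConfig(dpy, vi, GLX_ALPHA_SIZE, &a);
    printf("chosen visual: R%d G%d B%d A%d\n", r, g, b, a);  /* 12/12/12/0 here */

    XFree(vi);
    XCloseDisplay(dpy);
    return 0;
}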
The fix isn't super clean, but works great. Right after this part in GLimp_SetMode():
sdlcolorbits = 4;
if (tcolorbits == 24)
    sdlcolorbits = 8;
Add:
#ifdef __sgi // Fix for SGIs grabbing too many bits of color
if (sdlcolorbits == 4)
    sdlcolorbits = 0; // Use minimum size for 16-bit color
// Need alpha or else SGIs choose 36+ bit RGB mode
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 1 );
#endif
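For context, the reason sdlcolorbits = 0 has any effect is that GLimp_SetMode() goes on to hand these values to SDL as the minimum per-channel sizes, roughly like this (paraphrased, not an exact copy of the source):

/* SDL forwards these minimums to glXChooseVisual(); with the #ifdef above,
   SGI builds ask for the smallest R/G/B plus at least 1 alpha bit, which
   steers selection away from the alpha-less 12/12/12 visual. */
SDL_GL_SetAttribute( SDL_GL_RED_SIZE, sdlcolorbits );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, sdlcolorbits );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, sdlcolorbits );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );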
With this small addition, the code correctly selects 16-bit and 32-bit color visuals, and the performance hit of running in true 36-bit color mode goes away. The colors also look, well, 16-bit again. Until SDL switches to the more modern method of choosing a visual (glXChooseFBConfig() followed by component-by-component matching to find an exact match where possible), this workaround will be required.
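For reference, that exact-match approach could look something like the sketch below (my own illustration of the idea, not SDL code; it assumes a GLX 1.3 capable server, and the helper name pick_exact_config is made up): let glXChooseFBConfig() return everything that meets the minimums, then scan for a config whose component sizes match exactly, falling back to the first entry otherwise.

#include <stddef.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

static GLXFBConfig pick_exact_config(Display *dpy, int screen,
                                     int red, int green, int blue, int alpha)
{
    const int minimums[] = {
        GLX_RENDER_TYPE,  GLX_RGBA_BIT,
        GLX_RED_SIZE,     red,
        GLX_GREEN_SIZE,   green,
        GLX_BLUE_SIZE,    blue,
        GLX_ALPHA_SIZE,   alpha,
        GLX_DOUBLEBUFFER, True,
        None
    };
    int count = 0, i;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, screen, minimums, &count);
    GLXFBConfig best;

    if (!configs)
        return NULL;
    if (count == 0) {
        XFree(configs);
        return NULL;
    }

    best = configs[0];  /* fallback: best "largest >= minimum" match */
    for (i = 0; i < count; i++) {
        int r, g, b, a;
        glXGetFBConfigAttrib(dpy, configs[i], GLX_RED_SIZE,   &r);
        glXGetFBConfigAttrib(dpy, configs[i], GLX_GREEN_SIZE, &g);
        glXGetFBConfigAttrib(dpy, configs[i], GLX_BLUE_SIZE,  &b);
        glXGetFBConfigAttrib(dpy, configs[i], GLX_ALPHA_SIZE, &a);
        if (r == red && g == green && b == blue && a == alpha) {
            best = configs[i];  /* exact match beats the default ordering */
            break;
        }
    }

    XFree(configs);
    return best;
}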