Noticed on SGI Octane with MXE graphics running IRIX 6.5.28.
When using the in-game menus to adjust graphics settings, the value of "r_colorbits" is set to either 16 or 32.
However, in code/sdl_glimp.c:GLimp_SetMode(), this snippet appears:
if (!r_colorbits->value)
    colorbits = 24;
else
    colorbits = r_colorbits->value;
Shortly afterwards, the for() loop that follows only handles the cases of colorbits being 16 or 24. On IRIX, this results in 32-bit color being treated as the "not 24-bit color" case, i.e. 16-bit color. Simply changing the above fragment to:
if (!r_colorbits->value)
    colorbits = 24;
else
{
    colorbits = r_colorbits->value;
    if (colorbits == 32)
        colorbits = 24;
}
...fixes the problem.
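For illustration only, here is a minimal, self-contained sketch of the failure mode and the clamp. It is not the actual GLimp_SetMode() loop; the helper name per_channel_bits and the 24 -> 8 / otherwise -> 4 per-channel mapping are assumptions made for this example, based on the snippet quoted above.

#include <stdio.h>

/* Hypothetical helper mirroring the logic described above: map the
 * r_colorbits cvar value to the per-channel color depth requested.
 * Only 16 and 24 are distinguished, so without the clamp a value of
 * 32 falls into the "not 24" branch and requests 4 bits per channel. */
static int per_channel_bits(int colorbits)
{
    if (!colorbits)
        colorbits = 24;      /* default when the cvar is 0 */

    /* The proposed fix: treat a 32-bit request as 24-bit, since the
     * selection below only knows about 16 and 24. */
    if (colorbits == 32)
        colorbits = 24;

    return (colorbits == 24) ? 8 : 4;
}

int main(void)
{
    int values[] = { 0, 16, 24, 32 };
    int i;

    for (i = 0; i < 4; i++)
        printf("r_colorbits %2d -> %d bits per channel\n",
               values[i], per_channel_bits(values[i]));
    return 0;
}

With the clamp in place, a menu-selected value of 32 ends up requesting 8 bits per channel, the same as 24, instead of silently dropping to the 16-bit path.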