Bug 4244 - When r_colorbits == 32, GLimp_SetMode() uses wrong bits
Status: RESOLVED FIXED
Alias: None
Product: ioquake3
Classification: Unclassified
Component: Platform
Version: GIT MASTER
Hardware: All IRIX
Importance: P3 normal
Assignee: Zachary J. Slater
QA Contact: ioquake3 bugzilla mailing list
URL:
Depends on:
Blocks:
 
Reported: 2009-07-15 05:01 EDT by Patrick Baggett
Modified: 2009-09-14 23:06:53 EDT

See Also:



Description Patrick Baggett 2009-07-15 05:01:55 EDT
Noticed on SGI Octane with MXE graphics running IRIX 6.5.28.

When using the in-game menus to adjust graphics settings, the value of "r_colorbits" is set to either 16 or 32.

However, in code/sdl_glimp.c:GLimp_SetMode(), this snippet appears:

    if (!r_colorbits->value)
        colorbits = 24;
    else
        colorbits = r_colorbits->value;

Shortly after, the for() loop only handles the cases where colorbits is 16 or 24. On IRIX, this results in 32-bit color being treated as the "not 24-bit color" case, i.e. 16-bit color. Simply changing the above fragment to:

    if (!r_colorbits->value)
        colorbits = 24;
    else {
        colorbits = r_colorbits->value;
        if (colorbits == 32)
            colorbits = 24;
    }

...fixes the problem.
Comment 1 Ryan C. Gordon 2009-09-14 22:51:27 EDT
Fixed in svn revision #1610.

--ryan.
Comment 2 Patrick Baggett 2009-09-14 23:06:53 EDT
Thanks!

Patrick

(In reply to comment #1)
> Fixed in svn revision #1610.
> 
> --ryan.