Discussion:
Exact meaning of wxGL_FLAGS?
ardi
2014-07-18 08:59:55 UTC
Permalink
While looking at OpenGL details in an app which is otherwise running fine,
I found something quite strange: the OpenGL view has a depth buffer of only
16 bits. I hadn't noticed it before because the view doesn't show a model
prone to depth-precision issues, so 16 bits are enough, but it's strange
anyway.

This is on OSX, but I believe it will be the same on other platforms. It
happened while asking for a canvas with double buffering, RGBA mode, an
alpha channel, a depth buffer, and no stencil. I checked that this iMac has
visuals with 8/8/8/8 RGBA, double buffering, and a 32-bit depth buffer.

So I took a look at how I define wxGL_FLAGS, and this seems to be the
cause, because, according to the docs:

"WX_GL_DEPTH_SIZE Specifies number of bits for Z-buffer (typically 0, 16 or
32)."

However, I've always understood DEPTH_SIZE with its UNIX (GLX) meaning.
According to the UNIX manpages:

"GLX_DEPTH_SIZE Must be followed by a nonnegative minimum size
specification. If this value is zero, visuals with no depth buffer are
preferred. Otherwise, the largest available depth buffer of at least the
minimum size is preferred."

So, obviously, I was requesting WX_GL_DEPTH_SIZE with just 1 bit, because
that's how I've always obtained the largest available depth buffer on the
system. On UNIX systems this always worked fine (SGI IRIX, Linux, etc.).

Is there any way to get, with wxWidgets, a visual with the largest depth
buffer that has at least 8/8/8/8 RGBA bits? Well, yes, I could write some
code calling IsDisplaySupported
<http://docs.wxwidgets.org/trunk/classwx_g_l_canvas.html#aea68f828d3673d1c4d4f1a8e27abbc90>()
in a loop, but I don't want to reinvent the wheel, and it's not clear when
to end the loop either, as some graphics cards have more bits than others,
and future hardware may have even more.
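Just to illustrate, the loop I'd rather avoid would look something like
this (only a sketch; the supported() predicate stands in for a call to
wxGLCanvas::IsDisplaySupported() with the full attribute list, and the
candidate sizes are a guess, which is exactly the problem):

```cpp
#include <cassert>
#include <vector>

// Probe a hard-coded list of depth sizes, largest first, and return the
// first one the display supports. In real code 'supported' would wrap
// wxGLCanvas::IsDisplaySupported() with an attribute list containing
// (WX_GL_DEPTH_SIZE, bits); here it is injected so the search logic can
// be shown on its own. Note the list has to be guessed in advance, so
// future hardware with bigger depth buffers would be missed.
template <typename Supported>
int LargestSupportedDepth(const std::vector<int>& candidates,
                          Supported supported)
{
    for (int bits : candidates)
        if (supported(bits))
            return bits;
    return -1; // no candidate supported at all
}
```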


ardi
--
Please read http://www.wxwidgets.org/support/mlhowto.htm before posting.

To unsubscribe, send email to wx-users+***@googlegroups.com
or visit http://groups.google.com/group/wx-users
Vadim Zeitlin
2014-07-18 10:38:16 UTC
Permalink
On Fri, 18 Jul 2014 01:59:55 -0700 (PDT) ardi wrote:

a> "WX_GL_DEPTH_SIZE Specifies number of bits for Z-buffer (typically 0, 16 or
a> 32)."
a>
a> But however, I've always understood DEPTH_SIZE with the UNIX (GLX) meaning.

I'm not sure about the details, but WX_GL_DEPTH_SIZE *is* exactly the same
as GLX_DEPTH_SIZE, i.e. whenever you have (WX_GL_DEPTH_SIZE, N) in
wxGLCanvas attributes, you end up with (GLX_DEPTH_SIZE, N) in the
corresponding glX() function call. So if you want to use GLX_DEPTH_SIZE of
1, just use 1.
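I.e. something like this should give you the GLX semantics "largest
available depth buffer of at least 1 bit" (just a sketch of the attribute
list, not tested):

```cpp
// wxGLCanvas attribute list; zero-terminated as usual.
int attribList[] = {
    WX_GL_RGBA,
    WX_GL_DOUBLEBUFFER,
    WX_GL_DEPTH_SIZE, 1,   // forwarded as (GLX_DEPTH_SIZE, 1) under X11
    0
};
```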

Of course, it will then also be used as the value for WGL_DEPTH_BITS_ARB
under Windows, and I don't know if that's a good idea there (the default
value is 16). But in any case, this is a purely OpenGL question; wxWidgets
does nothing here.

Regards,
VZ
--
TT-Solutions: wxWidgets consultancy and technical support
http://www.tt-solutions.com/
ardi
2014-07-18 20:11:19 UTC
Permalink
Post by Vadim Zeitlin
Of course, it will then be also used as a value for WGL_DEPTH_BITS_ARB
under Windows and I don't know if it's a good idea there (the default value
is 16). But in any case, this is a purely OpenGL question, wxWidgets does
nothing here.
Well, actually, wxWidgets does something, at least on OSX:

In src/cocoa/glcanvas.mm you can read this line:
data[p++] = NSOpenGLPFAMinimumPolicy; // make _SIZE tags behave more like GLX

However, I think that might be a mistake, because NSOpenGLPFAMinimumPolicy
returns a visual with the smallest buffer sizes (provided they are at least
the requested sizes). OTOH, according to Apple's docs,
NSOpenGLPFAMaximumPolicy returns the largest available buffers when you
request a nonzero size. AFAIK, this is the GLX behaviour.

But, although this would make OSX and GLX behave more consistently, I've no
idea about MSW. As you pointed out, what's the exact meaning of
WGL_DEPTH_BITS_ARB? I was unable to find it. The only thing I found, in the
OpenGL spec, is that WGL_DEPTH_BITS_ARB is matched by a "minimum criteria".
But what does "minimum criteria" mean? No idea.

ardi
ardi
2014-07-18 21:40:18 UTC
Permalink
[...]
First of all, I really meant src/osx/cocoa/glcanvas.mm (I mistyped the
path, using the old Cocoa location).

In addition to what I said in my previous post
regarding NSOpenGLPFAMinimumPolicy and NSOpenGLPFAMaximumPolicy, I also
believe NSOpenGLPFAColorSize and NSOpenGLPFAAccumSize are not treated
correctly. In glcanvas.mm they're treated as if they meant bits per
component. However, it seems they mean bits per pixel.

For example, just see this stackoverflow question:
http://stackoverflow.com/questions/15398419/meaning-of-nsopenglpfacolorsize-for-nsopenglpixelformat

Or this implementation of the accum buffer in SDL1:
http://hg.libsdl.org/SDL/diff/416158ec61a0/src/video/quartz/SDL_QuartzGL.m

Because of all this, I've tuned my src/osx/cocoa/glcanvas.mm this way:

a) Use NSOpenGLPFAMaximumPolicy instead of NSOpenGLPFAMinimumPolicy.

b) Create some local variables for min_red, min_green, min_blue,
min_accum_red, etc., so that the attribute values are temporarily stored in
them rather than written directly into the data[] array. Then, when all
attributes have been read, the NSOpenGLPFAColorSize and NSOpenGLPFAAccumSize
attributes are set to the sum of the corresponding components (only if the
sum is nonzero).

c) In addition to that, I also made it so that if WX_GL_RGBA is specified
but no values were given for min_red, min_green, and min_blue, then
NSOpenGLPFAColorSize is assigned a value of 1 (i.e., if the user wants RGB
without specifying a minimum number of bits, just allocate the biggest RGB
buffer available, since NSOpenGLPFAMaximumPolicy is in effect).

These changes make glcanvas.mm work as I'd expect on OSX (i.e., consistently
with GLX). However, I've no idea whether this improves or worsens
consistency with MSW.
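The component-summing logic from (b) and (c), detached from the actual
glcanvas.mm code, would be roughly this (the names here are mine, just to
show the idea, not the real variables in the file):

```cpp
#include <cassert>

// NSOpenGLPFAColorSize counts bits per *pixel*, so the per-component
// minima are summed; if RGBA was requested without any component sizes,
// 1 is used so that NSOpenGLPFAMaximumPolicy picks the largest available
// RGB buffer.
struct ColorRequest {
    int min_red;
    int min_green;
    int min_blue;
    bool rgba_requested;
};

// Value to pass with NSOpenGLPFAColorSize; 0 means "omit the attribute".
int CocoaColorSize(const ColorRequest& req)
{
    const int sum = req.min_red + req.min_green + req.min_blue;
    if (sum > 0)
        return sum;          // bits per pixel = sum of component minima
    if (req.rgba_requested)
        return 1;            // MaximumPolicy then maximizes the buffer
    return 0;                // attribute not emitted
}
```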

ardi
Stefan Csomor
2014-07-19 05:41:54 UTC
Permalink
Hi Ardi
Post by ardi
First of all, I really meant src/osx/cocoa/glcanvas.mm (I misspelled the
path, typing the old Cocoa path).
In addition to what I said in my previous post regarding
NSOpenGLPFAMinimumPolicy and NSOpenGLPFAMaximumPolicy, I also believe
NSOpenGLPFAColorSize and NSOpenGLPFAAccumSize are not treated correctly.
In glcanvas.mm they're treated like if they meant bits
per component. However, it seems they mean bits per pixel.
http://stackoverflow.com/questions/15398419/meaning-of-nsopenglpfacolorsize-for-nsopenglpixelformat
http://hg.libsdl.org/SDL/diff/416158ec61a0/src/video/quartz/SDL_QuartzGL.m
a) Use NSOpenGLPFAMaximumPolicy instead of NSOpenGLPFAMinimumPolicy.
b) Create some local variables for min_red, min_green, min_blue,
min_accum_red, etc... so that the attribs values are temporally stored in
them rather than written directly in the data[] array. Then, when all
attribs have been read, the NSOpenGLPFAColorSize
and NSOpenGLPFAAccumSize attribs are set as the sum of the proper
components (only if the sum is nonzero).
c) In addition to that, I also wrote that if WX_GL_RGBA is specified but
no values were given to min_red, min_green, and min_blue, then I assign
to NSOpenGLPFAColorSize a value of 1 (i.e.: if the user wants RGB without
saying a minimum number of bits,
just allocate the biggest RGB buffer available -because remember that
NSOpenGLPFAMaximumPolicy is in order- ).
These changes make glcanvas.mm work as I'd expect on OSX (ie: consistent
with GLX). However, I've no idea if this enhances or worsens consistency
with MSW.
thanks for getting to the core of this,

could you please open a bug report on Trac and attach your patch?

Thanks,

Stefan
Vadim Zeitlin
2014-07-19 16:05:37 UTC
Permalink
On Fri, 18 Jul 2014 13:11:19 -0700 (PDT) ardi wrote:

a> Well, actually, wxWidgets does something, at least on OSX:
a>
a> In src/cocoa/glcanvas.mm you can read this line:
a> data[p++] = NSOpenGLPFAMinimumPolicy; // make _SIZE tags behave more like GLX

Sorry, I somehow forgot that you were using wxOSX (although I should have
realized this from the previous thread...).

a> However, I think that might be a typo, because NSOpenGLPFAMinimumPolicy
a> returns a visual with the smallest size (provided it has at least the
a> requested size). OTOH, according to Apple docs, NSOpenGLPFAMaximumPolicy
a> returns the largest size buffers when you request a nonzero size. AFAIK,
a> this would be the GLX behaviour.

Yes, I think you're right.

a> But, although this would make OSX and GLX behave more consistent, I've no
a> idea about MSW. As you pointed out, what's the exact meaning
a> for WGL_DEPTH_BITS_ARB? I was unable to find it. The only thing I found, at
a> the OpenGL spec, is that WGL_DEPTH_BITS_ARB is specified by a "minimum
a> criteria". But what does "minimum criteria" mean? No idea.

I didn't test it, but from the description it would seem that it requests
at least that many bits for the depth (z-axis) buffer, doesn't it?

Regards,
VZ