btadyna.blogg.se

Difference between OpenGL ES 2.0 and 3.0
Using RT2: storing the color data as regular floats, reading it back as a float texture, and outputting it to the default framebuffer.


Let's say I set up the FBO with two color attachments: the first (RT1) with internalformat GL_RGBA32UI, the second (RT2) with GL_RGBA32F. To simplify, I'll only focus on passing RGB color data between the two stages: first to the FBO's RTs, and then to a fullscreen quad handled by the default framebuffer. I have attempted to do so in three separate ways. Where it gets weird: packHalf2x16/unpackHalf2x16.

A texture attachment for the FBO is configured using glTexStorage2D (with target=GL_TEXTURE_2D, levels=1), which I assume is more correct to use here than glTexImage2D, as only the internalformat should matter. The configured texture from (3.) is then attached to the FBO's COLOR_ATTACHMENT using glFramebufferTexture2D.

Precision qualifiers are said to "aid code portability with OpenGL ES, and have no effect with regular OpenGL", but I find it difficult to understand what exactly highp, mediump and lowp are used for, and how they play together with the bit depth of the render targets. Firstly, I assume that the bit depth of render targets is determined and configured in the FBO, and that the precision qualifier automatically matches this, which makes me think that high, medium and low have some kind of relation to 32, 16 and 8 bits of depth. I have looked over the OpenGL ES 3.0 spec, and it isn't all that clear about this.

Is the common procedure: attempt to create an FBO with certain specifications, followed by a test for FBO completeness? If that fails: reduce the requirements and use alternate shaders that compensate for the reduced bit depth?


What is safe to assume about the sizes of these render targets? Can I simply assume they can have an internal format of GL_RGBA32F (four 32-bit float channels)? This seems to me to be crucial knowledge for shaders writing to the RTs. The OpenGL ES 3.0 spec says there is support for "four or more rendering targets", but makes no mention (that I could find) of what formats those render targets must support.


I'm writing a deferred shader targeting both OpenGL 4.3 and OpenGL ES 3.0, with the former behaving exactly as I'd expect, but the latter giving me issues whose source I fail to identify.

First, I'll describe my understanding/confusions regarding setting up MRT FBOs for GL 4.3 and ES 3.0, and hope someone is kind enough to correct any misconceptions.


I would very much have wished to ask a succinct question that allows a clear answer, but I fear there are too many minor things I don't fully understand regarding FBO initialization that I need to clear up.






