Friday:
meeting with the team. was intense but worth it. tim mcgraw was good; this place is throwing surprises every second.
to do:
apply to dartmouth
meet mueller and give him the books
read the 'fast algos' gaussian paper
computer graphics & related stuff
cg, animation, virtual reality, non photorealistic rendering, shading, & rendering
Friday, June 27, 2003
Thursday, June 26, 2003
Thursday:
talked to christophe... got some input, and have to get that Lefohn paper working... that will do it.
got 2 textures in with 2 tex coords .. so :)
let's go for matmul now... c'mon.
Wednesday, June 25, 2003
Wednesday morn:
plan to integrate the whole idea. find out more about NV_float_buffer... go beyond.
man, just got the sobel, smooth and gaussian smooth to work. it works like magic. God is truly great!! just PRAY man.
Tuesday, June 24, 2003
just got the TEXTURE_RECTANGLE_NV program working with multitexturing, so we can have a POT texture and an NPOT texture and both work jussst fine :)
NV_float_buffer limitations
There are several significant limitations on the use of floating-point
color buffers. First, floating-point color buffers do not support frame
buffer blending. Second, floating-point texture maps do not support
mipmapping or any texture filtering other than NEAREST. Third,
floating-point texture maps must be 2D, and must use the
NV_texture_rectangle extension.
NV_float_buffer
This extension has many uses. Some possible uses include:
(1) Multi-pass algorithms with arbitrary intermediate results that
don't have to be artificially forced into the range [0,1]. In
addition, intermediate results can be written without having to
worry about out-of-range values.
(2) Deferred shading algorithms where an expensive fragment program is
executed only after depth testing is fully complete. Instead, a
simple program is executed, which stores the parameters necessary
to produce a final result. After the entire scene is rendered, a
second pass is executed over the entire frame buffer to execute
the complex fragment program using the results written to the
floating-point color buffer in the first pass. This will save the
cost of applying complex fragment programs to fragments that will
not appear in the final image.
(3) Use floating-point texture maps to evaluate functions with
arbitrary ranges. Arbitrary functions with a finite domain can be
approximated using a texture map holding sample results and
piecewise linear approximation.
The NV_fragment_program extension provides a general computational model that supports floating-point numbers constrained only by the precision of the underlying data types.
To complement the extended range and precision available through fragment programs, this extension, NV_float_buffer, provides floating-point RGBA color buffers that can be used instead of conventional fixed-point RGBA color buffers.
A floating-point RGBA color buffer consists of one to four floating-point components stored in the 16- or 32-bit floating-point formats (fp16 or fp32) defined in the NV_half_float and NV_fragment_program extensions.
A floating-point color buffer can also be used as a texture map, either by reading back the contents and then using conventional TexImage calls, or by using the buffer directly via the ARB_render_texture extension.
Tuesday:
- reading 'NV30 opengl extensions' - mark kilgard
- 'using p-buffers for off-screen rendering' - chris wynn
- pbuffer program to rotate a cube
Monday:
- addition of 2 textures by texturing a rectangle with 2 matrices and then obtaining the result through the framebuffer
- meeting with tony chen
- according to discussion, finding more about pbuffer, NV_float_buffer and TEXTURE_RECTANGLE_NV and binding pbuffer as a texture.
NV_half_float: provides support for a 16-bit floating-point representation throughout OpenGL
NV_float_buffer: IEEE 32-bit floating point components for textures and frame buffers
Monday, June 23, 2003
Retrieving Data from a Pbuffer
• Copy-to-Texture via “shared textures”
• Use wglShareLists( hVisibleGLRC, hPbufferGLRC )
• Allows sharing of ALL display list and texture objects between rendering contexts.
• Call just once immediately after creating the Pbuffer.
• Don’t need if pbuffer uses same GLRC as app window.
• Bind to pbuffer
• Render to pbuffer
• glCopyTexSubImage2D();
• Bind to on-screen rendering surface
• Render frame
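putting those steps together, the per-frame flow might look roughly like this — a sketch, not runnable as-is: the handles and the two draw_* calls are placeholders created elsewhere, it needs a live WGL context, and error checking is omitted:

```c
/* Per-frame copy-to-texture flow, following the steps above. Assumes the
 * pbuffer and window share texture objects (wglShareLists, or one GLRC
 * for both surfaces). draw_texture_contents()/draw_scene_using() stand
 * in for app-specific rendering. */
void render_frame(HDC hdcWin, HDC hdcPbuf, HGLRC hglrc,
                  GLuint tex, int w, int h)
{
    /* 1. bind to the pbuffer and render the dynamic texture into it */
    wglMakeCurrent(hdcPbuf, hglrc);
    draw_texture_contents();

    /* 2. copy the pbuffer contents into the texture object */
    glBindTexture(GL_TEXTURE_2D, tex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, w, h);

    /* 3. bind back to the on-screen surface and render the frame */
    wglMakeCurrent(hdcWin, hglrc);
    draw_scene_using(tex);
}
```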
Destroying a Pbuffer (In Windows)
• 3 Step Process
1. Delete the rendering context
2. Release the pbuffer’s device context
3. Destroy the pbuffer
wglDeleteContext( hpbufglrc );
wglReleasePbufferDCARB( hbuf, hbufdc );
wglDestroyPbufferARB( hbuf );
Binding a Pbuffer (In Windows)
wglMakeCurrent( hdc, hglrc );
• Makes the pbuffer device context the current rendering target for the rendering context.
• Subsequent OpenGL primitives are rendered to the offscreen buffer.
Pbuffer Creation (In Windows)
• Quick Overview
1. Get a valid device context
HDC hdc = wglGetCurrentDC();
2. Choose a pixel format
Specify a set of minimum attributes
• Color, Depth, Stencil bits, etc.
• Can specify single- or double-buffered, just like a window.
• Will usually need only single buffer (save RAM!).
Then call wglChoosePixelFormatARB() (from WGL_ARB_pixel_format)
• Returns a list of formats which meet minimum requirements.
• fid = pick any format in the list.
3. Create the pbuffer
HPBUFFER hbuf = wglCreatePbufferARB( hdc, fid, w, h, attr );
“attr” is a list of other properties for your pbuffer.
4. Get the device context for the pbuffer
hdc = wglGetPbufferDCARB( hbuf );
5. Get a rendering context for the pbuffer:
• Either create a new one (pbuffer gets its own GL state!):
hglrc = wglCreateContext( hdc );
• Or use the current context:
hglrc = wglGetCurrentContext();
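the five steps might be strung together roughly like this — a sketch, not runnable as-is: the ARB entry points must first be fetched with wglGetProcAddress, the attribute lists are trimmed to a minimum, and error checks are omitted:

```c
/* Pbuffer creation, steps 1-5 above (WGL_ARB_pbuffer /
 * WGL_ARB_pixel_format). Attribute values are illustrative. */
int fid, nformats;
int fmtAttribs[] = {
    WGL_DRAW_TO_PBUFFER_ARB, GL_TRUE,   /* must be pbuffer-capable */
    WGL_COLOR_BITS_ARB,      24,
    WGL_DEPTH_BITS_ARB,      16,
    WGL_DOUBLE_BUFFER_ARB,   GL_FALSE,  /* single buffer saves RAM */
    0
};
int pbAttribs[] = { 0 };                /* no extra pbuffer properties */

HDC hdc = wglGetCurrentDC();                                   /* step 1 */
wglChoosePixelFormatARB(hdc, fmtAttribs, NULL, 1,
                        &fid, (UINT *)&nformats);              /* step 2 */
HPBUFFERARB hbuf   = wglCreatePbufferARB(hdc, fid, 512, 512,
                                         pbAttribs);           /* step 3 */
HDC         hbufdc = wglGetPbufferDCARB(hbuf);                 /* step 4 */
HGLRC       hglrc  = wglCreateContext(hbufdc);                 /* step 5 */
```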
Using Pbuffers
Windows:
• WGL_ARB_pixel_format extension
• WGL_ARB_pbuffer extension
Three Key Components – same as for a window
• Creating a pbuffer
• Binding a pbuffer
• Destroying a pbuffer
For On-Screen rendering surface: buffer dimensions and bit properties are constrained by the current display mode.
For pbuffer rendering surface: dimensions and bit properties are independent of the current display mode.
glCopyTexSubImage is the best way to get a rendered image to a texture.
glTexParameteri( GL_TEXTURE_2D, GL_GENERATE_MIPMAP_SGIS, GL_TRUE ); is the best way to generate mipmaps.
Off-Screen Rendering with Pixel Buffers
• We don’t always want to use the frame buffer to render our dynamic textures.
• Why not?
• Resolution is limited to the window resolution.
• Might need a different pixel format.
• Can require a lot of OpenGL state juggling.
• Overlapping windows can mess up copies.
• Can’t be used to render to texture (more later).
• Use a pbuffer instead!
