
Through the Looking Glass

I was working on our in-house scene graph library last week. We weren't getting very good surface reflection up close with cube maps, so I just added a planar reflection render-to-texture (FBO) option. So far so good, but I'm still wrestling with reflecting all the light sources (and shadow maps!). Still kinda cool. Runs real-time (2 lights, 2 reflection surfaces updated each frame) at around 30 fps with fragment lighting and 5x5 PCF shadows in a pixel shader (Quadro FX 5600). Gonna take a bit of wrangling to get it working on the Onyx.
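(A minimal sketch of the kind of FBO render-to-texture setup described above, using the EXT_framebuffer_object entry points of that era; the 512x512 size and names like reflectionTex are illustrative placeholders, not squeen's actual code.)

Code:
#include <GL/glew.h>   // assumes an extension loader (glewInit()) has set up EXT_framebuffer_object

// Create a colour texture plus depth renderbuffer and attach both to an FBO.
void createReflectionTarget(GLuint &fbo, GLuint &reflectionTex, GLuint &depthRb)
{
    // Colour texture the reflection pass renders into.
    glGenTextures(1, &reflectionTex);
    glBindTexture(GL_TEXTURE_2D, reflectionTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Depth renderbuffer so the reflected scene is depth tested.
    glGenRenderbuffersEXT(1, &depthRb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depthRb);
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24, 512, 512);

    // Attach both to the framebuffer object.
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, reflectionTex, 0);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                 GL_RENDERBUFFER_EXT, depthRb);

    // Per frame: bind the FBO, render the scene with the camera mirrored about
    // the reflection plane (with a clip plane at the surface), then bind
    // reflectionTex when drawing the reflective surface in the main pass.
}
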
Nice work! Will be interesting to see how it does on the Onyx :)

_________________
My hovercraft is full of eels.
IRIX Release 4.0.5 IP12 Version 06151813 System V
Copyright 1987-1992 Silicon Graphics, Inc.
All Rights Reserved.
squeen wrote:
so I just added a planar reflection render-to-texture (FBO) option.


Cool; I just started learning about FBOs last week and was pleasantly surprised at how much easier they are to use than p-buffers. I don't suppose IRIX has any support for them?

squeen wrote:
Gonna take a bit of wrangling to get it working on the Onyx.


Wasn't the traditional OpenGL method to use the stencil buffer and draw the scene and reflection separately?
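(For reference, a minimal sketch of that classic stencil-buffer technique; drawScene() and drawMirrorQuad() are hypothetical placeholders for the application's own draw calls, and the mirror is assumed to lie in the y = 0 plane.)

Code:
#include <GL/gl.h>

void drawScene();       // placeholder: renders the normal scene
void drawMirrorQuad();  // placeholder: renders the mirror polygon

void drawPlanarReflectionWithStencil()
{
    glEnable(GL_STENCIL_TEST);

    // Pass 1: write 1 into the stencil buffer wherever the mirror is,
    // without touching colour or depth.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glStencilFunc(GL_ALWAYS, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    drawMirrorQuad();

    // Pass 2: draw the scene mirrored about the plane, restricted to the
    // mirror's pixels.  Mirroring flips the winding, so flip face culling too.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);
    glStencilFunc(GL_EQUAL, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    glPushMatrix();
    glScalef(1.0f, -1.0f, 1.0f);
    glFrontFace(GL_CW);
    drawScene();
    glFrontFace(GL_CCW);
    glPopMatrix();
    glDisable(GL_STENCIL_TEST);

    // Pass 3: blend the mirror surface over the reflection, then draw the
    // unreflected scene as usual.
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    drawMirrorQuad();
    glDisable(GL_BLEND);
    drawScene();
}
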
squeen wrote:
I was working on our in-house scene graph library last week. We weren't getting very good surface reflection up close with cube maps, so I just added a planar reflection render-to-texture (FBO) option. So far so good, but I'm still wrestling with reflecting all the light sources (and shadow maps!).

No idea whether you could use it - but take a look at http://www.radiance-online.org/ anyhow - might just be the thing you are after.

_________________
You are not expected to understand this.
The Onyx will only have pbuffers and will suffer the hit of reading back (copying) to texture. Fortunately, IR is still very fast with pixel read. We'll see!
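(For what it's worth, a minimal sketch of that copy-to-texture fallback; reflectionTex and the 512x512 size are placeholders. After the reflection pass has been rendered into the pbuffer, the result is pulled into the texture with glCopyTexSubImage2D.)

Code:
#include <GL/gl.h>

void copyReflectionToTexture(GLuint reflectionTex)
{
    // Called after the reflection pass has been rendered into the
    // (p)buffer that is currently the read buffer.
    glBindTexture(GL_TEXTURE_2D, reflectionTex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0,
                        0, 0,      // destination offset within the texture
                        0, 0,      // lower-left corner of the source region
                        512, 512); // width/height of the copied region
}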

I've looked at Radiance a bunch of times, but we are going for hardware-accelerated (real-time) rendering. Also, I have an obsession with writing my own applications as part of the learning process.

It's been a crappy month at work. Thanks for letting me share one of the lighter moments.
squeen wrote:
It's been a crappy month at work. Thanks for letting me share one of the lighter moments.


And thank you for sharing - I enjoy these peeks at your various graphics projects.

_________________
My hovercraft is full of eels.
IRIX Release 4.0.5 IP12 Version 06151813 System V
Copyright 1987-1992 Silicon Graphics, Inc.
All Rights Reserved.
squeen wrote:
I've looked at Radiance a bunch of times, but we are going for hardware-accelerated (real-time) rendering. Also, I have an obsession with writing my own applications as part of the learning process.

Point taken. But does that mean you did a lot of Radiance scripting before?

_________________
You are not expected to understand this.
Hey, that looks really nice. If you can limit your scenario so it fits, it will be a good solution.

For everything else, only real-time ray tracing will do. But the requirements for coding it yourself are completely different:

- SIMD coding (adieu, readability)
- acceleration structures (a balance between updatability, memory usage, and speed; see the sketch after this list)
- fast materials (Phong is easy, texture filtering sucks)
- reading and understanding all the newest papers on the net (ray tracing is still a very hot research topic)
- performant multithreading
- dealing with quirky compilers (all of them suck, each has its own special areas where it sucks most)
- clustering (8 cores is simply not enough for a typical 1280x1024 resolution)
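(As a taste of the acceleration-structure point above, here is a minimal scalar sketch of the ray/bounding-box slab test that sits in the inner loop of most BVH traversals; real implementations run the same arithmetic on 4- or 16-wide SIMD packets. The types and names are illustrative only.)

Code:
#include <algorithm>

struct Ray  { float org[3]; float invDir[3]; };   // invDir = 1 / direction, precomputed
struct AABB { float lo[3];  float hi[3]; };

// Returns true if the ray hits the box within [0, tMax].
bool hitAABB(const Ray& r, const AABB& b, float tMax)
{
    float tNear = 0.0f, tFar = tMax;
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (b.lo[axis] - r.org[axis]) * r.invDir[axis];
        float t1 = (b.hi[axis] - r.org[axis]) * r.invDir[axis];
        if (t0 > t1) std::swap(t0, t1);           // handle negative ray directions
        tNear = std::max(tNear, t0);
        tFar  = std::min(tFar,  t1);
        if (tNear > tFar) return false;           // slab intervals do not overlap
    }
    return true;
}
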

Matthias

_________________
Life is what happens while we are making other plans
nekonoko wrote:
Nice work! Will be interesting to see how it does on the Onyx :)


Indeed, looking forward to seeing this as well :D

_________________
r-a-c.de
Since there's nothing in the sample pic which couldn't be done with Radiance, I'd like to learn why you dismissed Radiance for your job.

_________________
You are not expected to understand this.
Oskar45 wrote:
Since there's nothing in the sample pic which couldn't be done with Radiance, I'd like to learn why you dismissed Radiance for your job.

I think he answered that - he didn't think Radiance would work real-time. After my quick glance at the site I didn't think it was really aimed at that either. But if you know someone running it at 30fps, that might be interesting to a lot of people. I didn't really see that in my look at their archives, but didn't try too hard either.
dc_v01 wrote:
Oskar45 wrote:
Since there's nothing in the sample pic which couldn't be done with Radiance, I'd like to learn why you dismissed Radiance for your job.
But if you know someone running it at 30fps, that might be interesting to a lot of people.

I myself can't answer that. The best way to find out would be to simply post to their mailing list. Greg Ward - the author of Radiance - is quite a helpful guy [I've corresponded with him occasionally since the first draft of his book], and I'm sure he would be more than willing to answer any concerns in this regard and supply hints for speeding things up...

_________________
You are not expected to understand this.
I've thought about talking to Greg Ward about getting real physics-based lighting modeling into our system.
Yes, my scene graph uses OpenGL, so it's hardware rendering. Radiance (I think) is pure CPU software, which would be many times slower. With the advent of shaders, GPU radiosity is becoming a reality. Brombear alludes to all the sticky areas in his post above (but radiosity != ray tracing, as I understand the terms).

When I started, I wanted real-time interaction (which we have). With the pressure to produce more physically accurate lighting results (i.e. exactly what a digital video camera will see in a given environment), it's hard to keep the frame rate up. But that struggle is the essential fun and challenge of it all, right?

Part of my aversion to Radiance is not its capabilities. I have a pathological need to write my own software as a means of learning (and controlling) the intricacies of a problem.

Sometimes, something new and exciting pops out. Other times I just reinvent the wheel. Que Sera Sera. :)
You might be able to use Radiance to pre-compute the global illumination and make a bunch of ambient-light texture maps for your OpenGL render. Many video games use this technique; I think they call it "texture baking". The lighting is static but it looks quite realistic since you get bounce and more detailed shading.
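(A minimal sketch of how such a baked light map might be applied at draw time with plain multitexturing, assuming GL 1.3-style glActiveTexture; baseTex, lightmapTex, and the function name are placeholders.)

Code:
#include <GL/gl.h>

// Unit 0 carries the ordinary diffuse texture, unit 1 the precomputed
// lighting; MODULATE multiplies the two at rasterisation time.
void bindBakedLighting(GLuint baseTex, GLuint lightmapTex)
{
    glActiveTexture(GL_TEXTURE0);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, baseTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

    glActiveTexture(GL_TEXTURE1);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, lightmapTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    // The geometry then needs a second set of texture coordinates (the
    // lightmap UVs produced by the baking step), e.g. via
    // glMultiTexCoord2f(GL_TEXTURE1, u, v).
}
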
In GPU Gems 2 this is called "ambient occlusion", and though it's really a hack (i.e. ambient is a stand-in for real radiosity), it's next on the list. Doing as you suggest would be better still--using a true radiosity application for precomputed true diffuse.

Here's another screenshot from the real-time app. This scene runs at only 6 fps. (It doesn't have the new planar reflection maps in it yet, just bump maps, dynamic cube maps, and shadow maps.)

Attachment:
hst.jpg
No, but thanks. I'm a guidance system designer and analyst, not an animator. The work I've done with graphics has been to support mission design, but the base models are typically shared across a number of people---and they usually require a lot of clean-up, since it seems most applications are concerned primarily with import and pay little heed to a clean export.

The ray tracer is a testing tool I built (synthetic video) for a navigation experiment. I just can't seem to stop playing with it. :)
squeen wrote:
No, but thanks.


You're welcome, be my guest! ;) Your modelling/rendering looks superb!

squeen wrote:
I'm a guidance system designer and analyst, not an animator.


Oh, sorry. After so many years here, I was always a bit too "shy" to ask! :)

squeen wrote:
The work I've done with graphics has been to support mission design, but the base models are typically shared across a number of people


Sure, I understand. So I guess your models are oriented more toward CAD-type accuracy than visual animation models, since they probably have to respect real physical rules, reflect facts from the missions, etc.

squeen wrote:
---and they usually require a lot of clean-up since it seems most applications are concerned primarily with import and pay little heed to a clean export.


Good point. I know what you mean. For example, I discovered many weird things trying to share models between Blender and QCAD. For many months now I've been trying to improve my skills at CAD modelling, but it seems I feel more comfortable with classic animation modellers than with strictly CAD programs... more exactly, BlenderCAD... of course my projects/models are about a thousand billion times simpler than yours! :)

But sometimes things get more complex, and then BlenderCAD just doesn't cut it. It takes too much time to get a model done respecting real scale, dimensions, etc. So for now I'm trying ArtOfIllusion, and it seems a very fast tool to work with. Kind of a kitchen sink for really fast modelling oriented toward producing G-Code, or sharing with *.obj (Wavefront) capable apps. I designed my first CNC router using BlenderCAD, but I'm doing my whole second machine with ArtOfIllusion, then exporting/importing the model to/from BlenderCAD for some fine touch-up. (In fact I'm currently trying to figure out a few things, like how to clean up CAD blueprints for printing... you're welcome to help! :) )

squeen wrote:
The ray tracer is a testing tool I built (synthetic video) for a navigation experiment. I just can't seem to stop playing with it. :)


Well... you make the programmer in me want to emerge and jump back into graphics! You know, it's been a long, long time since my last affairs with OpenGL, MIPSPRO, etc. If I were you, I would chain my left leg to that... monster visualization center! :)
Have good fun. All the best,
Diego

_________________
Octane / Dual Head

http://twitter.com/GeekTronixShop
I liked this write-up (and no, that is not me in the picture). :)

http://www.hec.nasa.gov/news/features/2 ... 72809.html
Hi Squeen,

Very cool ... a good thing they forgot to sum up how much energy was used to calculate those images :mrgreen:

Are you continuing the hunt for faster rays and pixels? You could try out that Caustic thingy or play around with OptiX ...

Regards

Matthias

_________________
Life is what happens while we are making other plans
For the navigation algorithm I developed, we are looking at hardware acceleration of the acquisition search, but for the path tracer the primary focus is physical accuracy, not real-time (the supercomputer cluster sure comes in handy!). What I'd like to improve there near-term includes the following:

1) spectral rendering (as opposed to RGB), including Laser Radar (LADAR) frequencies
2) bidirectional path tracing and energy-redistribution (Metropolis-like) methods
3) adaptive sampling in image space (see the sketch after this list)
4) BRDF tables
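(For item 3, a minimal sketch of what image-space adaptive sampling can look like: track per-pixel sample statistics and keep adding rays only where the variance of the running mean is still above a tolerance. tracePixel() is a hypothetical stand-in for the path tracer's per-sample radiance evaluation.)

Code:
double tracePixel(int x, int y);   // placeholder: one path-traced sample for pixel (x, y)

// Running sums per pixel, enough to estimate the variance of the mean.
struct PixelStats {
    double sum = 0.0, sumSq = 0.0;
    int    n   = 0;
    void   add(double L) { sum += L; sumSq += L * L; ++n; }
    double mean() const  { return n ? sum / n : 0.0; }
    double varOfMean() const {                    // sample variance / n
        if (n < 2) return 1e30;                   // force at least two samples
        double m = mean();
        return (sumSq / n - m * m) / (n - 1);
    }
};

// Spend extra rays only where the estimate is still noisy.
void refinePixel(PixelStats& px, int x, int y, double tolerance, int maxSamples)
{
    while (px.n < maxSamples && px.varOfMean() > tolerance)
        px.add(tracePixel(x, y));
}
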

I am considering a much simpler real-time ray tracer engine to replace raster graphics in the CAVE. We just ordered a 24-core SMP machine--I might tinker with that a bit as well. That's your forte and a whole other can of worms for me, but I think the industry is heading in that direction and we need to stay on the cutting edge.

Unfortunately, my current assigned project does not have a strong visualization requirement---it's all about multi-body dynamics---and it launches in 2014, so I don't know how much time I will have for high-quality graphics. We'll see....

Graphics for situational (simulation results) visualization continues to be useful, and I think we will probably be making significant improvements to our modeling program (scene graph & path tracer front-end) as well. The CAVE work will also continue, but it may not be by me.

The only recent addition to the path tracer I've managed to squeeze in this spring was dielectrics (glass).
Attachment:
bunny-glass.jpg
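
(A minimal sketch of the two ingredients a dielectric material needs in a path tracer: Snell refraction for the transmitted direction and Schlick's approximation for the Fresnel reflectance. The throwaway Vec3 and helper names stand in for whatever math types the renderer already has.)

Code:
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3  operator*(float s, const Vec3& v) { return {s * v.x, s * v.y, s * v.z}; }
static Vec3  operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Schlick's approximation to the Fresnel reflectance at a dielectric boundary.
float fresnelSchlick(float cosTheta, float etaI, float etaT)
{
    float r0 = (etaI - etaT) / (etaI + etaT);
    r0 *= r0;
    return r0 + (1.0f - r0) * std::pow(1.0f - cosTheta, 5.0f);
}

// Refract unit incident direction I about unit normal N with eta = etaI/etaT;
// returns false on total internal reflection, otherwise writes T.
bool refractDir(const Vec3& I, const Vec3& N, float eta, Vec3& T)
{
    float cosI  = -dot(I, N);                         // N points against I
    float sin2T = eta * eta * (1.0f - cosI * cosI);
    if (sin2T > 1.0f) return false;                   // total internal reflection
    float cosT = std::sqrt(1.0f - sin2T);
    T = eta * I - (eta * (-cosI) + cosT) * N;         // Snell's law
    return true;
}

// A path tracer would then choose between the reflected and refracted ray
// with probability given by fresnelSchlick(), or trace both and weight them.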