go dex. go!
The collected works of squeen - Page 9
What was the fix?
Hmm. I'll have to keep an eye on ours.
Thanks for sharing that.
Less than 4 minutes at that resolution with only 4 cores! That's incredible. I honestly don't know how you control the noise so well with so few samples. The motion blur looks great too.
Oh my mistake. 8 cores++.
Brombear wrote:
I really would like to run our software on your cluster though ...
Linux?
The HPC cluster is 64-bit Linux, with an InfiniBand network and a shared filesystem. I'm not sure if they have multicast set up, but for our renders, when we request work cores from the queue we get an environment variable that points to a file listing the IPs of all the nodes we are authorized to use for the next 12 hours. I'm usually able to check out 1000-2000 cores without any conflicts, since I think they have grown the cluster to well over 10,000 by now.
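Roughly, the mechanics look like the toy sketch below (not our actual tool, and RENDER_NODEFILE is just a made-up stand-in for whatever the queue actually exports):

/* nodelist.c -- toy sketch: read the node-list file named by an
 * environment variable (hypothetical name RENDER_NODEFILE), one IP
 * per line, and report how many nodes we were handed.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    const char *path;
    FILE *fp;
    char line[256];
    int count = 0;

    path = getenv("RENDER_NODEFILE");      /* made-up variable name */
    if (path == NULL) {
        fprintf(stderr, "node-file variable not set\n");
        return 1;
    }

    fp = fopen(path, "r");
    if (fp == NULL) {
        perror("fopen");
        return 1;
    }

    while (fgets(line, sizeof line, fp) != NULL) {
        line[strcspn(line, "\r\n")] = '\0';   /* strip the newline */
        if (line[0] != '\0')
            printf("node %d: %s\n", ++count, line);
    }
    fclose(fp);

    printf("%d nodes checked out\n", count);
    return 0;
}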
None of us use Windows boxes, but maybe we could borrow one.
Let's talk more about it this summer after I've finished my project's design review.
My guess is also the reflection limit. Try something higher like 10-12 to rule that out.
Although longer paths will gather more radiance, the excessive brightness looks non-physical. Does the metallic reflector have a reflective coefficient below 1 (it should), or is there a light source in there?
Very nice. Looks good.
I thought on all the Onyxes you can use ircombine to set up video.
Could also try Grace.
There's a SGI freeware version for IRIX, but I don't think we ever added it to nekoware.
Looks cool. Can't wait to try it.
Wow. NURBS!
Have you started looking at the DX11-class graphics hardware's ability to tessellate (NURBS?) on-the-fly in the vertex shader? Perhaps that's what you are already using.
I have to admit, it is nice to finally see it make it to hardware.
For graphics, we live in interesting times!
Of course. My mistake. Sounds like it's not a serious bottle-neck.
On the Onyx with default gamma, the spheres are a bit too white. Is there a command-line option for gamma correction?
Also, have you thought of submitting this to Xscreensaver?
I made a small update to the Time Wiki page. If everyone agrees with my correction, we can modify the original text which seems to be instructions for using the SGI Freeware version.
A heterogeneous system is always more robust (and probably more capable as well).
Good for Google!
Yes. But most SGI employees used Windoze. I never thought it was right. How do you know where your product is deficient if you never use it?
Paul Graham wrote:
Historically, languages designed for other people to use have been bad: Cobol, PL/I, Pascal, Ada, C++.
The good languages have been those that were designed for their own creators: C, Perl, Smalltalk, Lisp.
Nice. More info on how it was generated please!
I think that may be what the problem is. I'll keep tinkering and report on the fix.
hamei: what CAD program is that you are using on IRIX?
Perhaps someone would like to come up with a generic library redirection (spoofing) of the XkbSelectEventDetails function (via LD_PRELOAD) by making a simple libX11.so in the preload path. This is essentially what spd has done with xkb.so, but it would be a general fix without a recompile. The only thing I'm not sure about is whether wrappers for the other libX11 functions would also be required.
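To make the idea concrete, here is a minimal sketch of such a shim, assuming a plain no-op stub is acceptable (untested; the real fix might need to forward to the genuine libX11 entry point instead):

/* xkbspoof.c -- rough sketch of a preload shim (untested).
 * Build something like:   cc -shared -o xkbspoof.so xkbspoof.c
 * Run with:               LD_PRELOAD=./xkbspoof.so <application>
 * (IRIX's runtime linker uses _RLD_LIST for the same kind of preloading.)
 */
#include <X11/Xlib.h>

/* Same prototype as the libX11 function we want to intercept. */
Bool XkbSelectEventDetails(Display *dpy, unsigned int device_spec,
                           unsigned int event_type,
                           unsigned long bits_to_change,
                           unsigned long values_for_bits)
{
    (void)dpy; (void)device_spec; (void)event_type;
    (void)bits_to_change; (void)values_for_bits;
    return True;   /* claim success without ever talking to the X server */
}

If it's preloaded as its own small .so (rather than standing in for the whole libX11), the other libX11 symbols should still resolve from the real library, so I don't think wrappers for them would be needed.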
A FLOSS option is ffmpeg. It's fast, supports a ton of codecs, has few dependencies, and is even in Nekoware.
Nice hearing from you Dex. Hope you feel better (and get a brilliant HPC job!).
I agree. Computer-driven devices have lost the "fringe" niche they once owned. They are like a toaster or VCR. Easy to use. Good at doing certain tasks, but less programmable without specialized tools---not really for "computing" in the sense of number crunching for most folks.
For a short while (1990-2010?), everybody wanted/needed faster processors/graphics, and engineers and scientists benefited from the consumer-driven demand. It may be that CPUs are getting close to "fast enough" for the casual consumer (i.e. real-time video rates), although there probably is some more room for "smaller" and "prettier".
Unfortunately, as number crunchers, CPUs/GPUs are still at least two to three orders of magnitude below what is needed for engineering simulation. If scientific vs. consumer demands go off in different directions, it will be high-performance computing users who will be left hurting for funds.
Ironically, whatever small HPC market remains could probably use a company like what SGI once was.
Back on topic. I think we all owe Steve Jobs a big thank-you for helping make computers sexier, more powerful, smaller, and easier to use. He was the enabler that got the new technologies into all our hands. Forget the iPhone, think about:
- the 1st personal computer (Apple ][)
- the 1st mouse on a PC
- the 1st GUI on a PC
- the 1st laser printer on a PC
- the 1st MS Office (Word for Mac in 1984, six years before Windows)
- the 1st compositing desktop (OS X)
- the 1st on-line music store
Not to mention a zealous love of miniaturization and elegance of design.
I don't know if we all realize how much we will miss his relentless drive in the market, even if we "purists of computing" feel as though something special and secret was lost when Jobs brought the screaming masses to geek-chic. Oh well, you can't have it both ways.
I also loved him as the counterpoint to the heavy-handed, drab, and clumsy Microsoft monopoly.
Now the small-minded and greedy can see how long it takes them to gnaw away at the empire Jobs built.
Cool!
I guess I would add to the list:
- 1st touch screen on a phone
- MagSafe plug on a laptop ('cause I love it!)
I'm sure the NeXT cube had some other firsts I don't know about.
Anyway, what an innovator!
Great programmer. Clear thinker. In all his public writing, he seemed a humble, down-to-earth fellow. Definitely an inspiration and an example of a single person having a huge impact on society.
I love C.
I love Unix.
So long, Dennis Ritchie, and thanks.
I think for almost all of those "1st"s there was a true (forgotten) predecessor. He was just quick to adopt, (expertly) refine, and market. Almost never did Apple play "catch-up". It was a real talent.
I think T. S. Eliot once said, "Good poets borrow, great poets steal."
Looks pretty good. Thanks.
I just googled my own post to try and remember how I did this.
We bought an Onyx4 with ATI crap graphics. It was a mess. We later retrofitted the machine to IR4---a huge improvement.
Whee! I haven't had a dose of hamei rants in a while. Glad I checked this thread.
For me, there seems to be a potential divergence in computing right now: the consumer market and the scientific market. For the average consumer, sleek design (form-factor) and ease of use (OS) are king, since the computing power needed to read email, write documents and (yes) make spreadsheets is minimal; the user-in-the-loop is the real limiting factor.
On the scientific/engineering side, computing simulations are still several orders of magnitude too slow. Even with a billion dollars, if your task is not readily parallelizable, you are screwed. No computer on Earth is fast enough for a full-fidelity model unless you are willing to wait weeks for a result.
With graphics it is similar. Desktop icons and eye candy---no problem. Synthesizing physically accurate light-transport imagery at real-time rates is still problematic, to say the least.
If scientific computing were driving the market, we would have had 128-bit quad-precision math in hardware (e.g. Itanium) a decade before extra long-life batteries.
What was in the 1990s a nice confluence between the consumer's and the engineer's desire for faster CPUs seems to be diverging now. The latest "advancements" (e.g. hyper-threading) are starting to seem useless for number-crunching.
My current fear is that Apple will drop the Unix-based OS X on their laptops for an iOS-based one, and I won't be able to even run the engineering software I develop on a Mac any more. Although, to be honest, it has never performed quite as well on a Mac as it does on a Linux workstation---something is weird/slow about how they allocate/free memory, and the OpenGL graphics drivers have definitely lagged the NVIDIA ones for Linux.
Cool. Thanks.
Just got the power resolved in the lab and the Onyx4 back on line!
I need to get my nekoware packages updated and then I'll give it a try.
Thanks!
I always enjoyed that background. Thanks.
Looks cool. Hi Hamei!
These days your best bet for IRIX would be to use and/or update the nekoware Octave package.
Cheers.