unixmuseum wrote:
Yes they are... 32-bit Linux needs a recompile of the kernel to address more than 800MB, but it works a little better than on Windows...
I've seen other people say that as well. Yet I have three Linux machines here, one with 1 GB and two with 2 GB, and I don't recall doing anything to make them see and use the entire RAM. One machine runs a stock RHEL 3 kernel, one runs the stock RH9 kernel (because I haven't changed it yet), and one runs a custom-compiled kernel with the bigphysarea patch (but I don't recall doing anything fancy in the configuration).
Is the real issue that a single application can't use more than 800 MB? Even so, on the RHEL box I've run a program that used 900 MB.
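For what it's worth, you can check whether a given 32-bit kernel was built with high-memory support without recompiling anything. This is a rough sketch; the config file paths assume a typical distro layout, and not every kernel exposes /proc/config.gz:

```shell
# Look for the high-memory options in the running kernel's build config.
# On 32-bit x86: CONFIG_HIGHMEM4G covers RAM up to 4 GB, and
# CONFIG_HIGHMEM64G enables PAE for more than that.
grep HIGHMEM /boot/config-"$(uname -r)" 2>/dev/null \
  || zcat /proc/config.gz 2>/dev/null | grep HIGHMEM
```

If that shows `CONFIG_HIGHMEM4G=y` (as the stock RH9 and RHEL 3 kernels do), the machine should see all of 1-2 GB out of the box, which would square with what you're observing.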
unixmuseum wrote:
What's not running at all, or not very well, on Linux are the major pre/post processors... Patran runs like shite on Linux, and FEMAP and I-DEAS don't run at all. These three account for about 85%-90% of the FEA pre/post software... One of the biggest issues with Linux is getting consistent OGL performance and quality. These tools depend heavily on OGL.
OpenGL performance on Linux is a major aggravation. Every time I'm doing it on Linux I wish I was back on my Octane at home. And OpenGL is one of the major issues keeping me from being wholehearted about moving most of my work at home to OS X.
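One quick sanity check when OpenGL feels inconsistent on a Linux box (assuming glxinfo from the mesa-utils/glx-utils package is installed): make sure you're actually getting direct rendering on the hardware driver and not falling back to software rendering, which is a common cause of wildly varying performance:

```shell
# "direct rendering: Yes" plus a renderer string naming your actual card
# means the hardware driver is in use; "Mesa ... software" or
# "direct rendering: No" means an indirect/software fallback.
glxinfo | grep -E "direct rendering|OpenGL renderer"
```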