jimmer wrote: As to the original post asking to comment on: "chips is where it's at, mobo's are obsolete."
I'm not sure I understand what this Greg chap is on about in his blog. As I understand things, chips need interconnects - no? Mobos are simply surfaces that support and implement those interconnects - no?
If so, then if you've got chips you'll have mobos, though it's unlikely that the classic PC _architecture_ will still be around in 2010.
On the other hand, wasn't the whole point of the Origin 3xxx architecture to make a system with upgradable interconnects? Even so, somewhere along the line, a 'chip' will have to be stuck onto some kind of connector-to-the-rest-of-the-system thing, and that's a mobo in my book.
As to the original blog entry, the point was about the accelerating integration of more and more of the interconnects, subsystems, etc. onto one or a few pieces of silicon, partly in order to keep up with the advancing speed of processors. But the key phrase was "add value".
squeen wrote: So I now add: "Where did SGI go wrong?"
Continuing this line: SGI stopped innovating in silicon. If all the component parts of the system are off-the-shelf devices, there's obviously a limit to how much value you can add. As squeen points out, for instance, the Prism's ATI gfx are a retrograde step in a market that is supposed to be moving forward, never mind that SGI once owned it.
For my own part, the answer to "Where did SGI go wrong?" is a little more complicated. I think they fell into the classic trap of not knowing what business they were actually in. All their early graphics work involved throwing large amounts of data around to support visual output, and they got very good at that. However, they seem to have interpreted that as being in the business of high-bandwidth data processing, when much of their market thought they were in visual computing. Their own perception took them towards HPC, whilst many of their customers expected them to stay in visual computing.
Having bought Cray, they owned two iconic brands. They could easily have taken all of the gfx knowledge, kept it branded as Silicon Graphics, and been what nvidia are today. Or they could have extended the Indy approach and owned what is now Apple's market. Meanwhile, who wouldn't want a Cray in their datacentre? The way they've gone, they've just squandered decades of IP, hard work, development, testing and customer loyalty.
And from here? Well, they don't seem to be able to innovate at anything more than motherboard design (or bolting on RASC, or refining old interconnects), so where is the value they're adding? Right now they're good at making very large HPC machines, but all it would really take is for IBM, say, to decide to own that market for PR purposes and sell machines at no margin, and that market would vanish.
Now, if somebody wants to port Darwin to Itanium, then I'll happily pay big bucks for a "big iron" machine with an SGI cube on the front and Mac OS X on the desktop to run some video editing jobs, even if it isn't the best thing out there.