kramlq wrote:
I sort of see where SAQ is coming from though. As you correctly point out, x86/x64 may be the undisputed standard nowadays, and it is good enough for what much of the market wants. But it is also undoubtedly ugly. For example, look at how much startup code is required on x86 just to get to start_kernel() in Linux, compared with, say, Alpha, which SAQ cites as a good design. A lot of it is caused by the continual addition of new features and the need to retain compatibility with old ones in x86/x64.
Right, but this particular property isn't even a product of "x86" as an ISA or of any of the many x86-implementing microarchitectures. It's just a product of PC motherboard and CPU vendors trying to maintain compatibility with existing operating systems. I could write a boot ROM for a board with an x86 CPU that did all the bringup that normally happens before start_kernel() in Linux, and I would have created the equivalent of the bringup portion of PALcode on Alpha, and you'd never have to see it.
kramlq wrote:
Virtualisation/emulation and binary translation have been an interesting step forward in this area. The traditional dependence on backward compatibility and ISA is becoming less relevant as these technologies develop further. Apple's transition from PPC to x86 was a good example.
I think the biggest example in this direction is LLVM, where there's a defined, optimizable, open, processor-independent intermediate language sitting between the compiler front end and the optimization and code-generation stages. On the other hand, LLVM is still very much research-grade (despite Apple's continuing obsession with using it for production software and standards like OpenCL).
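To make that concrete, here's a minimal sketch of the split I mean, assuming clang and llc are on your PATH; the filename and target triples are just illustrative:

/* add.c -- a trivial function, only to show the source -> IR -> target flow */
int add(int a, int b) {
    return a + b;
}

/*
 * The front end emits LLVM IR in its textual form:
 *     clang -S -emit-llvm add.c -o add.ll
 *
 * The same IR file can then be lowered to different targets:
 *     llc -mtriple=x86_64-unknown-linux-gnu add.ll -o add-x86_64.s
 *     llc -mtriple=powerpc64-unknown-linux-gnu add.ll -o add-ppc64.s
 */

In practice the IR clang emits still bakes in some target details (data layout, calling-convention assumptions), so it isn't a ready-made fat-binary format, but it shows where the processor-independent layer sits relative to the compiler and the back end.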