
A technology blog for The Economist Group IT team

Thursday, June 16, 2005

Apples for apples?

Or what's the difference between a VAX and an Alpha?

When Apple announced recently that it would be switching to Intel chips, it went largely unnoticed that this could be one more nail in the coffin for RISC chips. The concept of a reduced instruction set computer was first put forward by John Cocke of IBM in the 1970s (Cray built systems that weren't known as RISC at the time but certainly had reduced instruction sets; the term RISC was later coined by David Patterson at the University of California, Berkeley).

Cocke argued that computers only used around 20% of the instructions built into the processor, and that one with fewer instructions built in would be cheaper to manufacture and would get more done in a shorter time, as each instruction would take the same amount of time to execute. Why did that matter? Because the processing of each instruction could be "pipelined", so that as soon as one instruction had been executed, execution of the next could start. This was not the case with a CISC processor, where some instructions take longer to process than others. It was also apparent that the people who designed the processors hadn't optimised all the instructions, and it was possible to perform some tasks more quickly by breaking them down into their individual components.
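
To put the pipelining argument in concrete terms, here is a minimal back-of-the-envelope sketch (mine, not part of the original argument) comparing how many cycles a stream of instructions takes with and without a pipeline, assuming a classic five-stage pipeline and exactly one cycle per stage for every instruction:

    # Rough sketch of why uniform instruction timing makes pipelining pay off.
    # Assumes a five-stage pipeline (fetch, decode, execute, memory, write-back)
    # and that every instruction spends exactly one cycle in each stage.

    STAGES = 5

    def cycles_without_pipeline(num_instructions):
        # Each instruction must finish every stage before the next one starts.
        return num_instructions * STAGES

    def cycles_with_pipeline(num_instructions):
        # Once the pipeline is full, one instruction completes every cycle.
        return STAGES + (num_instructions - 1)

    print(cycles_without_pipeline(1000))  # 5000 cycles, one at a time
    print(cycles_with_pipeline(1000))     # 1004 cycles, overlapped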

The RISC concept was first put to work in IBM's own 801 research project and went on to be used in Sun SPARC chips and DEC Alphas, as well as Motorola's 88000 and the PowerPC. The less frequently used 80% of instructions on a CISC chip (a term coined after RISC to highlight the difference) undertake complex tasks that match high-level programming constructs such as "increment register by one and branch if zero". Having such instructions built into the processor meant that the code was smaller, took up less memory and ran faster. As memory became less expensive this mattered less, and so a RISC chip could do the same thing with the programmer writing multiple lines of code to perform what might take one line on a CISC chip. And so in the 1990s DEC regularly set the benchmark for clock speed (cycles per second and, because RISC chips aim to complete an instruction every cycle, roughly analogous to instructions per second) with its Alpha. I recall the veritable excitement at taking delivery of one of the first 333MHz machines in the UK at a time when the Pentium was clocking 100MHz! The Alpha was a 64-bit processor too.
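
As a toy illustration of that trade-off (my own sketch, with invented instruction names rather than any real instruction set), here is a single CISC-style "increment and branch if zero" alongside the pair of simpler RISC-style operations that do the same job:

    # Toy register machine: one complex CISC-style instruction versus the
    # equivalent pair of simpler RISC-style instructions.
    # The names (incjz, addi, beqz) are invented for illustration.

    regs = {"r1": -3}

    # CISC flavour: increment the register and branch if it reached zero,
    # all in a single instruction.
    def incjz(reg, taken_pc, next_pc):
        regs[reg] += 1
        return taken_pc if regs[reg] == 0 else next_pc

    # RISC flavour: the same effect takes two single-purpose instructions,
    # so programs are longer, but every step is uniform and easy to pipeline.
    def addi(reg, imm):
        regs[reg] += imm

    def beqz(reg, taken_pc, next_pc):
        return taken_pc if regs[reg] == 0 else next_pc

The loop body shrinks to a single instruction on the CISC side, which is exactly the code-size saving that mattered most when memory was expensive.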

At the time RISC was seen as the future of computing, but the increasing dominance of the "Wintel" (Windows on Intel) platform at both workstation and server level has since nipped that vision in the bud. Traces of the Alpha live on in Intel's Itanium (Compaq sold its Alpha technology to Intel in 2001), but the Itanium's future doesn't look that rosy either, with Intel compromising its design to ensure backward compatibility with 32-bit apps.

The original Pentium chip was Intel's first stab at hiding a RISC-style engine behind a CISC instruction set. What this means is that although the Pentium looks like a CISC processor to the programmer, it actually executes some complex instructions by breaking them down into multiple simpler operations, and performance is increased by the use of more than one instruction pipeline.
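
As a very loose sketch of that idea (mine, and nothing like Intel's actual microarchitecture), here is how a decoder might expand complex instructions into simpler micro-operations and spread them across two pipelines; the instruction strings are made up and data dependencies are ignored:

    # Loose sketch: decode complex instructions into simpler micro-operations
    # and issue them round-robin across two pipelines. Instruction names are
    # invented and data dependencies are ignored for simplicity.

    MICRO_OPS = {
        "add [mem], reg": ["load tmp, [mem]", "add tmp, reg", "store [mem], tmp"],
        "inc reg":        ["add reg, 1"],
    }

    def decode(instructions):
        # Complex instructions expand to several micro-ops; simple ones pass through.
        for inst in instructions:
            yield from MICRO_OPS.get(inst, [inst])

    def issue(instructions, pipelines=2):
        lanes = [[] for _ in range(pipelines)]
        for i, op in enumerate(decode(instructions)):
            lanes[i % pipelines].append(op)
        return lanes

    print(issue(["add [mem], reg", "inc reg", "mov reg, 5"]))
    # [['load tmp, [mem]', 'store [mem], tmp', 'mov reg, 5'],
    #  ['add tmp, reg', 'add reg, 1']]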

So is this the end for RISC? Not quite. Mobile 'phones, PDAs and game consoles (such as the new PSP) use RISC processors, so who knows?
Comments:
What's also worrying is the support headache this is going to create for a mixed environment of Macs running the "same" OS and apps, but on different chip sets. I wonder how many companies will see this as too much pain and effort and so bite the bullet and ditch their beloved Macs in favour of PCs? Could this be the end of the desktop religious wars and Microsoft's total domination of the desktop?
 
