Reading time: 2 minutes
Did the media really get the point behind UltraSPARC T2?
Paul Murphy explores in The T2 and media reaction the question of whether the media really got the point behind UltraSPARC T2. I've read many articles over the last few days and came to the same conclusion as Paul: a definite NO.
In most articles one point or another was completely misunderstood. And most of the misunderstandings seem to come from applying x86 knowledge directly to the UltraSPARC T2. But you can't compare a CPU like an Opteron or a Xeon to a system-on-a-chip like the UltraSPARC T2.
Since the T2 architecture eliminates most of the reasons the Xeon has to waste most of its time on non productive tasks, equating one T2 processor set cycle (including floating point, cryptology, and network management co-processors) to one Xeon cycle is inappropriate.
Okay, the 89.1 GHz figure is a little far-fetched, but the fundamental idea behind the number opens an interesting perspective. Raw cycles are meaningless when most of them are wasted waiting for something, and you can do more important work when you have specialized circuits for tedious tasks like encryption or networking. For real-world load you have to divide the Xeon/Opteron cycles by the effective IPC, or multiply them for the UltraSPARC T1, to get a slightly more meaningful frequency.

Kris of blog.koehntopp.de wrote in an article that there are fundamental laws of physics that mandate a different view of software development, as software developers can't hope for an infinite increase in processing power. I agree with that, but I would add something: we shouldn't do research to move the border of technical feasibility by millimeters when we can win yards by making better use of the existing technology, for example by getting more computation out of a given amount of clock cycles. This is the most fundamental point behind CMT: Niagara 1 was the first step, UltraSPARC T2 is the second, and many others will follow.

Forget about raw cycles, forget about huge caches, forget about deep pipelines; all these things are just cures for symptoms, not real solutions. And you should think about one fact: there hasn't been anything like a true x86-compatible core for years now. What we call x86 processors are high-end RISC cores with high-speed translators from x86 to RISC. I don't want to think about the level computing could have reached by now without this mediocre architecture from the seventies, which even Intel wanted to kill several times, and which will haunt us for at least the next two decades. It's a strange twist of fate that our dominant processor architecture for small and medium servers was chosen by game developers and a Finnish computer science student. In the end, this is the reason for the misunderstandings in the media.
You have to throw away the x86 view of the world to understand the advantages of the UltraSPARC T2. The media tend to measure technological advancement relative to the x86, and that is simply the dumbest way to measure.
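The divide-by-IPC idea can be put into a back-of-the-envelope calculation. Here is a minimal Python sketch of that arithmetic; the helper `effective_ghz` and all clock rates, core counts, and IPC values are purely hypothetical illustrations, not measurements of real chips:

```python
# Back-of-the-envelope sketch of "effective frequency".
# All numbers below are illustrative assumptions, not benchmarks.

def effective_ghz(clock_ghz, cores, effective_ipc):
    """Naive effective throughput: clock * cores * average
    instructions retired per cycle (IPC), ignoring everything else."""
    return clock_ghz * cores * effective_ipc

# Hypothetical values: a Xeon-like core stalls often on memory, so its
# effective IPC sits far below its theoretical peak; a CMT chip like
# the T2 keeps its pipelines busy by switching between many hardware
# threads, so the chip-wide IPC stays high despite a lower clock.
xeon = effective_ghz(clock_ghz=3.0, cores=4, effective_ipc=0.8)
t2 = effective_ghz(clock_ghz=1.4, cores=8, effective_ipc=1.0)

print(f"Xeon-like chip: {xeon:.1f} effective GHz")
print(f"T2-like chip:   {t2:.1f} effective GHz")
```

With these made-up numbers the lower-clocked CMT chip comes out ahead, which is exactly why comparing raw GHz across the two architectures misleads.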