Post by nocturnal YL on Apr 15, 2005 14:14:05 GMT -5
This is what I saw on another thread.
No. But generally, almost nothing uses more than 32 bits anyway, because anything above that is pretty much useless.
Then I started to wonder why the Gamecube is so-called 128-bit... When you talk about colours, true, 32-bit colour (= 4294967296 colours) is enough.
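To make the colour numbers concrete, here's a quick sketch in Python (my own illustration, not anything from the thread) showing where 4294967296 comes from and how "32-bit colour" usually splits into channels:

```python
# An n-bit field can hold 2**n distinct values.
def colour_count(bits):
    return 2 ** bits

print(colour_count(32))       # 4294967296 total values in a 32-bit pixel
# In practice "32-bit colour" is usually 8 bits each for red, green,
# blue and alpha, so only the RGB part defines visible colours:
print(colour_count(8) ** 3)   # 16777216 visible colours (24-bit RGB)
```

So the "4294967296 colours" figure counts every 32-bit pattern, even though the alpha bits don't add new visible colours.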
Also, think about console CPUs and PC CPUs. What is the relation between them?
Say, the N64 is a 64-bit console, but its power is far lower than that of a 32-bit PC CPU (I'm only referring to CPUs above the 486). So I assume that the bit number of a PC and of a console are not defined by the same criterion.
I've got another question before my head explodes. What is the bit number of a console referring to? I don't think it refers to the CPU...
"bit" refers to the width of the binary words the processor can handle in a single operation. Wider words mean more can be encoded in each one. But the rest of the system is just as important in making good use of it.
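A small sketch of what word width buys you (again, just my own illustration): the largest value that fits in one register doubles-and-then-some with each extra bit, and a narrower CPU has to split big numbers across several operations.

```python
# Largest unsigned integer that fits in a single n-bit register.
def max_unsigned(bits):
    return 2 ** bits - 1

for bits in (8, 16, 32, 64):
    print(bits, "-bit max:", max_unsigned(bits))
# e.g. 16-bit max: 65535, 32-bit max: 4294967295.
# A 16-bit CPU must break a 32-bit addition into multiple steps
# (low half, then high half with carry); a 32-bit CPU does it in one.
```

That's why the bit number alone doesn't decide how fast or capable a machine is, only how much it can chew through per instruction.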
As you can see, various 16-bit systems make quite different use of the same word width. The Intellivision couldn't run Donkey Kong Country, for instance. Hell, the Sega Mega Drive (Genesis) couldn't run it!
If it's 32-bit, then that refers to the width of the words the processor can read. What you have to remember is that the rest of the machine is geared to make the most of that, cutting down on loading times while simultaneously providing top-of-the-range graphics, sound and level design.
I mean, the Gamecube is obviously not outclassed by the N64, even though, as has been said, its processor doesn't read words as long. The PS2, similarly, is graphically inferior to the Gamecube.
I wish someone with more techie knowhow would completely fill us in on what I'm talking about, because I'm not 100% sure myself.