Post by mrmolecule on Aug 14, 2008 20:18:14 GMT -5
I am a bit disappointed about the loss of bits in video games. Everyone knows the NES was 8-bit, the SNES was 16, the N64 was 64 (duh), and the GCN was 128. I guess the label was dropped because it got too confusing. For example, the Intellivision and the SNES were both 16-bit, yet in terms of power the Intellivision was a stinker compared to the mighty SNES. Similarly, the GBA was 32-bit like the PS1, yet it couldn't handle PS1-esque graphics. Meanwhile, the only song-and-dance I heard for the DS was that it was "between the N64 and the Revolution", while the Wii... well, it wasn't 256-bit, I don't think.
I vaguely remember there being a thread like this elsewhere, but I can't find it.
Can anyone explain how many "bits" the DS and Wii have?
Post by Johans Nidorino on Aug 15, 2008 0:23:24 GMT -5
The Nintendo DS has two processors, both of which are 32-bit.
Nowadays, the number of bits in the words a processor handles doesn't say much about how good a video game system is; that depends more on the operating frequency of the CPU and the GPU, as well as the RAM.
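If you want to see the difference in crude numbers, here's a little C sketch. The 729 MHz figure is the Wii CPU's real clock; the 100 MHz 64-bit chip is invented just for contrast, and "one operation per cycle" is a big simplification:

    #include <stdio.h>

    int main(void) {
        /* Word size: how many bits the CPU moves in one native operation.
           sizeof(void *) is a rough (not universal) proxy for it. */
        printf("this build's word size: %u bits\n",
               (unsigned)(sizeof(void *) * 8));

        /* Toy throughput model: bits per second = word size * clock,
           pretending the CPU does exactly one operation per cycle. */
        double narrow_but_fast = 32.0 * 729e6;  /* 32-bit @ 729 MHz (Wii-like clock) */
        double wide_but_slow   = 64.0 * 100e6;  /* 64-bit @ 100 MHz (invented) */
        printf("32-bit @ 729 MHz: %.1f Gbit/s\n", narrow_but_fast / 1e9);
        printf("64-bit @ 100 MHz: %.1f Gbit/s\n", wide_but_slow / 1e9);
        return 0;
    }

The narrow chip with the fast clock wins easily, which is why the word size alone tells you so little.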
Post by nocturnal YL on Aug 15, 2008 3:07:04 GMT -5
IIRC the GameCube is 32-bit, which means the Wii is probably also 32-bit, although there's no official documentation supporting it. (Who would use a 128-bit processor in an affordable console, anyway?)
And Johans explained it pretty well. For one, word size and clock speed aren't the same thing: the former affects how many bits can be processed "at the same time", while the latter determines how many cycles the processor runs through each second.
Just think about emulators for PCs and Macs. You can emulate a 64-bit system with a 32-bit CPU, right?
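Here's a simplified sketch in C of how that works under the hood: a 64-bit addition built from nothing but 32-bit operations, with the carry handled by hand. (The u64_emulated type and add64 are made up for the example; real emulators and compilers generate something equivalent.)

    #include <stdio.h>
    #include <stdint.h>

    /* A 64-bit value stored as two 32-bit words, the only size our
       imaginary 32-bit CPU can handle natively. */
    typedef struct { uint32_t lo, hi; } u64_emulated;

    /* 64-bit addition using only 32-bit maths: add the low words,
       detect the overflow (carry), and feed it into the high words. */
    static u64_emulated add64(u64_emulated a, u64_emulated b) {
        u64_emulated r;
        r.lo = a.lo + b.lo;                  /* wraps modulo 2^32 */
        r.hi = a.hi + b.hi + (r.lo < a.lo);  /* carry out of the low word */
        return r;
    }

    int main(void) {
        u64_emulated a = { 0xFFFFFFFFu, 0 };  /* 4294967295 */
        u64_emulated b = { 1, 0 };            /* + 1 */
        u64_emulated r = add64(a, b);
        printf("hi=0x%08X lo=0x%08X\n", r.hi, r.lo);  /* hi=1, lo=0: 4294967296 */
        return 0;
    }

It's slower than doing it natively, which is the whole trade-off with emulation, but it works.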
Post by Fryguy64 on Aug 15, 2008 8:51:06 GMT -5
Talking about "bits" was just a way for the systems to challenge each other aaaages ago. Back in those days, more bits clearly meant better games!!
Of course, it's always been meaningless. The Sega Genesis and Super NES were both 16-bit consoles, but they differed significantly in things like CPU speed, colour palette, and sound hardware, which shaped what games could be made for them.
These days, people are too tech-savvy to fall for this con. Saying a system is 32-bit or 64-bit or 128-bit or whatever means nothing. And Nintendo's pretty much leading the way in proving that more "bits" doesn't mean better games. They're using a souped-up last-gen console to take on systems many, many times more powerful, and winning the sales battle (if not necessarily the battle for hearts and minds... at least they're doing better than Sony there as well!)
Post by Swedol on Aug 17, 2008 6:59:36 GMT -5
Wasn't Intellivision 16-bit?
Post by parrothead on Aug 17, 2008 12:22:37 GMT -5
I believe the seventh-generation consoles (the Xbox 360, Wii, and PlayStation 3) could be considered the 256-bit era when you look at it that way. Okay, the CPUs in the Xbox 360 and Wii are 64-bit while the PlayStation 3's Cell is 128-bit, but the graphics chip in all three consoles has an internal 256-bit engine, and the graphics is really what counts, not the processing units.
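To put a number on what a wide bus actually buys you, here's a back-of-the-envelope C calculation (the 500 MHz clock is invented purely for illustration; no real console's memory clock is implied):

    #include <stdio.h>

    int main(void) {
        /* Back-of-the-envelope bandwidth: bytes/s = (bus bits / 8) * clock.
           The 500 MHz clock is a made-up figure for illustration only. */
        double bus_bits = 256.0;
        double clock_hz = 500e6;
        double bytes_per_sec = (bus_bits / 8.0) * clock_hz;
        printf("256-bit bus @ 500 MHz: %.0f GB/s\n", bytes_per_sec / 1e9);  /* 16 GB/s */
        return 0;
    }

So the "256-bit" in a GPU spec is about how much data it can shovel per clock, not about the size of the numbers it crunches.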
Post by Koopaul on Aug 17, 2008 15:25:34 GMT -5
I don't think anyone was implying that more bits means better games; I think Mr. Molecule was simply curious about how many bits were in those systems.
Post by mrmolecule on Aug 17, 2008 20:08:56 GMT -5
"I don't think anyone was implying that more bits means better games; I think Mr. Molecule was simply curious about how many bits were in those systems."
Exactly. I was wondering whether the Wii still followed the bit trend and would be "128-bit".
Post by missingno.is back? on Aug 24, 2008 22:26:28 GMT -5
"Wasn't Intellivision 16-bit?"
Yes, that was stated in the first post. They used to have a slogan for it, "It's the closest thing to the real thing!", because its graphics were superior to the Atari 2600's.