What does 8-bit / 16-bit actually refer to?

  • When talking about retro games, terms like "8-bit music" or "16-bit graphics" often come up. I myself often use these terms, but I'm not exactly sure what they refer to. What do they mean?

    I realize that the examples I have given are two different contexts; I'd like to know both. :)

    It's funny that your 8-bit question is on gaming.SE, honouring the legendary 8-bit Intel 8008 processor.

  • Grace Note (correct answer, 10 years ago)

    8-bit and 16-bit, for video games, specifically refers to the processors used in the console. The number refers to the word size, i.e. how many bits of data the processor handles at once. The 8-bit generation of consoles (starting with Nintendo's Famicom, also known as the Nintendo Entertainment System) used 8-bit processors; the 16-bit generation (starting with NEC/Hudson's PC Engine, also known as the TurboGrafx-16) used a 16-bit graphics processor. This affects the quality and variety of the graphics and the music, because it determines how much data the machine can work with at once; Oak's answer details the specifics of the graphics.

    If you don't know what a bit is, here is the Wikipedia article on bits: http://en.wikipedia.org/wiki/Bit. I'll quote its first sentence, which is really all one needs to know here.

    A bit or binary digit is the basic unit of information in computing and telecommunications; it is the amount of information that can be stored by a digital device or other physical system that can usually exist in only two distinct states.
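
    To see how quickly that adds up, here is a quick Python sketch (just the arithmetic, nothing console-specific): every additional bit doubles the number of distinct values a word of data can hold.

        # Each extra bit doubles the number of distinct values a word can represent.
        for bits in (8, 16, 32, 64):
            values = 2 ** bits
            print(f"{bits:2d}-bit word: {values:,} distinct values (0 to {values - 1:,})")

        # Output:
        #  8-bit word: 256 distinct values (0 to 255)
        # 16-bit word: 65,536 distinct values (0 to 65,535)
        # 32-bit word: 4,294,967,296 distinct values (0 to 4,294,967,295)
        # 64-bit word: 18,446,744,073,709,551,616 distinct values (0 to 18,446,744,073,709,551,615)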

    Now, note that in modern times, terms like "8-bit music" and "16-bit graphics" don't necessarily have anything to do with processors or data size, as most hardware no longer runs that small. They may instead refer to the style of music or graphics used in games during those generations, done as a homage to nostalgia. 8-bit music is the standard chiptune fare, and the graphics were simplistic in terms of colour. 16-bit music is higher quality but often still has a distinct electronic feel, while the graphics became much more complex, though still largely 2-dimensional and around 240p resolution.

    An example of such "intentional retro": http://megaman.capcom.com/10/
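
    As a rough illustration of that chiptune character, here is a small Python sketch of my own (not how any real console's sound chip worked, just the flavour): it writes one second of a bare square-wave tone, the kind of simple waveform those old sound chips built music out of. The file name square_a4.wav is just an arbitrary choice for this example.

        import struct
        import wave

        RATE = 22050       # samples per second
        FREQ = 440.0       # A4, a plain "beep" pitch

        frames = bytearray()
        for n in range(RATE):                         # one second of audio
            phase = (n * FREQ / RATE) % 1.0
            level = 100 if phase < 0.5 else -100      # square wave: only two levels
            frames += struct.pack("<h", level * 256)  # stored as 16-bit PCM for playback

        with wave.open("square_a4.wav", "wb") as w:
            w.setnchannels(1)      # mono
            w.setsampwidth(2)      # 2 bytes = 16-bit samples
            w.setframerate(RATE)
            w.writeframes(bytes(frames))

    Real chips like the NES APU generated a handful of such simple waveforms in hardware, which is where that characteristic sound comes from.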

    To give an idea of where we stand today, gaming consoles have used 64-bit processors since the Atari Jaguar and Nintendo 64. The Xbox 360 sports three 64-bit processor cores. 64-bit PC processors are finally popular (you will see a 64-bit version of Windows 7, for example).

    Specifically, it's the size of the accumulator register. However, don't rely on this number to tell you much: 90% of programs will see little to no benefit jumping from a 32-bit to a 64-bit processor. The exceptions are programs that must do complex calculations on large sets of data, such as video encoding.
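
    To put a number on that "large sets of data" point, here is a back-of-the-envelope Python sketch (my own illustrative figures): even the byte count of one hour of uncompressed 1080p video is far too large to fit in a 32-bit register.

        bytes_per_frame = 1920 * 1080 * 3          # one uncompressed 1080p frame at 24-bit colour
        total_bytes = bytes_per_frame * 30 * 3600  # 30 frames/s for one hour
        MASK_32 = 2 ** 32 - 1                      # largest value a 32-bit register can hold

        print(f"{total_bytes:,} bytes")      # 671,846,400,000 bytes (~672 GB)
        print(total_bytes > MASK_32)         # True: a 32-bit counter would overflow
        print(f"{total_bytes & MASK_32:,}")  # the value such a counter would wrap around to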

    Yet another amazing answer by Grace Note. :)

    The TurboGrafx-16 was named the PC Engine in Japan. Also, the Atari 5200, ColecoVision, and Vectrex used 8-bit processors and were released before the Famicom.

    @kirk For one reason or another, the "8-bit generation" isn't counted from the introduction of 8-bit processors. It was probably a retroactive name: the 16-bit generation was defined by its word size, while the previous generation, heralded by the Famicom, was typically considered separate from the Atari 5200 generation. So when the 16-bit generation got its name, people simply called the previous one the 8-bit generation, at the cost of some accuracy.

    Concerning graphics, I extended Oak's answer to cover resolution and input devices. Concerning sound, 16 bits per sample is also the CD standard, but I guess here it refers to tracker music. I'm not sure what limitations 8-bit vs 16-bit put there, though...
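
    On that last point, a quick back-of-the-envelope sketch (mine, not from the answer): each extra bit per sample doubles the number of quantization levels and buys roughly 6 dB of dynamic range, which is the main fidelity gap between 8-bit and 16-bit audio samples.

        import math

        for bits in (8, 16):
            levels = 2 ** bits
            dynamic_range_db = 20 * math.log10(levels)   # roughly 6.02 dB per bit
            print(f"{bits}-bit samples: {levels:,} levels, ~{dynamic_range_db:.0f} dB dynamic range")

        # 8-bit samples: 256 levels, ~48 dB dynamic range
        # 16-bit samples: 65,536 levels, ~96 dB dynamic range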

Licensed under CC BY-SA with attribution

