Well well – in pursuit of Quartermann’s column, I accidentally discovered that EGM’s own editor at the time, Ed Semrad, had a few things to say about Nintendo’s hotly-anticipated new system, right on the first page of the October 1993 issue:
The president of Nintendo of Japan announced at the Shoshinkai Nintendo Show that they will be bringing out a new 64-Bit game machine. This system, he said, would be the ultimate video game console. And that it would. With specs like 100 MHz clock speed and HDTV compatibility, their ‘Project Reality’ would exceed anything ever dreamed possible. The best news would be the price tag…only about $250! After the ohhs and ahhs from the audience subsided (including myself), reality set in. I said to our people at the show, ‘Wait a minute, this is just Nintendo talking again.’
This skepticism was certainly not unfounded in 1993, coming as it did on the heels of not only the dissolution of Nintendo’s relationship with Sony, but also the grim reality of the Philips CD-i (which never did get a way to connect to the SNES, though it did at some point get a controller based on the Gravis PC Gamepad, which many folks say was a bit more than “inspired” by the SNES controller). 100 MHz in 1993? Unbelievable, when Doom wasn’t even out yet and only really needed 33 of those to turn in a playable framerate.
But here’s the thing: Nintendo weren’t exactly talking out of their asses. Not only did the Nintendo 64 run a near-100 MHz main CPU provided by NEC, it also had a Silicon Graphics-developed video processor running at at least two-thirds of that speed, with features that the PlayStation and Saturn could only dream of. Somehow, even this level of hardware brute strength wouldn’t be quite enough to run some games smoothly (Perfect Dark being the most infamous example), but when you look at the rest of the N64’s hardware (or in some cases, lack of it), as well as the circumstances around its release, you start to realize how a machine this strong could cost only $250.
The first cost-reduction “tactic” – if you can call it that – is that the Nintendo 64 wouldn’t launch for another three years, in 1996, almost two years after the rest of the pack had already started establishing the 32-bit market. Even ignoring the fizzled efforts of the 3DO (which was days away from launch as of the magazine’s press time), the Philips CD-i, and the Atari Jaguar and its indisputably awful CD add-on, you still had the PlayStation and Saturn, each with a reasonably strong lineup of genre-definers. Nintendo could arguably have put their console out sooner, but that would have meant rushing a game out the door with it, likely launching at a higher price point, and winding up disappointing everybody.
Secondly, the Nintendo 64 only had 4 megabytes of RAM. This could arguably be considered its first major weak point. Games could pull a Neo Geo and “instantly” address data straight from the ROM, without waiting for a disc drive to seek and read, but that 4 MB still had to hold frequently used sound effects and program data, and it simply wasn’t enough past the first year or so worth of games. Addressing things from ROM could only go so far, too, when ROM was still very expensive to manufacture. (Even something like Super Mario 64 – a cartridge containing less than 8 megabytes worth of data – would “cut corners” here and there, with undersized textures stretched to giant proportions and Gouraud shading in place of textures on characters.) Nintendo ultimately released the Expansion Pak a few years later, boosting the total to 8 MB, but even this didn’t quite seem like enough when PC games of the era were demanding 32 or even 64 MB.
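To put 4 MB in perspective, here’s a back-of-the-envelope sketch – the numbers are illustrative assumptions on my part, not an official N64 memory map – showing how even basic display buffers carve a visible chunk out of the total before any game data loads:

```python
# Illustrative budget, not an official breakdown: a double-buffered
# 16-bit 320x240 display plus a 16-bit depth buffer, out of 4 MB total.
KB = 1024
framebuffer = 320 * 240 * 2      # one 320x240 frame at 16 bits per pixel
double_buffer = framebuffer * 2  # front buffer + back buffer
z_buffer = 320 * 240 * 2         # 16-bit depth buffer, same dimensions
total = double_buffer + z_buffer
print(total // KB, "KB of the 4096 KB spoken for before any game data loads")
```

That’s roughly 450 KB – over a tenth of the machine’s memory – gone on housekeeping alone, under these assumed settings.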
Finally, the real “elephant in the room” with the Nintendo 64’s hardware: the machine just plain didn’t have a sound processor. Nintendo’s broken friendship with Sony meant they couldn’t just reuse the legendary S-SMP hardware from the SNES, so anything the Nintendo 64 wanted to do with sound had to be handled directly by the main CPU. Factor 5, who helped develop the N64’s audio development tools, at some point noted (especially in regard to Star Wars: Rogue Squadron) that they constantly had to trade off between sound quality and AI processing: beyond the game logic that already needed processing time, they still had to allocate a sizable chunk of CPU time to keeping the high-quality sound and music (arguably an essential part of the Star Wars experience) going. If you’ve ever wondered why third-party N64 games came out with “muddy” sound effects, that’s probably a big reason why.
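Since everything above comes down to the CPU doing the mixing itself, here’s a toy sketch of what that means – plain Python, nothing N64-specific, with names of my own invention. The point is that every active voice is another full arithmetic pass over the output buffer, and every one of those passes is cycles taken away from game logic and AI:

```python
# Toy software mixer (not actual N64 code): with no sound chip,
# every output sample of every voice is arithmetic the main CPU
# must perform, so more voices means less time for everything else.

def mix_voices(voices, num_samples):
    """Software-mix several (samples, volume) voices into one buffer."""
    out = [0.0] * num_samples
    for samples, volume in voices:      # each extra voice = another full pass
        for i in range(min(num_samples, len(samples))):
            out[i] += samples[i] * volume
    # clamp to the valid range, as fixed-point output hardware would
    return [max(-1.0, min(1.0, s)) for s in out]

voice_a = ([0.5, -0.5, 0.25, -0.25], 1.0)   # e.g. a music channel
voice_b = ([0.1, 0.1, 0.1, 0.1], 0.5)       # e.g. a quiet sound effect
mixed = mix_voices([voice_a, voice_b], 4)
```

Dropping the sample rate or the voice count shrinks that inner loop directly, which is exactly the quality-versus-CPU-time trade-off Factor 5 described.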
We also can’t forget the kind of business logic that keeps the big console-makers in business: sell the machine at a loss, make the money back from game sales. You can only sell the console once per person, but that one console opens up sales for as many games as you can put out. Especially in Nintendo’s case, since (as with the NES) they more or less controlled the manufacture of Nintendo 64 cartridges and would always take a cut of the sales.
Suddenly, $250 (in 1996, mind you) for a 100 MHz machine with Silicon Graphics video hardware doesn’t sound so far-fetched.
But wait a second. What was that bit about HDTV? Let’s look at that again:
With specs like 100 MHz clock speed and HDTV compatibility[…]
HDTV in 1993? “It’s more likely than you think,” a sage advertisement banner once said. Truth be told, there have been a lot of “high definition television” standards over the course of television history, some of them dating back as far as the 1930s, when television was considered nothing more than a novelty. What Ed’s most likely talking about, though, was the so-called “Grand Alliance” that convened not long into 1993, seeking to replace the largely-incompatible NTSC, PAL, and SECAM standards that had been in place for just under half a century. The big problem was not so much getting everybody to agree on a standard as it was allocating bandwidth; in the 1990s, there were so many TV channels and networks that they were fighting over a limited slice of the broadcast spectrum. If I’m not mistaken, this is what gave rise to the popularity of cable and satellite TV providers, whose services could carry a lot more bandwidth than the arguably more popular over-the-air TV signal.
But that’s not exactly the point. HDTV was as much a video display standard as it was a video transmission standard. Keeping transmissions small and quick required some thought about video compression (specifically MPEG-2, according to IEEE Spectrum’s article on it from 1995), especially when the standard called for the ability to transmit “more than two million picture elements” (roughly two megapixels, or what we would nowadays call 1080p). Sure, the actual receiving and display equipment was still in the prototype stage in 1993, but that didn’t mean companies weren’t already interested in designing around it. Hell, to some extent there were even entire video games designed for this new HDTV standard; the earliest known one (last I checked) is Hi-Ten Bomberman, developed jointly by Hudson Soft and NEC, supporting ten players and running on a combination of two PC Engines and an as-yet-unknown computer hardware setup “behind the curtain.” Hi-Ten Bomberman was only ever displayed once, at Hudson’s Summer Carnival ’93, and footage of it is scarce at best. Would Nintendo be setting out to be the first console to natively support HDTV without the hardware-crate of Bomberman? Not likely, when the majority of consumers wouldn’t own a high-definition set until around 2005, when availability became high and prices became much more palatable.
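The “more than two million picture elements” figure checks out against what we now call 1080p, by the way – the arithmetic is quick:

```python
# A full 1920x1080 frame lands just over the two-million mark.
width, height = 1920, 1080
pixels = width * height
print(pixels)              # 2073600
print(pixels > 2_000_000)  # True
```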
Historically speaking, the first video game console to support any part of the HDTV standard was the PlayStation 2, which had a separately-sold component video cable that allowed the machine to output at 720×480, with progressive-scan modes not necessarily supported by every game in the library, and only four PS2 games that would dare support 1080i: Gran Turismo 4, Tourist Trophy, Valkyrie Profile 2: Silmeria, and…uh, Jackass: The Game. Nintendo were a little better at adding progressive-scan support to their GameCube games, with only a very small handful not offering some kind of option for it (Metroid Prime is reputed to look excellent at 480p, assuming you were ever able to find the cable to display it), but it would be Microsoft’s XBOX that would be the first console to commit to “true” high definition: the XBOX’s component video cable allowed it to display at 1280×720 progressive-scan and 1920×1080 interlaced settings, which sounded nice on paper, but in truth, just shy of 50 games ever released for the XBOX would even bother trying to run at 720p (mostly fighting and sports games…and Freedom Fighters), and only around six of those would go all the way up to 1080i (including disappointing movie tie-in Enter The Matrix and disappointing arcade revival Dragon’s Lair 3D).
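For a sense of why 720p and 1080i support stayed so rare, here’s a quick pixel-count comparison of the modes mentioned above – my own arithmetic, counting one interlaced field as half the lines, and ignoring everything else (fill rate, memory bandwidth) that also scales with resolution:

```python
# Pixels the hardware must fill per refresh in each mode; an
# interlaced 1080i field only draws every other line of the frame.
modes = {
    "480p (720x480)": 720 * 480,
    "720p (1280x720)": 1280 * 720,
    "1080i (1920x1080, per field)": 1920 * 540,
}
base = modes["480p (720x480)"]
for name, px in modes.items():
    print(f"{name}: {px} pixels, {px / base:.2f}x the 480p workload")
```

Roughly 2.7× the pixels for 720p and 3× per 1080i field, on hardware balanced around standard definition – small wonder most developers didn’t bother.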
And, depending on who you ask, the video game consoles of 2016 still struggle to run games at a native 1080p…but that’s an argument I’m not interested in getting into.