Apr 15th, 2010, 11:49 AM
Are modern consoles more glitch-prone?
Rob's latest post put the mental wheels in gear: are the newer systems more prone to glitches than the older generation systems, or are we just more aware of them now thanks to the internet? Or, to throw another wrench into the works, is it the complexity of the games themselves that is responsible for the increased number of glitches?
I think you hit it on the head with the second part of your post. As the complexity of games increases, so does the potential for glitches. Imagine the amount of testing that has to go into a game like God of War III on the PS3 versus the amount of testing that went into a game like Pitfall on the Atari 2600. There's obviously a lot more that can break in games that huge.
For the classic-style web games I make with my chums, we usually only need one or two people testing them for glitches. For huge multi-million-dollar games like GoW3, they need big teams of people constantly testing them, and even then glitches still sneak on through, as seen in my latest video:
But there's a law somewhere, if not in physics then in reliability engineering, that says the propensity for a system to break down varies directly with the complexity of that system.
Simply put: more moving parts = more opportunities for things to go wrong.
Which is why my Super Nintendo still works and my first PlayStation does not.
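To put rough numbers on that intuition, here's a minimal sketch; the 1% per-part failure rate is an assumption, purely for illustration. If each of n parts can fail independently with probability p, the chance that something in the whole system fails is 1 - (1 - p)^n, and it climbs fast:

```python
# Minimal sketch of the "more parts = more chances to fail" intuition.
# The 1% per-part failure rate is an assumption, for illustration only.

def system_failure_prob(n_parts, p_each=0.01):
    # P(at least one part fails) = 1 - P(every single part survives)
    return 1 - (1 - p_each) ** n_parts

for n in (10, 100, 1000):
    print(f"{n:4d} parts -> {system_failure_prob(n):.3f}")
# 10 parts -> 0.096, 100 parts -> 0.634, 1000 parts -> 1.000
```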
Some would say the same rationale can be applied to software, hence all the glitchiness. Not that I didn't have my fair share of glitches on the Game Boy. But those games never had many problems with animations, missing textures, or object collision, did they?
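For what it's worth, a lot of those object-collision glitches come down to one classic failure mode: an object moving so fast it skips past a thin wall between frames. Here's a toy sketch of it (the wall position, thickness, and speed are all invented for illustration):

```python
# Toy sketch of "tunneling", a classic collision glitch: the object is only
# tested at its new position each frame, so a fast mover skips a thin wall.

WALL_LEFT, WALL_RIGHT = 10.0, 10.5    # a thin wall

def inside_wall(x):
    return WALL_LEFT <= x <= WALL_RIGHT

x, speed = 0.0, 3.0                   # 3 units per frame
for frame in range(10):
    x += speed                        # teleport to the new position...
    if inside_wall(x):                # ...then check: 9.0 -> 12.0 never lands inside
        print(f"hit the wall on frame {frame}")
        break
else:
    print("walked straight through the wall")   # the glitch
```

The standard fix is a swept test that checks the whole path traveled each frame instead of just the endpoint, but that costs more CPU, which is exactly the kind of corner that gets cut.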
I was talking about this somewhere earlier, but I forget where. As games get more complex and the lifespan of a console shortens between generations, we're starting to see the negative effects of two-year-plus development cycles. This stuff has to be shoved out the door with whatever number of flaws is deemed acceptable if they want to keep the dollars flowing. I'd expect this to be the norm from here on out. It kind of makes console gaming more of a bummer, but I've mostly given up on it anyway.
The games may be graphically more complicated, but a lot of how things are done hasn't changed. Where once a line of code told the system to display, say, a red square, it can now tell it to display a predetermined item that is far more complex, like an object that gets adjusted based on lighting, distance, and orientation to the viewer.
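To make that concrete, here's a hedged sketch (the function and numbers are mine, not any real engine's code) of the kind of per-point math "display something" now implies: brightness that depends on the surface's orientation to the light and its distance from it, instead of a flat "draw red here".

```python
import math

# Sketch of simple diffuse (Lambertian) shading with distance falloff --
# illustrative only, not any real engine's code. Every extra input here
# (surface position, normal, light position) is one more thing to get wrong.

def shade(surface_pos, normal, light_pos, light_intensity=1.0):
    # Vector from the surface point to the light, and its length (distance).
    to_light = [l - s for l, s in zip(light_pos, surface_pos)]
    dist = math.sqrt(sum(c * c for c in to_light))
    to_light = [c / dist for c in to_light]
    # Orientation: how squarely the surface faces the light (0 = facing away).
    facing = max(0.0, sum(n * c for n, c in zip(normal, to_light)))
    return light_intensity * facing / (dist * dist)

# The old "red square" era was effectively just: brightness = 1.0
print(round(shade((0, 0, 0), (0, 0, 1), (0, 1, 2)), 3))  # 0.179
```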
I think it's more a matter of moving parts and laziness.
The moving parts wear out. Now that technology is pushing so much through a circuit, you have to cool those components. Fans wear out, and heat breaks down the epoxy and plastic that everything is built on. Let's not forget the spinning discs. My Atari had ZERO moving parts, and over thirty years later I can still play Yars' Revenge. The same holds true for my Nintendo, my Genesis... all the way up to the Nintendo 64.
The PlayStation is where things started to go south, with the introduction of spinning parts. It only got worse when they added cooling fans, and it has reached its peak with online capabilities.
Back when cartridges were sold, a company knew that whatever mistakes it missed would be eternal. They also saw how that could make or break their ability to sell the next product (E.T., anyone?). Now that they can patch a game by forcing you to log on, they don't care about quality control so much. They know that a game like God of War III can be fixed in the next system or software patch. I think this is what has led to some shoddy programming, missed code that can cause a system crash, and general mayhem.
Call me nostalgic, but I miss the days when I could fall asleep with my Nintendo on, not notice the red light for several days, then turn on the TV and start playing again.