Friday, September 2, 2016

Frame Rate Issues: 30 fps vs. 60 fps

There are plenty of long-running debates within the gaming community. If you are familiar with gaming, you are probably well aware of the following:

  1. Console Wars
  2. Console vs. PC
  3. Who is Best Pony
  4. PC Master Race
  5. Single Player vs. Multiplayer
  6. Frame Rate vs. Graphics
  7. Long Games vs. Short Games


Pretty much all of those discussions are common among gamers and have been around for a while. Except for number 3; that one was just a joke to see if you were paying attention. One of these discussions, Frame Rate vs. Graphics, along with frame rates in general, is the topic of today's post.

Now, some have argued in defense of the 30 fps frame rate that a lot of console games tend to get, but so far, the only real attempts at a defense I have seen consist of either accusing someone of being a console-hating PC elitist or weakly insisting that they prefer graphics and will accept half the ideal frame rate because they want things to look pretty.

On the other hand, I've seen many more defenses of 60 fps frame rates: there's less input latency, animations look less jerky and generally look great in motion, jerky cameras aren't a thing, motion blur is much less of a necessity, it works better with fast-paced games that reward twitch reflexes, and it generally just feels good to play.
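
To put rough numbers on the latency point, here's a quick back-of-the-envelope sketch in plain C++ (my own illustration, not taken from any engine). At 30 fps a frame occupies about 33.3 ms, at 60 fps about 16.7 ms, so the game can read your input and show the result roughly twice as often. It deliberately ignores display lag, v-sync, and engine pipelining, so treat it as a simplification rather than a real latency measurement.

    #include <cstdio>

    int main() {
        // How long a single frame stays on screen at each frame rate.
        const double ms30 = 1000.0 / 30.0; // ~33.3 ms per frame
        const double ms60 = 1000.0 / 60.0; // ~16.7 ms per frame

        // Rough worst case for a button press that lands just after a frame
        // begins: it waits one frame to be read and another to be displayed.
        std::printf("30 fps: %.1f ms per frame, ~%.1f ms worst-case wait\n",
                    ms30, 2.0 * ms30);
        std::printf("60 fps: %.1f ms per frame, ~%.1f ms worst-case wait\n",
                    ms60, 2.0 * ms60);
        return 0;
    }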

Now, prior to the PS3, Xbox 360, and Wii generation, also known as the seventh console generation, pretty much every game you could think of had a 60 fps frame rate. On the PS2, GameCube, Xbox, and Dreamcast, you could list almost any game and you'd be pretty much guaranteed 60 fps. Other than the PAL regions, which generally got 50 fps ports because of the 50 Hz TVs of the time, the only PS2 game I can think of that did not run consistently at 60 fps was Shadow of the Colossus and, to be fair, it had a pretty ambitious goal of letting you experience what it'd be like to climb and kill giant colossi, so it had an excuse for running erratically.

Upon the release of the PS3, however, games started shifting toward 30 fps frame rates because Sony pushed the graphics angle on a device that really couldn't manage it that well. This, incidentally, also raised the budgets of AAA console games, since graphics of that quality are much more expensive to produce than graphics that let you maintain a steady 60 fps. The same shift happened on the Xbox 360 and, from there, only a few very specific developers were still producing games the way they would have for a PS2.

Pretty much the only exception to this rule was Nintendo and the developers of Wii games, but I don't think that was due to good business sense; I think it was due to necessity. You see, for those of you who don't know, which is probably like 5 or 6 people in the entire world, the Wii's major gimmick was motion controls. The man who pioneered that technology first tried to pitch it for military use but was rejected because they thought it wouldn't work. Microsoft rejected it outright as well, and Sony rejected it after asking how much it would cost to produce such a device.

Nintendo decided to take this man up on his offer, but the technology proved so expensive that they had to limit the Wii's overall power to only about double that of the GameCube so costs didn't balloon beyond recovery. This meant that Nintendo, and developers for the Wii, could not push the graphics angle, because the Wii was so much weaker than the PS3 and Xbox 360. As a result, many developers had to rely on other things to keep their games afloat.

Motion sensitivity was a big part of that, but performance was as well. Even so, there really weren't that many console games released during that generation that ran at 60 fps and looked good.

Devil May Cry 4 is an example of a game that looks better than a lot of games released years after it and still runs at a solid 60 fps at all times. By comparison, the Ninja Theory reboot, DmC: Devil May Cry, had a 30 fps frame rate on consoles and generally didn't look as good as DMC4 did.

Many have argued over why this is, and one argument I've seen concerns the engines used to make these games. DMC4 was made in Capcom's MT Framework, whereas DmC was made in Unreal Engine 3. DmC had problems with its PS3 port in particular because Unreal Engine 3 was not designed for that architecture; it was designed around standard PCs, which is why the game generally ran better, and looked better, on Xbox 360 than on PS3. By comparison, DMC4 had a much better PS3 port than Xbox 360 port, and though both games were clearly best suited to PC, the engines do serve as a starting point for why two games in the same series can end up radically different in overall quality.

One neutral defense I've seen is simply that some players don't care whether a game looks good or runs well as long as the gameplay is good. That's not a terrible argument; God Hand looks pretty standard for a PS2 game and runs at 30 fps, but it has great gameplay. Still, it highlights another problem with arguing for graphics over performance: a higher frame rate will almost always result in better, or at least smoother, gameplay, while prettier graphics are entirely disconnected from gameplay right up until the point that they start causing serious lag.

This is why the defense of graphics perplexes me a little. Of course no one wants the games they play to hurt their eyes, but games have proven time and again that they can be good without being notable in the graphics department, with God Hand again as my example, while an overly graphics-intensive game will have problems of its own, and I will cite The Last of Us as my example. No, I'm not going with The Order: 1886, because that target has been beaten to death and is way too easy anyway.

God Hand, in terms of graphics, is comparable to other games of the time with similar art styles. In fact, some might even say it doesn't hold up as well as others do. If we compare God Hand's graphics to those of, say, Devil May Cry 3, there's a stark difference in how good those games look. God Hand has slightly better facial animation than DMC3, but DMC3 looked better overall. That said, God Hand is still a game worthy of its spot within the top thousand of all time.

Compare that to The Last of Us, which is probably one of the best-looking games on the PS3, to the point that even some PC gamers were astonished by how good it looked. And yet, if it weren't for that, and the fact that its story was well presented, if not outright good, it would largely have been disregarded as an overly scripted game with terrible AI. Of course, The Last of Us is much better in the PS4 remaster because it actually runs at 60 fps, but the fact that it needs that frame rate boost to be more than tolerable in terms of gameplay shows just how much of a downgrade the gameplay of The Last of Us and, by extension, Uncharted is from Jak and Daxter, the earlier franchise from the same developer, Naughty Dog.

I will defend the Jak and Daxter games to hell and back because, whatever questionable decisions were made during the development of each of those games, at least they all functioned the way they were supposed to. Uncharted and The Last of Us, though, only function when you follow their scripts. But we're getting off topic, so I'll revisit this at a later date when I have collected more thoughts on the matter.

I have also seen some people make the argument "It depends," by which they mean "If it's a racing game or a fighting game, 60 fps all the time, but if it's another genre, 30 fps and make it look better." I believe this is a bit short-sighted. Devil May Cry 3 has great combat, as do the Ninja Gaiden games, but they probably wouldn't have worked as well if they ran at 30 fps. In fact, within Ninja Gaiden there's even proof of this: Ninja Gaiden 3 versus its re-release, Ninja Gaiden 3: Razor's Edge.

NG3 ran at 30 fps and NG3:RE ran at 60, and it is astonishing just how many problems that boost in frame rate fixes. In particular, the Kunai Wall Climbing and Steel on Bone mechanics were largely meh in NG3 because they barely functioned at all. Kunai Wall Climbing had you hold each of the shoulder buttons one at a time to climb up a wall, but sometimes holding the button wouldn't cause Ryu to climb, and if you let go too early, Ryu would even start to slide back down. In Razor's Edge, though, Ryu climbs so quickly and responsively that he could beat Gordon Freeman up a ladder without actually needing a ladder. Steel on Bone required you to mash the triangle button on connecting with an enemy in order to cut them up. In NG3, this meant pressing a button over and over in a way that looked and felt like a QTE (Quick Time Event). In Razor's Edge, however, doing the same thing lets you use Steel on Bone to not only insta-kill a single enemy but also chain it into other enemies until you've cleared the entire screen. Mindless and not particularly skill-intensive? You could make that argument. Great for crowd control? Absolutely.

Another example of how much a game is elevated by a higher frame rate is Bayonetta. The first one, not Bayonetta 2. I'm going with the first Bayonetta here simply because it has three different ports: PS3, Xbox 360, and PC, whereas Bayonetta 2 pretty much only exists on Wii U. Now, PlatinumGames has gone on record stating that Bayonetta's PS3 port was entirely an afterthought that they didn't even plan on making in the first place. Originally, they wanted to make it exclusive to the Xbox 360 and make a PC port later. The only reason they made the PS3 port at all was because Sega told them to, and it shows: the PS3 port single-handedly takes the title of worst version of this installment. A lot of people who played Bayonetta on PS3 found it underwhelming in a lot of ways, whereas the people who played it on PC and Xbox 360 loved the hell out of it, which goes to show how different the ports were.

Now, I'm not entirely certain what changed between the PS3 and Xbox 360 versions, but given how quickly they scrounged the port together, and how bad a job they did, I imagine there were no real differences in story or content. As a result, I can only assume that the problems exclusive to the PS3 came down to the frame rate. Other problems, like attacks that aren't crunchy or fast enough, or bosses that are generally unmemorable, would probably be much the same across all platforms, unless the frame rate made those even worse.

So that raises the question, "If high frame rates make games so much better, why do developers make games at 30 fps nowadays?" The answer is twofold. First, if they cap the frame rate at 30 fps, then more often than not they can animate their games at 30 fps, which is generally less expensive than animating at 60 fps. This is a problem that has largely been sidestepped by progress in programming, game animation, and game engines like Unreal Engine 4, where a small amount of code can make an animation play at the rate it's supposed to, independent of the frame rate.
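
To make that concrete, here is a minimal sketch of the usual delta-time approach in plain C++. This is my own illustration, not actual Unreal Engine code, and AnimationState and UpdateAnimation are made-up names: the point is simply to advance an animation by the real time that has elapsed rather than by a fixed amount per frame, so the clip plays at the same speed whether the game renders at 30, 60, or 144 fps.

    #include <chrono>

    // Hypothetical animation state: a playback cursor within a looping clip.
    struct AnimationState {
        double time = 0.0;     // current position in the clip, in seconds
        double duration = 2.0; // clip length in seconds
    };

    // Advance the clip by real elapsed time, not by "one frame's worth".
    // The clip takes `duration` seconds to loop regardless of frame rate.
    void UpdateAnimation(AnimationState& anim, double deltaSeconds) {
        anim.time += deltaSeconds;
        while (anim.time >= anim.duration)
            anim.time -= anim.duration; // wrap around to loop the clip
    }

    int main() {
        AnimationState anim;
        auto previous = std::chrono::steady_clock::now();

        for (int frame = 0; frame < 1000; ++frame) { // stand-in for the game loop
            auto now = std::chrono::steady_clock::now();
            double deltaSeconds =
                std::chrono::duration<double>(now - previous).count();
            previous = now;

            UpdateAnimation(anim, deltaSeconds);
            // ... sample the clip at anim.time and render here ...
        }
        return 0;
    }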

The second reason is more problematic: it is easier for developers to rake in higher review scores and sell their games in general if a game looks pretty than if it runs well. Most reviewers will talk about a game's graphics, to the point that some have a category dedicated to graphics in their templates, but, save for a few independent reviewers, many won't even bring up the frame rate unless it dips below 30 fps at some point. It simply isn't attractive to say, "This game doesn't look as good as others because it's trying to perform properly." Some people, particularly people who care about performance and gameplay, will not buy a game until they hear the gameplay is good. The casual consumer, however, particularly parents buying games for their children, will, on average, buy a game based on how good the screenshots on the back of the box look.

So, one final question before we wrap this up: what can be done about it? Well, probably nothing. The rise of indie gaming will probably have some impact on the frame rate versus graphics debate on consoles and PC by providing games whose performance and gameplay are unique, polished, and fun, because those teams are not restricted by publishers. Likewise, Platinum seems to be able to get away with pushing for 60 fps without too many issues. But most AAA developers, even ones like Naughty Dog who are allowed to do whatever they want with the games they make, will largely choose graphics over frame rate because they need their games to sell, probably even more than the publishers do. While the publishers stand to gain more money from a game they publish, if a development team doesn't push out enough games that sell well, they may lose their jobs and have their team disbanded. Publishers face this risk as well, every business does, but the only publisher I can remember going out of business in recent times because of poor sales was THQ, and that was because they were so irresponsible with the IPs they owned that they spent far more money than they could make back and backed themselves into a corner. EA, Konami, Capcom, Activision, and especially Nintendo, Sony, and Microsoft aren't at nearly as much risk if one or two games don't make a profit.

That's all for today. Thanks for reading my incoherent rambling and I hope to see you next time.
