Friday, January 25, 2019

After-Thoughts Rant: Graphics are NOT Important

Today's a rant because I feel like it. The topic lead-in: strawmen are everywhere. Tell me something: have you ever stated your position on a topic, explained it to the best of your ability, only for your opponent to go on a tirade about how wrong you are and link a video they found to prove it, and then you watch it and realize the video doesn't even address your stance in any notable way?

To put this into context, I was using my search engine of choice to look up how to make PS2-era environmental textures, and I found a thread on a Square Enix forum about Final Fantasy XIV that used the keyword "PS2." Everyone was complaining about how bad the textures were when, firstly, the textures seemed fine to me in the images they posted, and secondly, the people in favor of improved visuals were not in any way altruistic and were, in fact, entirely selfish. The final post in that thread was from someone who, not to assume their age, seemed very immature, explaining that graphics are necessary for games to be presentable, and he linked a video explaining why.

Granted, I didn't watch the whole video because it was almost an hour long, and the video itself seemed more interested in discussing the history of graphics than in debunking anyone's stance on graphics, which is fine. Then I went into the comments section and noticed two things. Firstly, the only people who seemed to like the video were the ones who already agreed with its stance. Secondly, the stance everyone seemed to take was based on the quote at the very beginning of the video: "People say graphics aren't important, but every game I've ever played has had graphics."

I take two issues with this quote. The first is that it seems to suggest that text-based games never existed, which is a bit reductive to the discussion, but ultimately it's a nitpick and I'll let it go. The second issue I have, however, is that it misunderstands why people like myself take the position that graphics aren't all that important. I'll address some of the points I read in that forum thread, but first I want to tackle this thesis statement that everyone seems to want to lean on.

That is, they all seem to believe our stance is that graphics could be removed from a game entirely and nothing would change about it. Obviously no one's saying that, but no one seems to understand what the point actually is. The point is that graphics have progressed so much since the inception of video games, and 3D games especially, that meeting the extremely high standards graphics are held to simply isn't impressive anymore.

Keep in mind, my stance on any given technology is about what that technology adds in terms of potential for games development. Pong was a very important game in the history of the medium because it showed what could be possible if hardware developed. Color displays were important to gaming history because of how color coding could affect the overall design of a game and how that might break some new ground. 3D graphics were important to gaming history because of what the third axis provided in terms of potential for game design, such as exploration, combat, puzzle solving, and even world building.

However, as time has passed, milestones like these have diminished significantly. The Wii introduced motion controls, and that idea is being extended into VR as of this moment, but that is probably the last milestone we're approaching in terms of overall technology.

In terms of graphics and visual design, though, there's not a lot that can be done with modern hardware and design capabilities that couldn't be achieved in some form on the PlayStation 2. Sure, texture sizes have gotten larger, polygon counts have risen, the number of maps you can work with is greater, and the tools for making a high-quality game at a low price are far more accessible, but in terms of what you want to achieve in the game, the PS2 did it all first.

This is the first issue with the counterpoints I came across. Everyone wants to increase graphical fidelity, but why? It will increase the amount of money that needs to be spent on the game if you want everything else to be up to par; it won't make any in-game worlds more believable; it won't do anything for areas of equal or greater importance like sound design or gameplay; and even if you get the highest-end graphics possible, you run the risk of making it so that almost no one can play the game.

This brings me to the second argument I came across: the idea that developers should abandon lower-end hardware and focus exclusively on higher-end hardware that can handle high-end graphics. Of course, this person was given the counterargument that doing so would shrink the pool of people who could actually play the game, but his response was simply, "If their hardware can't support it, they have other things to worry about besides gaming."

While factually correct to an extent, this statement makes it clear just how self-centered this person is. I'm not even joking: their entire argument is based on the best experience for themselves, and when presented with the idea that it wouldn't be great for everyone, their response was essentially "Not my problem."

That said, selfishness in and of itself isn't always a bad argument, so instead I'm going to address why it's not a good idea for game designers to be as dismissive as this person is. The argument they presented is that improving the graphics would increase the total number of people who would play the game. The problem is that this argument and the call to abandon lower-end hardware are completely at odds with one another.

While it is true that if everyone is operating on the same hardware, the best experience for one gamer would likely be the best for all of them, in practice this is almost never the case. Building a game solely for high-end hardware excludes people who absolutely would like to play it but do not have access to that hardware, whether because of finances or some other circumstance.

Also, just as an aside, the statement that their low-end hardware is their issue to deal with is incredibly offensive to me, because it carries the implication that people who make less money, or who have less access to higher-end technology, are inherently unworthy of a good gaming experience.

The thing is, people in low or lower-middle-class financial situations still need entertainment. They still need to be able to release their stress like anyone else. And stress, as Jim Sterling has said, isn't just a metaphorical killer; it's a literal one. Stress of a great enough degree can cause harm of all kinds, whether it's some kind of injury, a psychological breakdown, an illness, or even a chemical imbalance like depression. If someone doesn't have access to something that can lower their stress, they are more likely to drive themselves into ruin and, on a more selfish note, potentially harm anyone who's around when it happens. To dismiss such an outcome really shows just how little this person values other people.

But to address the point: while it is true that higher-end hardware on the part of the player increases the total number of games they'll be able to play, the reverse is the case for game developers. The lower the hardware that can support the game, the more people can afford to buy in.

Although this person is very dismissive of console lines, I want to use an analogy I came up with for one of my friends earlier this week: "Buying a game that is available on your device, even at double the price of the game you actually want to play, is most likely still less expensive than buying that game along with a device that can support it." Nowhere is this more true than the PC market.

Despite what you may believe, there are many PC gamers out there who cannot afford the highest-end hardware and are, in fact, running machines weaker than a PS4. These people don't have the hardware to play a lot of games, so instead they head for games their machines can handle, usually of the indie variety. However, if a really popular, high-quality game comes out and runs on one of those lower-end machines, it takes nothing away from those with stronger hardware, but it does give those with lower-end hardware a chance to be part of the discussion. The reverse, however, is not the case.

Now, of course, if you have a machine that's weaker than a PS4, you're probably not playing Final Fantasy XIV or other MMORPGs like it. However, increasing the overall graphical fidelity for the machines that can support it comes with the risk of locking out those who had strong enough hardware to play before the upgrade but don't after it.

Okay, so you know how some people like to talk about PC gamers as if they're spoiled children, like I have in the past? Well, this entire discussion and the points I'm addressing are sort of indicative of why people hold that idea.

Many PC gamers, especially those with high-end hardware, come from very well-off families or otherwise have a lot of money to spend. To put this in perspective, the strongest gaming PC that I want to get for myself at some point costs no less than $37,000, which is more than a large majority of people make in a year, and more than a sizable percentage of them make in several years. I'm not saying every PC gamer is running a machine that expensive, but suffice it to say, you aren't the kind of person who talks about super-high-end hardware unless you have the money or connections that give you access to it.

On that point, when they go on to say that higher-end graphics would improve sales, they're speaking under the impression that everybody has the money to drop on that level of hardware. And when faced with the idea that they don't, responses like "whatever" or "not my problem" make it pretty clear that these kinds of people have never considered that if the developers of their favorite online or even single-player games gave in to their demands, there is a very real chance costs would rise so much that those developers would stop making games and those online games would be shut down permanently.

As for my stance on what is the most important aspect of a video game, that's obviously the gameplay. The thing is, of all the "if you're only in it for this one thing, go to this other thing" arguments people were making in that thread, gameplay is the only area where that argument can't be applied. Just to demonstrate, let's use that strawman in a variety of ways right now.

If you only want art design, go to an art museum.
If you only want story, read a book.
If you only want sound design, download some MP3s of sound effects.
If you only want music, go find some on iTunes or SoundCloud.
If you only want animation, watch an animated short film.
If you only want voice-overs, listen to an audiobook.

Obviously, media like games, movies, and TV will want combinations of these things. However, of everything just listed, gameplay is the one aspect of a video game that you cannot get anywhere else. It doesn't matter if you're playing on PS2 or PS4, Nintendo or SEGA, Atari or Microsoft, PC or Mac; the gameplay is what separates games from everything else. After all, nobody looks at a movie and says, "This gameplay is way better than an actual video game." And if they did, I'd like to meet them so I can study how their minds work.

Finally, I want to address one last argument, because I don't quite understand why it was made in the first place. When it was brought to this person's attention that no one was able to play Final Fantasy XIV on its initial release, they argued that it was because of poor optimization, not visuals. Like a lot of this person's arguments, it makes sense on a surface level, but when you dig deeper, it makes you wonder how much he truly understands about game development. In this specific case, the underlying assumption is that visuals and optimization are somehow unrelated.

For those who don't quite get that, optimization is the process of taking existing software and making it run flawlessly on the lowest-end hardware it's targeted for. Usually, if a game is poorly optimized, it's because something broke somewhere in the interaction between the game's code and the device it's running on.

For example, if a game made for the highest-end PC runs at 60 fps there but at 30 fps on a device with about half the power, that's simply because the weaker hardware isn't strong enough to keep up. In the reverse case, where it runs at 30 fps on the highest-end hardware and 60 fps on a device with half the power, usually something is going wrong on the higher-end machine: the operating system isn't built for that game, the drivers needed to support it aren't up to date, or certain code libraries the game needs aren't present on the machine for some reason.
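To make that first case concrete, here's a toy back-of-the-envelope model (my own sketch in Python, not anything from the thread or the video) that assumes frame time scales inversely with relative hardware power. Real performance depends on bottlenecks, drivers, and the engine itself, so treat it as illustration only.

    # Toy model only: assumes frame time scales inversely with relative
    # hardware power; real games bottleneck on CPU, GPU, memory, and drivers.
    def estimated_fps(base_fps: float, relative_power: float) -> float:
        # base_fps: frame rate on the reference (highest-end) machine.
        # relative_power: target machine's power as a fraction of the reference.
        frame_time_ms = 1000.0 / base_fps                # per-frame time on the reference
        scaled_time_ms = frame_time_ms / relative_power  # weaker hardware takes longer per frame
        return 1000.0 / scaled_time_ms

    print(estimated_fps(60, 1.0))  # 60.0 fps on the highest-end PC
    print(estimated_fps(60, 0.5))  # 30.0 fps on a machine with half the power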

Finally, in a third case, where the game is built specifically for the highest-end hardware and the drivers on that hardware, running at 60 fps there, and the other machine has neither, the game likely won't run at all. Either it'll crash immediately or, if it opens, it'll never finish loading.

The major thing is that stronger hardware usually fixes a lot of issues with some facets of the design. If the game is poorly coded, that's one thing. However, when problems with specific engine functionality are present, like a post-process material that doesn't show up immediately or visual effects that aren't behaving the way the developers thought they would, improving the hardware's power or swapping out a faulty part will usually solve it.

The problem here, again, is that not every gamer, or even every PC user, honestly has the money to do something like that, and developers expecting people to keep up with their game rather than accommodating those who can't are simply begging for lost sales and possibly even financial debt.

To understand why this is such an issue, though, you have to look at what every part of a gaming device or PC does. The short answer: RAM holds all the assets and code currently in use so that the relevant part of the software can operate properly; the GPU, or graphics card, draws everything to the screen; the hard drive holds all the data in storage for later use; and the CPU tells everything what it's supposed to do. While this is all fine and dandy, the problem is a fifth part that usually goes unmentioned in discussions like this: the motherboard.

The motherboard is the piece of the device that connects every single part together and distributes electricity from one part to another. So what does this have to do with the discussion? The thing is, not every motherboard is built to support every single PC part.

Firstly, every motherboard has a different configuration of slots. While most motherboards these days have PCIe slots, RAM slots, a connection for the hard drive, and a socket for the processor, many of these slots are limited in terms of either what they support or how much power they can deliver.

For example, let's say your board has 8 RAM slots but supports a maximum of 64 GB of RAM. That means the largest stick each slot can support is 8 GB. This isn't even factoring in differences in RAM types. While most desktop motherboards won't be using old SDRAM, many of them differ in the type and speed of RAM they accept. For example, some motherboards will only take DDR3, some will only take DDR4, some will take both, and some will only take certain speeds of either, like 1333 MHz or 2666 MHz, or something to that effect.
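Just to sketch that arithmetic in Python (a hypothetical board using the numbers from my example above; real boards publish their per-slot maximum and supported-speed list in the spec sheet):

    # Hypothetical board from the example above: 8 slots, 64 GB ceiling, DDR4 only.
    SLOTS = 8
    MAX_TOTAL_GB = 64
    SUPPORTED_TYPE = "DDR4"
    SUPPORTED_SPEEDS_MHZ = {2133, 2400, 2666}

    PER_SLOT_GB = MAX_TOTAL_GB // SLOTS  # 64 GB / 8 slots = 8 GB per slot at most

    def stick_is_compatible(ram_type: str, speed_mhz: int, size_gb: int) -> bool:
        # Check one RAM stick against this hypothetical board's limits.
        return (ram_type == SUPPORTED_TYPE
                and speed_mhz in SUPPORTED_SPEEDS_MHZ
                and size_gb <= PER_SLOT_GB)

    print(stick_is_compatible("DDR4", 2666, 8))  # True
    print(stick_is_compatible("DDR3", 1333, 8))  # False: wrong type for this board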

To my knowledge, motherboards don't seem to have a hard limit on dedicated graphics RAM as long as the power draw stays within a certain amount. And even the wattage isn't a hard limit; while a GPU with 16 GB of graphics RAM will run worse on a machine that can't handle its overall power draw than on one that can, in general there doesn't seem to be a hard cap on performance. That said, most motherboards will only take GPUs from certain families and manufacturers.

Without going any further into detail, I'm sure you understand at this point that even with PCs' upgrade potential, there's a limit on how far you can upgrade the machine you have part by part before you need a new device entirely, because if you need to replace the motherboard, you very likely can't keep all of the parts in your current machine.

With this in mind, it's much more reliable for game developers to make their game suitable for the lowest-end hardware they can find. Especially when, if what other people in that thread were saying is true, better optimization wouldn't have helped the game's success much anyway.

I would address his argument that the initial Final Fantasy XIV failed because of bad gameplay, but honestly, that's moving the goalposts, and it's in line with what I've been saying so far anyway.

Okay, that's everything I wanted to say. Here are the links to what I'm talking about in the event you want to verify anything I've said thus far.

Why do the textures look like PS2 games?

A Brief History of Graphics

Have a wonderful evening and a wonderful life.

Addendum: Okay, so I fully watched the A Brief History of Graphics video, just in case he made a point that I missed. And, lo and behold, my initial assumption was correct. To be quite honest, I don't think he makes a single argument in the entire video, and the one thing that can be construed as an argument, if you squint hard enough, doesn't address what I actually say in this post. So once again, that person in the FFXIV thread was completely wrong.
