@xuxxun@beehaw.org

I care about the story of the game and general enjoyment. As long as I can see and understand what is going on on my screen, I do not care that much about the graphics. I am fine with playing old games with potato graphics. Also, I know we could have both, but if studios had to choose between more accessibility options for more demographics and better graphics, I wish they would always put the resources into accessibility.

alfisya

RX550+768p gang, rise up!🙌

@Doods@infosec.pub (creator)

It’s actually a Powercolor low-profile one that I “overclocked” to a normal 550’s frequency; it never passed 70°C, though.

Dizzy Devil Ducky

My stance on graphics: compared to a game like Fallout: New Vegas, today’s realistic graphics don’t look noticeably better to me unless I actively pay attention to them (which I don’t do while playing a game).

Then again, I usually don’t play triple AAA studio games anymore, so my perception is skewed by either older games or ones with a lot more stylized graphics.

Triple AAA

Ah yes, AAAAAAAAA

I feel that sometimes realistic graphics are what a game needs - like some simulators or horror titles going for that form of immersion. We’re not quite over the uncanny valley in AAA titles, and pushing the boundaries of real-time rendering technology can lead to improvements in the efficiency of existing processes.

Other times, pushing the boundaries of realism can lead to new game mechanics. Take Counter-Strike 2 and their smoke grenades: they look more realistic and shooting through them disturbs only a portion of the cloud.
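Valve hasn’t published the implementation in detail, but the mechanic is usually described as a volumetric (voxelized) smoke density field that bullets locally carve out, rather than a flat billboard sprite. A toy sketch of that idea - every name and number here is illustrative, not CS2’s actual code:

```python
import numpy as np

def carve_bullet_path(smoke, origin, direction, radius=1.5, step=0.5):
    """Clear smoke density in a tube around a bullet's path through a voxel grid.

    smoke: 3D numpy array of densities in [0, 1], modified in place
    origin, direction: ray start and direction in voxel coordinates
    """
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    pos = np.asarray(origin, dtype=float)

    # Precompute the coordinate of every voxel once.
    zs, ys, xs = np.indices(smoke.shape)
    coords = np.stack([zs, ys, xs], axis=-1).astype(float)

    # March along the ray, hollowing out voxels near each sample point.
    t, max_t = 0.0, float(np.linalg.norm(smoke.shape))
    while t < max_t:
        point = pos + t * direction
        dist = np.linalg.norm(coords - point, axis=-1)
        smoke[dist < radius] = 0.0  # bullet displaces only nearby smoke
        t += step
    return smoke

# A 16^3 smoke cloud; a shot through the middle clears a narrow channel
# while the rest of the cloud stays intact.
cloud = np.ones((16, 16, 16))
carve_bullet_path(cloud, origin=(8, 8, 0), direction=(0, 0, 1))
```

A real engine would also re-fill the carved channel over time and light the volume properly; the point is just that a density field makes “disturbing only a portion of the cloud” a cheap local write.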

I do miss working mirrors in games, though.

Space Sloth

So I’m an environmental criminal just because I enjoy graphics and have an expensive hobby?

You’re hardly responsible for the actions and decisions of several large companies. Your choice to buy a fancy card or run a powerful rig is not even a drop in the bucket compared to the actions of companies that shoehorn in the tech even when it isn’t wanted, just so they can sell more.

I like seeing advances in graphics technology, but if the cost is a 10-year dev cycle and the game still comes out s-s-s-stuttering on high-end PCs and current-gen consoles, then scale back some.

I think we’ve hit a point where it’s just not feasible to keep doing it.

“I want shorter games with worse graphics made by people who are paid more to work less and I’m not kidding”

I saw that episode. Can’t disagree.

EDIT: Never mind, I thought that was a sarcastic comment mocking the other user.


And what’s wrong with that, exactly? Would you prefer broken games made by underpaid and overworked people?

As for “worse graphics”: AC: Unity came out in 2014, The Witcher 3 in 2015, and Arkham Knight is also from 2015. All of those have technically worse graphics, but they don’t look much different from modern games that need much beefier systems to run.

And here’s AC: Unity compared to a much more modern game.

It’s from a tweet. It’s earnest. You can google the quote to get more context.

I’m pretty sure that’s in support of the concept.

Ah, my bad. I didn’t even realize it was a known quote; I just thought it was a sarcastic reply making fun of the other user.

@JoshNautes@lemmy.ml

You picked the absolute best examples of their respective years while picking the absolute worst example of the current year; that makes the comparison a bit partial, doesn’t it? Why not compare them to Final Fantasy XVI, or one of the remakes like Dead Space or Resident Evil 4? Or pick the worst example of previous years, like Rambo: The Video Game (2014)?

While good graphics don’t make a good game, better hardware lets devs spend less time making better graphics. Two of the three examples you gave have static lighting (ACU and BAK), while the bad example you gave has dynamic lighting. Baking static lighting into the map is a hugely time-consuming part of making a game; I can assure you from second-hand experience that at least one of those two games had to compromise gameplay because they couldn’t change the map after the light-baking was done. And I’m just scratching the surface of the things that are time-consuming when making graphics as good as the games you mentioned. As an example, you have the infamous “squeezing through the gap” cutscene that a lot of AAA games of the last generation had, because it allowed the game to load the next area. That was time spent choosing the best moments to do it, recording the scenes, scripting, testing, etc. All because the hardware was a limiting factor. Now that consumers have better hardware, that isn’t a problem anymore - but consumers had to upgrade to allow it. The same was true of a lot of other techniques, like tessellation and GPU particles: consumers all had to upgrade to let devs make the game prettier at less cost. And it will also be true of ray tracing and Nanite; both cut a LOT of dev time while making the game prettier, but they require consumers to upgrade their hardware. Graphics are not just about looks; they’re also about saving dev time, which makes even the worst-looking graphics better.

If Rambo: The Video Game (2014) were made with the tech of today, it would look much better while costing the devs the same amount of time. Please don’t take my comment as a critique; I’m just trying to make you understand that not everything is black and white, especially in something as complex as AAA development.
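To make the baking trade-off concrete, here is a minimal, hypothetical sketch (plain Lambert diffuse shading, nothing engine-specific): baked lighting is computed once offline and stored with the map, so it goes stale the moment the level geometry or lights change, while dynamic lighting recomputes per frame at runtime cost.

```python
import numpy as np

def lambert(normals, light_dir):
    """Diffuse lighting: brightness = max(0, N . L) per surface point."""
    light_dir = np.asarray(light_dir, dtype=float)
    light_dir /= np.linalg.norm(light_dir)
    return np.clip(normals @ light_dir, 0.0, None)

# "Baking": compute lighting once, offline, and store it as a lightmap.
normals = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype=float)  # 3 surface points
baked_lightmap = lambert(normals, light_dir=(0, 1, 0))  # shipped with the map

# Runtime with baked lighting: just a texture lookup -- cheap, but frozen.
# If a designer later rotates a wall (changing its normal), the stored value
# is stale until the whole map is re-baked:
normals[0] = (0, 0, 1)        # level geometry changed after the bake
stale = baked_lightmap[0]     # still lit as if the wall never moved

# Dynamic lighting recomputes every frame: always correct, paid for at runtime.
fresh = lambert(normals, light_dir=(0, 1, 0))[0]
```

The gap between `stale` and `fresh` is exactly why a post-bake map change forces either a full re-bake or a gameplay compromise.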

EDIT: I guess the absolute worst example of the current year would be Gollum, not Forspoken.

If Rambo The Video Game (2014) was made with the tech of today, it would look much better while costing the devs the same amount of time.

I don’t think this is quite correct. A while back, devs were talking about a “AAApocalypse”: as budgets keep growing, having a game make its money back is exceedingly hard. This is why today’s games need all sorts of monetisation, are always sequels, have low-risk game mechanics, and ship in half-broken states. Even with the industry basically abandoning novel game engines to focus on Unreal (which is also a bad thing for other reasons), game production times are increasing, and the reason is that while some of the time is amortised, greater graphical fidelity makes the lower-fidelity work stand out. I believe an “indie” or even AA game could look better today for the same amount of effort as 10 years ago, but not a AAA game.

For example, you could not build Baldur’s Gate 3 in Unreal. This is an unhealthy state for the industry to be in.

Yeah, I agree with everything you said. But what I was trying to say is that it is not the graphics push alone that is hurting production; this generation alone has many new graphics techniques aiming to improve image quality while taking load off the devs. Just look at Remnant II, which has the graphical fidelity of a AAA game on the budget of a AA. Also, some of the increase in production time is due to the feature creep a lot of games have. Every new game has to have millions of fetch quests, a colossal open-world map, skill trees, an online mode, crafting, a looting system, etc., even when it makes no sense for the game to have them. Almost every single game mentioned in this thread suffers from this, with Batman being notorious for its Riddler trophies, The Witcher having more question marks on the map than an actual map, and Assassin’s Creed… well, do I even need to mention it? So the increase in production time is not all the fault of the increase in graphical fidelity.

Yes, it basically boils down to diminishing returns - and it also eats up a lot of the potential for creative graphic design.

@Doods@infosec.pub (creator)

I don’t think higher graphics requirements hurt creativity - you can have an unrealistic-looking game that is very GPU-intensive. I was mainly concerned about the costs and wasted money/effort.

But lowering the graphics budget - and the budget in general - can make creativity/risk-taking a more appealing option for AAA studios.

Edit: I just noticed the two sentences kind of contradict each other, but you get the point.

It’s not really worth it considering the jump in processing power that is needed, imo. My last PC lasted me about 10 years, while my current one can’t play current titles after 3 years and I already have to upgrade the CPU.

jasonhaven

I’m a big proponent of going with a distinct art style over “realism”, because the latter tends to fall apart over time as technology improves. The former will always look good, though.

Ser Salty

Even if you do go with realism, we’ve hit a point of diminishing returns. Most PS5 games just look like the best-looking PS4 games, but in 4K. I’d rather developers start using the system resources for things that actually matter instead of realistically simulating every follicle of a character’s ass hair. Like, give me better NPC AI, give me more interactive environments, give me denser crowds, more interconnected systems. Just something.

I like it the way it is: there’s both, and gamers decide what to buy. In the end, we’re talking about a MASSIVE economy, so of course there are also a lot of people who WANT you to upgrade your PC/console every 2 years or so.

I already wrote another comment on this, but to sum up my thoughts in a top comment:

Most (major) games nowadays don’t look worlds better than the Witcher 3 (2015), but they still play the same as games from 2015 (or older), while requiring much better hardware with high levels of energy consumption. And I think it doesn’t help that something like an RTX 4060 card (power consumption of a 1660 with performance slightly above a 3060) gets trashed for not providing a large increase in performance.

still play the same as games from 2015

I wish they played more like games from the late 90’s, early 2000’s, instead of stripping out a lot of depth in favor of visuals. Back then, I expected games to get more complex and look better. Instead, they’ve looked better, but played worse each passing year.

@McLovin@beehaw.org

It’s not so much the card itself; it’s the price and the misleading marketing around it (two versions to trick the average buyer, one with literally double the memory). Also, it’s a downgrade from the previous-gen, now cheaper, 3070. It’s corporate greed with purposeful misleading. If the card were 100 € cheaper, it would actually be really good. I think that’s the consensus among reviewers like GN, but don’t quote a random dude on the Internet ahah

@cobra89@beehaw.org

It’s less the fact that there is a version with double the memory and more the fact that the one with less memory has a narrower memory bus than the previous generation, resulting in worse performance than the previous-generation card in certain scenarios.
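For context on why the narrower bus matters: peak memory bandwidth is roughly bus width times per-pin data rate. A quick back-of-the-envelope check using the commonly cited specs for the 3060 Ti (256-bit, 14 Gbps GDDR6) and the 4060 Ti (128-bit, 18 Gbps) - figures worth double-checking against vendor pages:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width in bytes x per-pin data rate (Gb/s per pin).
    return bus_width_bits / 8 * data_rate_gbps

rtx_3060_ti = bandwidth_gb_s(256, 14)  # 448.0 GB/s
rtx_4060_ti = bandwidth_gb_s(128, 18)  # 288.0 GB/s
```

So despite the faster memory chips, the halved bus leaves the newer card with noticeably less raw bandwidth; the much larger L2 cache is meant to compensate, which is why the deficit only shows up in certain scenarios.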

I don’t know about the 2 versions, but the 3070 bit is part of what I mean.

Price has been an issue with all hardware recently - even for other things, due to inflation in the last few years - but it’s not exclusive to the 4060. More importantly, from what I can tell, the 3070 has a 1.2x to 1.4x performance increase in games, but it consumes about 1.75x the power (rough numbers, I’m kinda busy rn). Because I don’t have much time right now I can’t look at prices, but when you consider the massive difference in consumption, the price difference might start making more sense, and only seems ridiculous if you focus on performance alone.

ArtZuron

I remember seeing an article somewhere about this. Effectively, there are really bad diminishing returns with these game graphics. You could triple the detail, but there’s only so much that can fit on the screen, or in your eyes.

And at the same time, they’re bloating many of these AAA games’ sizes with all manner of garbage, while simultaneously cutting corners on what is actually good about them.

@averyminya@beehaw.org

There’s definitely something to be said about the proper use of texture quality. Instead of relying on VRAM to push detail, for games that go for realism I think it’s interesting to look at games like Battlefield 1, which even today looks incredible despite very clearly having low-quality textures. Makes sense: the game is meant to be you running around and periodically stopping, so the dirt doesn’t need to be much more than some pixelated blocks. On the other hand, even just the ground of Baldur’s Gate 3 has the polish of the rest of Battlefield 1’s visual appeal.

Both these games are examples of polish put in the right places (in regard to visual aesthetics) and seem to benefit from it greatly, without a high barrier to displaying it. Meanwhile, still visually compelling games like 2077 or RDR2 do look great overall, but they take so many more resources to push those visuals. Granted, there are other factors at play, like genre, which of course dictates other measures taken to maintain the balance of performance and fidelity - and both of these games are much larger in scope.

Mostly I just want games with good stories that are really, really fun to play. And games where I can play with 1-8 of my friends. Games like Sons of the Forest or Raft are perfect for this.

I’ve been honestly blown away with how newer games look since I upgraded my graphics card.

Remnant 2 is not even a AAA game but does such a good job with light and reflections that it looks better than anything released 5+ years ago.

Then you have games like Kena: Bridge of Spirits, which have a very nice art style but take advantage of current hardware to add particles everywhere.

I gotta be honest, the old Tomb Raider looks more real than the newer ones.

I also mostly played slightly older games on not-quite-cutting-edge hardware, but since upgrading to something half modern, there are a few games that really impress with their graphics. Cyberpunk 2077 springs to mind. It’s the near photorealism of the environments that impresses most. The depth and detail in a scene must be really demanding, but damn if it doesn’t look really nice. I think we’re now at a point where only the bigger developers can really make the most of the hardware, as designing such unique, detailed worlds is so time-consuming. Smaller developers aren’t going to have the time or budget to build detailed worlds that push top hardware (assuming equally well-optimized code).

That’s why I almost exclusively play indie games. They don’t invest massively in graphics, microtransactions, or dumb features not related to the game (like the chess/darts/drinking simulators in Watch Dogs). Instead, they focus on making games that do one thing and do that one thing great.


From video gaming to card games and stuff in between, if it’s gaming you can probably discuss it here!

Please Note: Gaming memes are permitted to be posted on Meme Mondays, but will otherwise be removed in an effort to allow other discussions to take place.

See also Gaming’s sister community Tabletop Gaming.


This community’s icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
