Cities: Skylines 2 developers Colossal Order have disputed a rumour that its performance woes are being caused by overl…
falsem
35
1Y

Did anyone read the quote? They literally stated this was part of the problem:

We know the characters require further work, as they are currently missing their LODs which affect some parts of performance. We are working on bringing these to the game along general LODs improvements across all game assets.

@MJBrune@beehaw.org
6
1Y

The teeth themselves seem to have gotten confused in the article. Apparently someone was claiming the life cycle system is simulating tooth growth. The characters overall not being LODed is an issue, but only for GPU performance, not CPU. I don’t know which side the major performance issues are on, since I haven’t touched the game, but that’s why they say it’s not the whole issue and the article claims it’s not the teeth. Which it’s not. It’s the characters overall.

I think they run a lot of compute shaders, so that they can offload part of the simulation to the GPU; anything that reduces GPU utilization could therefore improve performance overall.

@MJBrune@beehaw.org
4
1Y

That’s still ignoring the whole character. Teeth aren’t the issue; it’s LODing the character, and you don’t do that just for the teeth. Teeth might add to the vert count, but so does the rest of the high-fidelity model. So overall it’s not the teeth. It’s the character.

It’s like being concerned the bathroom is on fire when the whole house is burning down.
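For anyone unfamiliar, the distance-based LOD selection being discussed works roughly like this. A generic sketch; the thresholds and vertex counts below are made up for illustration, not Colossal Order’s actual values:

```python
# Hypothetical distance-based LOD table. Each entry is (max distance in
# metres, vertex budget). Values are illustrative, not from the game.
LOD_LEVELS = [
    (10.0, 50_000),   # close up: full-detail model (teeth and all)
    (50.0, 8_000),    # mid distance: simplified mesh
    (200.0, 500),     # far: low-poly stand-in
]
FALLBACK_VERTS = 50   # beyond the last threshold: billboard/impostor

def lod_vertex_count(distance: float) -> int:
    """Return the vertex budget for a character at the given camera distance."""
    for max_dist, verts in LOD_LEVELS:
        if distance <= max_dist:
            return verts
    return FALLBACK_VERTS
```

Without LODs, every character always costs the full-detail vertex count no matter how far away it is, which is exactly the complaint above: it’s the whole character, not just the teeth.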

Scrubbles
154
1Y

The most annoying thing about the Cities performance issues isn’t even the performance issues. It’s all the gamers who overnight became experts in game performance and are now ranting and raving online about how they obviously know how to optimize games better than professionals. It’s so tiring at this point.

Any software engineer with real professional experience can tell you performance tuning is a nightmare. It’s going through millions of lines of code checking for places you can allocate memory a bit differently. Checking collections and going back to your CS classes to make sure you’re using the best data structures. Watching performance tools and debugging for hours on end to catch that one place that slows down a bit.
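In practice that hunt runs through a profiler, not intuition. A minimal Python sketch of the workflow (the profiled function is a stand-in for real game code, not anything from Cities: Skylines 2):

```python
import cProfile
import io
import pstats

def simulate_frame(n: int) -> int:
    # Placeholder for a frame's worth of work; stands in for real game code.
    return sum(i * i for i in range(n))

# Profile the hot path and collect the top offenders by cumulative time --
# this legwork, not guessing, is how "that one slow place" gets found.
profiler = cProfile.Profile()
profiler.enable()
simulate_frame(100_000)
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
```

The same loop applies at any scale: measure, find the actual hotspot, fix it, measure again.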

People here, on Reddit, and everywhere else are just so tiring because they act like it’s so obvious. “Oh, it’s the teeth.” “If they would have done X.” It’s honestly just so disrespectful to the full-time engineers who no doubt had those thoughts months ago. If items like this were simple, they would have done them already.

I have complete respect for the engineers who worked on this, and I respect Colossal Order’s push to still release early. As someone who is enjoying the game, zero crashes, and in my opinion completely playable, I’m happy they released now.

Gamma
30
1Y

They’ve been experts for longer than a single night, don’t you remember when all anybody said was to jUsT FiX tHe NeTcOdE!

Bruh change your DNS /s

@JCPhoenix@beehaw.org
2
1Y

Shh! You’re gonna tempt the DNS demons!

@millie@beehaw.org
33
1Y

Even just from modding, I’ve noticed a lot of extremely confident opinion-giving that’s equally uninformed. I think people just like to feel like they have some special insight, so they tend to run with whatever narrative they hear first and stick hard to it. It reminds me of all those little bullshit factoids people love to repeat, like that daddy long legs are the most venomous spider but are incapable of biting people.

The big obvious example in DayZ is the myth of the ‘alpha wolf’. People have for ages been claiming that one of the two wolf textures (usually the white one, but I’ve heard both) is an ‘alpha’ wolf that’s stronger than the others and will cause the pack to run away if you kill it. This is a complete myth with no basis in the code of the game. One wolf type is a child class of the other and the only difference is their texture.

And yet some people will get extremely offended if you mention this, even people who have literally never peeked under the hood of DayZ and are well aware that you’ve been actively developing mods for it.

That said, there are cases of players noticing emergent behaviour in games! For example: https://twitter.com/JoelBurgess/status/1428008041887281157

@chipt4@beehaw.org
6
1Y

Is there a way to view this without logging into Twitter?

Scrubbles
15
1Y

This is exactly it. It’s more fun to shit on a game release because it gives a sense of superiority. “I know better than everyone else, this game should have been done this way and that way” and bolsters self confidence.

There are without a doubt some really good arguments for things that could be different, but the vast majority of things I read are self aggrandizing people talking about how they all know how it could be better - and that’s the arrogance that really bugs me. That any of us who don’t know anything about the source code could say at all that it should run better.

Saying “I wish it ran faster” is one thing. Saying “I know it could run better” or “Other games run fast, this one should too”, or, in regard to this article, “lol they did this thing that’s so stupid”, with all the self-back-patting for figuring it out, is another. Software engineering is hard on its own. Game engineering on top of that is just ridiculous. I have 14 years of software engineering under my belt and I still know they are doing things in this game that I would not be able to. Anyone who says they know better than the engineers is the same as the people who sat in my CS102 class and told other students they were smarter than the professor. You aren’t. Everyone knows you aren’t. Please stop acting like you are.

I don’t work in games, but I do work in software and the people you describe are infuriating and have absolutely no idea what it’s like to work on a big piece of software. Thanks for the comment.

Scrubbles
8
1Y

You don’t understand. I watched a YouTube video/took CS102/have a side project I’m totally going to finish. I totally know just as much as these engineers with 10+ years experience who put the last 5+ years into the project.

thingsiplay
16
1Y

You’re right about those overnight experts in forums. 100% true. But I would not give too much respect to the developers (unless they were forced to release it early), because they knew the game was not ready to launch. It’s even in their official statement: https://forum.paradoxplaza.com/forum/threads/updates-on-modding-and-performance-for-cities-skylines-ii.1601865/

Cities: Skylines II is a next-gen title, and naturally, it demands certain hardware requirements. With that said, while our team has worked tirelessly to deliver the best experience possible, we have not achieved the benchmark we targeted.

Then why the hell do you release the game? So it’s another rushed game, and that is something you can blame the devs for. That is what upsets me personally the most about all this drama.

Scrubbles
12
1Y

They literally said why: because a lot of players, like myself, don’t care about the performance issues and are happy to play it. Those who wanted to start deserved to get it early, and delaying it would only have punished us. And they’re right; like I said, I am enjoying it, and there’s a huge Discord of people enjoying it. If some people just absolutely can’t handle 30fps, they are welcome to delay it for themselves and not buy it until their hardware catches up.

bermuda
13
1Y

Then why the hell do you release the game?

because they are a business that needs to make money to offset development costs.

thingsiplay
1
1Y

deleted by creator

.:\dGh/:.
7
1Y

They’re “overnight performance experts” because there are similar games that run better.

To me it seems there was a tight schedule and they couldn’t prioritize performance tweaks over features. I mean, if it works it works; refactor later so we can jump to the next requirement.

Sum all that up and you won’t know which part of the chain takes the most cycles.

Yup. I’ve worked with some really great software engineers in the gaming industry, and they don’t have a fucking clue how to optimize a game, and it’s because optimizing the game doesn’t take a clue. It takes legwork, and diagnostics, and digging, and digging, and digging.

It’s never what you think, because if it was, it would have been fixed already.

We shipped well optimized games, and we did so because the games were (relatively) small, and our engineers were absolute pro sleuths.

As someone who is enjoying the game, zero crashes, and in my opinion completely playable

Not gonna lie, something tells me your opinion would shift within seconds if your computer weren’t working a little extra magic for you to make this sentence true.

Kraiden
8
1Y

Willing to throw my hat into the ring here and say that I haven’t even bought it yet because I know my pc can’t handle it. I will wait for performance patches (or look at finally upgrading my 5 year old pc)

I also think they’ve done everything right. They called it out BEFORE release, but released anyway for the subset of players who can play, with the promise of improving it for the rest.

The ones who can play it got lucky, the ones who can’t and are all pissed about it are the same ones who would be bitching if it got delayed.

Honestly they should have put it behind the beta window under the caveat that the “beta” was just bug fixing.

But I’ve kinda noticed a trend where people who say that everyone else is overreacting, and that people are throwing tantrums over nothing, are pretty often people who have a spendy machine that brute-forced past all the issues.

Like people who say “Minecraft doesn’t have a memory leak issue, just install a bit more RAM!” You haven’t solved the problem; you exist in a situation where you can’t notice the problem.

ono
24
1Y

To be fair, one doesn’t have to be an automotive engineer to deduce something is wrong with a new car that struggles to reach 30km/h while most of the others exceed 100km/h with ease.

(This is the first I’ve heard of anyone blaming teeth, though. That’s a bit strange.)

shadowbert
5
1Y

I don’t think the issue is with people deducing something is wrong with the game. The issue is people saying “It’s definitely the fuel pump - why didn’t you give it a larger pipe?” because the windscreen wipers aren’t working.

Recognising an issue vs diagnosing it vs. figuring out a treatment. You can notice chest pains and shortness of breath, perhaps make an educated guess that it could be a heart attack, but it’s going to take an expert to diagnose whether that’s actually the case and what course of action to take.

@millie@beehaw.org
17
1Y

There’s a big difference between looking at a game and saying there seem to be some performance issues versus baselessly pretending that you know what the specific cause of those issues is.

Skull giver
2
1Y

deleted by creator

@umbrella@lemmy.ml
7
1Y

IIRC there was once some Bethesda game where characters’ teeth being rendered across the map was the actual issue.

edit: on second thought I think it was actually Arma 2/DayZ/PUBG or something similar

ono
6
1Y

Ha… That is hilarious, and very much like Bethesda. (See also: the bee problem in Skyrim.)

@brsrklf@jlai.lu
5
1Y

What problem, you don’t think bees should be able to flip a horse carriage over?

Can’t stand the sight of a strong Nord insect?

Scrubbles
14
1Y

That’s not a fair comparison. I see people upset because the car isn’t a Maserati, when they didn’t build a Maserati. They built a van. I don’t need to go 100km/h; I needed something that could carry all of these items I have. And for me, that runs fine.

I will say that I have a new(ish) gaming rig, built about 3 years ago. I do think the minimum requirements are laughably out of date, and they need to be updated so as not to mislead people. I don’t think even a GTX 1000-series card could play this on minimum settings, let alone a 900. It’s better PR just to be up front and say “Look, those cards just aren’t going to cut it. If you can’t play day one, we’re sorry, but we’re excited to see you at your next upgrade” rather than lie and say it’ll be fine.

ono
8
1Y

That’s not a fair comparison.

I think it is. Note that I wrote 30km/h, not 200km/h. (In case you’re American, 30km/h is about 18mph.)

The Last of Us Part 1 is another example. We know it should run better on our hardware (at least with low-graphics settings) because we have already seen the original game run far better on less capable hardware. Yet this one fails to do so even at the lowest possible settings.

Even Baldur’s Gate 3, despite being otherwise wonderful, has some glaring hit-and-miss performance issues (think 8 fps at 1080p) that show up on hardware that can handle similar games easily. You don’t need to be a software engineer to compare it to Divinity: Original Sin 2, adjust for a few years of hardware inflation, and have a rough idea of how it should perform at moderate-to-low settings.

I see people upset because the car isn’t a Maserati,

I don’t doubt that those people exist, but I believe they are outliers. Most of the complaints I see about underperforming games in the past year or so are from people with very reasonable expectations. If most of the gripes you’ve seen are from teeth-blaming, Maserati-entitled loudmouths, I suspect it has more to do with the forums you frequent than anything else.

The Last of Us Part 1 is another example. We know it should run better on our hardware (…) because we have already seen the original game run far better on less capable hardware.

You cannot directly compare PC specs with those of a console. TLoU was made by Naughty Dog, who are well known for squeezing absurd amounts of performance out of console hardware. The way to do this is by leveraging a platform’s specific strong points. The engine is very likely designed around the strengths of the console’s hardware.

PCs have a different architecture from consoles, with different trade-offs. For example, PCs are designed to be modular: you can replace graphics cards, processors, RAM, etc. This comes at a cost. One such cost is that a PC GPU has to have its own discrete RAM, and there is a performance penalty to this. On a console, things can be much more tightly integrated. I/O on a PS5 is a good example: it’s not just a fast SSD, it’s also a storage controller with more priority levels, one that interfaces directly with the GPU cache, etc.

ono
9
1Y

Sigh… You conveniently deleted important parts of my comment, such as “at least with low-graphics settings” and “adjust for a few years of hardware inflation”, and completely ignored the fact that I am talking about cases of abnormally bad performance compared to entire categories of games. The straw man you’re arguing against is not what I wrote.

You conveniently deleted important parts of my comment, such as “at least with low-graphics settings” and “adjust for a few years of hardware inflation”,

No, that just supports my theory. Graphics settings usually scale really well, that’s the reason they are adjustable by the end-user in the first place. Those should not cause any of the issues you are talking about. The problems lie in parts that take advantage of certain architectural differences.

A hypothetical example that highlights a real architectural difference between consoles and PCs:

Say you have a large chunk of data and you need to perform some kind of operation on all of it. Say, adjust the contents of buffer A based on the contents of buffer B. It’s all pretty much the same: read some data from A and B, perform some operation on it, write the results back to A. Just for millions of data points. There are many things you could be doing that follow such a pattern. You know who’s really good at doing a similar operation millions of times? The GPU! It was made specifically to perform such operations. So as a smart console game developer you decide to leverage the GPU for this task instead of doing it on the CPU. You write a small compute kernel and a few lines in your CPU code to invoke it. Boom, super fast operation.

Now imagine you’re tasked with porting this code to the PC. Suddenly this super fast operation is dog slow. Why? Because it’s data generated by the CPU, and the result is needed by the CPU. The console developer was just using the GPU for this one operation, part of a larger piece of code, to take advantage of the GPU’s parallel performance. On PC, however, this won’t fly. The GPU cannot access this data because it’s on a separate card with its own RAM. The only way to get the data to the GPU is through the (relatively slow) PCIe bus. So now you have to copy the data to the GPU, perform the operation, and then copy the data back to system RAM, all over the limited bandwidth of the PCIe bus, which is already being used for graphics-related tasks as well. On a console this is completely free: the GPU and CPU share the same memory, so handing data back and forth is a zero-cost operation. On PC this may take so much time that it’s actually faster to do the work on the CPU, even though the CPU takes much more time to perform the operation, simply to avoid the overhead of copying the data back and forth.

If an engine uses such an optimisation, it will never run well on the PC, regardless of how fast your GPU is. You’d need many years of ‘hardware inflation’ before either doing it on the CPU, or doing it on the GPU plus twice the copy overhead, becomes faster than just doing it on the console’s GPU with zero overhead.
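The arithmetic of that trade-off can be sketched with a back-of-envelope model. All numbers below are illustrative assumptions, not measurements of any real game or bus:

```python
# Toy cost model: GPU offload only wins if compute savings beat copy costs.
# Figures are illustrative assumptions, not measured values.
def gpu_total_ms(data_mb: float, bus_gb_s: float, gpu_compute_ms: float) -> float:
    """Total time when data must cross the bus to the GPU and back."""
    copy_ms = (data_mb / 1024) / bus_gb_s * 1000  # one-way transfer time
    return 2 * copy_ms + gpu_compute_ms           # copy in + compute + copy out

# 512 MB of simulation state over a 16 GB/s effective PCIe link:
discrete = gpu_total_ms(512, 16.0, gpu_compute_ms=2.0)
# Unified memory (console/Apple-silicon style): copies are effectively free.
unified = gpu_total_ms(512, float("inf"), gpu_compute_ms=2.0)
cpu_ms = 40.0  # assume the CPU needs 40 ms for the same work

# With a discrete card the two copies dominate (discrete > cpu_ms), so the
# "slow" CPU path actually wins; with unified memory the GPU path is obvious.
```

This is why the same engine design can be a brilliant optimisation on one architecture and a liability on another.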

In fact, things like this are why Apple moved away from dedicated GPUs in favour of a unified memory model. If you design your engine around such an architecture you can reach impressive performance gains. A good example is how Affinity Photo designed their app around the ‘ideal GPU’ that didn’t exist yet at the time, but which they were expecting in the future: one with unified memory. When Apple released its M-series SoCs, they finally had a GPU architecture that matched their predictions, and when benchmarked with their code the M1 Max beat the crap out of a $6000 AMD Radeon Pro W6900X. Note that the AMD part is still much faster if you measure raw performance; it’s just that the system architecture doesn’t allow you to leverage that power in this particular use case.

It’s not just how fast the individual components are, it’s how well they are integrated, and with a modular system like a PC this is always going to cause a performance bottleneck.

@millie@beehaw.org
5
1Y

I mean, you kinda do, though. You have no idea what’s going on under the hood in Divinity versus Baldur’s Gate. Even if the graphics are similar and the UI looks the same, there could well be much more complex systems involved. Given that they’ve developed a faithful and fairly wide-ranging representation of D&D 5e, I’m willing to bet that ended up being a lot more involved than their own proprietary system.

ono
5
1Y

Given that they’ve developed a faithful and fairly wide-ranging representation of D&D 5e, I’m willing to bet that ended up being a lot more involved than their own proprietary system.

That game was just one example, but since you seem interested in singling it out:

Turn-based game rules cannot explain the awful graphics performance that game has, even at idle, on some systems. (Not even D&D 5e, which I happen to know in detail.)

Graphics engine enhancements might explain it, but in that case, the developers should have included options to disable those enhancements.

I haven’t reverse engineered the code, but some of the behaviors I’ve seen in that game smell strongly of decisions/mistakes that I would expect from a game that was rushed, such as lack of occlusion culling. Others smell like mistakes that are common among programmers who haven’t yet learned how to use the graphics APIs efficiently, such as rapid-fire operations that should instead be batched. Still others could be explained by poor texture and/or model scaling techniques. As a software engineer, the bad performance in this particular game looks like it could come from a combination of several different factors. None of them are new in this field. All of them can usually be avoided or mitigated.
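To illustrate why rapid-fire operations hurt, here is a toy cost model of draw-call batching. The per-call overhead and per-vertex cost are hypothetical, chosen only to show the shape of the problem:

```python
# Hypothetical cost model for draw-call batching. The overhead figures are
# illustrative, not measured from any real driver or game.
CALL_OVERHEAD_MS = 0.01   # fixed cost of issuing one draw call
PER_VERTEX_MS = 0.000001  # marginal cost per vertex submitted

def naive_ms(meshes: list[int]) -> float:
    """One draw call per mesh: the fixed overhead is paid once per mesh."""
    return sum(CALL_OVERHEAD_MS + v * PER_VERTEX_MS for v in meshes)

def batched_ms(meshes: list[int]) -> float:
    """All meshes merged into one submission: overhead is paid once."""
    return CALL_OVERHEAD_MS + sum(meshes) * PER_VERTEX_MS

meshes = [1_000] * 5_000  # five thousand small props, 1000 verts each
# naive: 5000 * (0.01 + 0.001) = 55 ms; batched: 0.01 + 5 = 5.01 ms
```

The vertex work is identical in both cases; the naive path loses purely to fixed per-call overhead, which is why unbatched rendering is such a common performance smell.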

In any case, the point is that none of that analysis matters for the sake of this discussion, because a community with experience using products doesn’t have to be experienced in building them in order to notice when something is wrong. It’s not fair to categorically dismiss their criticism.

(Thankfully, the Baldur’s Gate 3 developers haven’t dismissed it. Instead, they are working on improving it. Better late than never.)

Dark Arc
5
1Y

I largely agree with what you’re saying but I’m going to add… If you get to the point of release and you’re off 300% and not 15% … you screwed up.

There definitely aren’t easy answers to these kinds of problems but there are steps that should be taken along the way to prevent them. Getting to the end and then addressing any and all performance issues is a recipe for disaster.

You don’t want to be making major architectural changes at this point in the process. You want to be dealing with hiccups. Throwing hardware at the problem and “optimization” only go so far.

In Paradox’s defence, they released a patch pretty quickly once the game got out to the masses with their myriad of different hardware configs.

To be honest, they probably should have just released it as an early-access beta.

This way they could have caveated any issues and allowed the feedback to refine the codebase.

£70 ish quid for the premium edition to run like shite takes the biscuit.

skulblaka
19
1Y

But if they had released it in Early Access to crowdsource their QA, people would have dogged them with “what’s with the EA bullshit, just release the full game when it’s finished”.

Personally, I’m a huge fan of Early Access, I like playing 3/4 finished games and having actual tangible input on the finishing touches. It’s made several games that I already really liked in their EA state, into masterpieces.

But your average gamer just wants to buy a game and have it work perfectly. When it doesn’t, tantrums happen.

Narrrz
2
1Y

I used to hate early access - why should we pay to test an unfinished game, when that’s an actual job that people get paid to do?

but I’ve come to recognise that it’s an important avenue of funding for many developers, and tbh, I don’t think any of the early access games I’ve played have felt “incomplete”. Perhaps lacking polish, perhaps in need of more content, but that’s true of many full releases, and early access not only gets you these games at a reduced price, it effectively guarantees a large amount of free DLC as the game is made more complete.

my only real complaint now is sometimes I like early access features which end up getting cut from the finished game.

@thejml@lemm.ee
9
1Y

I feel like if they:

  • released earlier with Beta: tantrums
  • delayed to get perf up: tantrums
  • cut features/details to get perf up: tantrums
  • released on time w/perf issues: tantrums

I just ignore it. I have a fairly new setup and turned a few things down, so I can get 70 +/- 10 fps most of the time, but I trust they’re working on it so I can turn them back up later. Perf testing on a huge myriad of different system setups is hard to do. At least they didn’t pull a “here it is, we’re done” like some other groups might have. They acknowledged it, they announced the low perf and their continued work, and they released anyway, so people who want it and can play it get to.

coyotino [he/him]
25
1Y

“… Characters feature a lot of details that, while seemingly unnecessary now, will become relevant in the future of the project.”

Custom Dentist Minigame DLC confirmed!

Either that, or “News Cameraman with Carnage Closeup Mode DLC”…

Kaldo
8
1Y

It was a ridiculous claim to begin with, there’s no way they wouldn’t see something like that when analyzing with internal tools or doing any performance runs.
