Starfield’s player numbers have surged in early access on streaming and gaming platforms, and the global release is yet to take place.
I’m watching a streamer play the game, and what I’m seeing looks like something I’d have fun with, and others probably feel the same way.
I’m just not interested in playing at like 30fps on a 3080. Maybe some patches or driver updates can improve things and I’ll check it out in the Steam Winter Sale or something.
Being a patient gamer is best. We need to give it at least 6 months for Bethesda to optimize the game and fix the major bugs.
Modders are already on top of optimization. A DLSS 2.0 mod is already out there. Runs 40-50 fps at 4K/ultra on my system unmodded: 3080 Ti, 12900K, 128GB DDR5, M.2 SSD.
One can hope, but I kinda doubt they can pull off something like a 50% performance uplift, unless there are just some weird bugs that tank performance.
They haven’t even fixed Skyrim yet.
I hope that since Bethesda is now a Microsoft studio, Microsoft is going to be persistent about bug fixes. But maybe it’s a tall order.
Take a look at Halo Infinite’s networking problems, which have persisted for more than a year.
Microsoft? Fix bugs? Are we thinking of the same Microsoft? Best they can do is integrate a weather app nobody asked for. And install Candy Crush without your consent.
Here are some numbers. I’m at 42 hours played. Resolution is 2K, settings are on High. Ryzen 5800X and a Radeon RX 6800 XT. The last session of ~5 hours had an average FPS of 108. Starfield is more optimized than BG3 and Remnant 2… at least for AMD. I had to lower a lot of Remnant 2 settings and it still averages around 55.
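Aside, for anyone wondering how tools like CapFrameX or PresentMon arrive at these session averages: average FPS is total frames divided by total time, which is the harmonic mean of the per-frame rates, not the arithmetic mean of instantaneous FPS readings. A minimal sketch (the function name and sample numbers are my own, just for illustration):

```python
def average_fps(frametimes_ms):
    """Average FPS from per-frame render times in milliseconds,
    as logged by frametime-capture tools."""
    total_seconds = sum(frametimes_ms) / 1000.0
    return len(frametimes_ms) / total_seconds

# Example: three frames at 10 ms plus one 50 ms stutter.
# 4 frames over 0.08 s -> 50.0 FPS, even though three of the
# four frames individually ran at 100 FPS.
print(average_fps([10, 10, 10, 50]))  # 50.0
```

This is why a session average can look healthy while stutters still feel bad: a few long frames barely move the mean.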
On my 3080 with a 5900X I’m consistently getting 60fps at 1080p (unfortunately that’s the only screen I have for now), meanwhile BG3 would dip to the low 10s after a few minutes of playing, every time.
EDIT: I would also like to add that I didn’t use DLSS or FSR in either game, since my hardware is more than capable of running both on maximum quality at 60fps 1080p.
That’s exactly what I have, but I play on 3840x1600, 24:10 Ultrawide.
I don’t remember BG3 giving me any problems, even in Act 3 before the last patch, which supposedly addressed some performance problems. I loaded up a save just now and get ~50fps running around the Lower City (very short test, only about two minutes). That’s with most settings maxed and DLSS Quality.
Depending on the area, I’d probably get similar numbers in Starfield (according to the benchmarks I’ve seen), but for me there’s a difference between playing an FPS and an isometric RPG.
I’ve also watched some streams, and the performance hasn’t even been my biggest concern. I’m just… not interested? It hasn’t been gripping me. Even though there are these shiny new things and bells and whistles, it still just looks like another Bethesda game to me, but with a blander setting this time. Though maybe it’s more fun to play than watch. I just haven’t really seen anything that makes me go “goddamn I gotta get a piece of that”.
Replied to wrong level, moving to parent comment.
I heard this is because NVIDIA didn’t fund optimizations, while AMD did, so it’s running a lot better for them. I said fuck it and bought an Xbox on a 2-year payment plan with Game Pass included, because I don’t think my 1080 Ti is ever gonna play this game that well. End of an era.
1080ti could’ve been a solid 30fps experience…
Not what I’m hearing. Maybe after some more patches and a new driver.
My 980 is pulling 30 fps with most things on medium and high, and shadows on low because they have the biggest effect on performance. Also turn off resolution scaling; that’s for Nvidia at least, I don’t know about AMD cards.
With resolution scaling it doesn’t matter if you’re using AMD or Nvidia, it’s doing the same thing and looks the same on both vendors.
If your GPU supports it (RTX cards), you can mod DLSS into the game and then get (supposedly) better image quality at the same scaling level as the built-in FSR2, or potentially lower the scaling even further for better performance while still getting an image comparable to a higher FSR2 preset.
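For context on what “same level of scaling” means: FSR 2 and DLSS quality presets use the same per-axis scale factors, so for a given preset the internal render resolution is identical regardless of upscaler. A quick sketch of the math (the ratios are the commonly published defaults for both upscalers; individual games can override them, so treat this as a rough guide):

```python
# Per-axis render scale for each standard quality preset
# (shared by FSR 2 and DLSS by default).
PRESETS = {
    "Quality": 1.0 / 1.5,            # ~67% per axis
    "Balanced": 1.0 / 1.7,           # ~59% per axis
    "Performance": 1.0 / 2.0,        # 50% per axis
    "Ultra Performance": 1.0 / 3.0,  # ~33% per axis
}

def internal_resolution(width, height, preset):
    """Internal render resolution before upscaling to the output size."""
    scale = PRESETS[preset]
    return round(width * scale), round(height * scale)

# At 4K output, Performance mode renders internally at 1080p.
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So swapping the modded DLSS in at the same preset changes only the reconstruction quality, not the GPU’s internal render load.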
Wow! Good job getting your 980 to run this! Gives me hope for playing on my PC with cross save in the future!!
It’s all over the place. Some AMD GPUs are far better than the equivalent Nvidia GPUs, but then AMD CPUs are seemingly much worse than Intel.
Then there are reports of Nvidia cards sometimes being stuck at around 60% power, which of course doesn’t help either.
Damn, what a mess. I was hearing that on Linux the Mesa driver makes the game run pretty well on AMD cards.
It’s technically sponsored by AMD, which is why you can get it free with a processor or GPU upgrade.
Edit: changed “could” to “can”; the offer is open until October, apparently.
AMD folks are having a good time, but Nvidia folks will need to wait. The game is purposefully not optimized for Nvidia at the moment due to the AMD sponsorship. (Also potentially to point out that many AAA titles tend to be optimized for Nvidia but not AMD at launch.)
That doesn’t explain the CPUs though, since with those, AMD is much worse than Intel, so it’s not just a simple “game is optimized for AMD.”
Simply untrue with recent AMD CPUs. Slight advantage to Intel, but not the blowout it used to be. Intel loses entirely once power consumption and cost are taken into account.
But of course, games rely largely on GPU power, and the CPU concern is generally secondary.
In Starfield the 13900K is 20% better than the best AMD offering, the 7800X3D. Even the 13600K is better than any AMD CPU. A 13100 is on the same level as the 5800X3D. I wouldn’t call that just a slight advantage.
It’s only this game right now, that’s why I’m saying something might be up.
Sauce?
Hardware Unboxed Starfield CPU Benchmark