Recommended CPU and GPU for 4k-resolution at around 40 FPS


On 9/2/2023 at 6:45 PM, Martin Ross.7283 said:

i3-9100F,  GTX 1050 Ti

LMAO.

An RTX 3070 Ti or RX 6800 at least, for this.  Otherwise, it isn't even worth getting a 4K display.

With a 1050 Ti, you are far better off running on a 1080p display since that will not tax that weak sauce GPU much, and will allow your CPU to carry it.

I find this game scales horribly with GPU compute capacity, anyways, so YMMV.

There is no point running a game at 4K resolution if you are forced to turn everything down to bare minimum just to get playable frame rates.  1080p is going to be playable at higher framerates while looking better.

I personally wouldn't even bother trying to game in 4K with anything less than an RTX 3080/RX 6900.

Edited by Tren.5120

38 minutes ago, fatihso.7258 said:

If you want to keep using your RAM, get a 5800X3D; if you want a new system that will last a long time and be future-proof, then go for a 7800X3D. Keep the 1050 Ti for now and see how it does at 4K with the new CPU.

For 4K gaming it doesn't matter what CPU he puts in the machine, a 1050 Ti is completely out of line with what is appropriate.  It's slow, has a narrow bus, and a small/slow VRAM buffer.  It was designed for casual gaming at 768p/1080p.  It's a low-end card from 2016.

Gaming at 4K is a lot more GPU-bound than gaming at 1080p, so you need more GPU grunt there even though GW2's engine (like most other older MMORPG engines) is a CPU hog.  Otherwise, what is the point of playing at minimal settings at UHD chasing playable framerates if it looks far worse than FHD at higher settings with higher, more consistent framerates?  He will never reach a playable framerate with that setup... even if it hits higher peaks, I can guarantee that it's constantly dropping to the point that the average is still going to be well below playable, anyway.

You can put a Threadripper in that machine, and he will still get bad performance because that GPU is too weak.

Beyond that, going from an i3-9100F to any AMD CPU means a complete platform swap.  He will not only have to replace the CPU, but the MOBO as well.  Upgrading to a newer Intel CPU will also necessitate a platform swap, as Intel is not as generous with maintaining sockets across CPU generations as AMD has been.

A 5800X3D only makes sense for dedicated gaming rigs that you don't use heavily for other tasks like productivity, video editing, etc.  Otherwise, it may not be worth it over a Ryzen 9 5900X/5950X - both of which will trounce it for video editing, heavy photography and graphic design work, VFX, scientific computing, etc.  Casual gamers need to think long and hard about picking those CPUs over the ones with more cores/threads.  For them, it generally is not worth it.

An i7 would be fine for 60 FPS 4K gaming, as long as he had a suitable GPU and PSU... but I think he should keep a 1080p screen hooked up to that computer and run his games on that instead, waiting until he replaces the entire machine to game at 4K.  None of the components in that machine should be carried over.  That CPU only supports DDR4-2400, so it's also possible that he has slower DDR4 RAM in that machine (which wouldn't be ideal to carry into a Ryzen build, either).

Generally, for the GPU he'd want something like an RTX 3080 Ti/RX 6800 XT - particularly if he plays other games that are built on engines designed to take better advantage of the GPU.

If he's willing to drop back to QHD, then an RX 5700 XT or RTX 2070 SUPER would both deliver decent performance, assuming he had a suitable CPU to match with them (at least a 10th Gen i5/i7 (or OC'd 9600K) or Ryzen 3600/3700X) and was willing to drop some of the more GPU-abusing settings down a notch - like shadows, post-processing, depth of field, ally effects, etc. (the game would still look good).

It really doesn't matter that a weak sauce PC peaks at 40 FPS if it's dropping constantly down to the teens.  The goal is to narrow the corridor of variability in your framerates, so that performance stays as close as possible to the average.

Also, running with CPU/GPU constantly at max load is just going to result in a very loud computer.  Well, that may bother people to different degrees 😛 

Edited by Tren.5120

11 hours ago, Martin Ross.7283 said:

I simply changed the resolution from 1920 x 1080 to 3840 × 2160 leaving all other graphics settings the same (close to max) and my fps went from 60 to 20. 

The display resolution is 100% tied to the GPU. If that's all you have changed and you were happy with your fps before, then focus on upgrading the GPU.
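
For a sense of scale: 3840×2160 is exactly four times the pixel count of 1920×1080, which roughly lines up with the drop you saw. A quick back-of-the-envelope sketch (assuming fps scales with pixel throughput when fully GPU-bound, which is only approximately true):

```python
# Pixel-count comparison between 1080p and 4K.
fhd = 1920 * 1080   # 2,073,600 pixels per frame
uhd = 3840 * 2160   # 8,294,400 pixels per frame
ratio = uhd / fhd
print(ratio)        # 4.0 -> four times the pixels to render per frame
print(60 / ratio)   # ~15 fps if scaling were perfectly linear; the observed 20 fps
                    # suggests the GPU isn't the only limit, but it is clearly the main one
```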

It's been a long time since I had a GPU that had issues achieving 4K/40 FPS in GW2. If I remember correctly, even my RX 590 (which is still a lot faster than your 1050 Ti) could run the game at 4K/60 back in 2019. Any current 200 dollar GPU should be fine.

 

3 hours ago, Tren.5120 said:

For 4K gaming...

He didn't ask about 4K gaming in 2023 triple-A titles or productivity tasks. He asked specifically for GW2 @ 4K/40 FPS. Why are you trying to upsell here?

Recommending 1000+ dollar hardware to someone who just wants to play his 2012 MMO on his new screen at a relatively low framerate is a bit off-topic, to say the least.

Edited by KrHome.1920

9 hours ago, KrHome.1920 said:

He didn't ask about 4K gaming in 2023 triple-A titles or productivity tasks. He asked specifically for GW2 @ 4K/40 FPS. Why are you trying to upsell here?

GW2 taxes the CPU in my machine more than Cyberpunk 2077.  It's an MMORPG, it has high CPU requirements, and the i3 in his machine is going to be a bottleneck.  The engine does not scale as well with GPUs as newer games do, so it absolutely is a consideration.  The bottleneck shifts to the GPU at 4K, but that doesn't mean the CPU requirements go away.

Nothing I recommended is an upsell.  It is reality.  OP did ask:

On 9/2/2023 at 6:45 PM, Martin Ross.7283 said:

Any recommendations on a CPU and GPU that could get me to around 40 FPS?

And I replied with realistic recommendations for that.

At least a 5700 XT, and I'd definitely upgrade the CPU as that will bottleneck as well.  An RX 590 makes no sense given the cost of 5700 XTs and how much better they perform.  You're going to be saving 20-25% in cost for something like a 60+% decrease in GPU performance.  It's a brainless recommendation from both a performance and economics standpoint.  The RX 590 can run 4K games because of its 8GB VRAM buffer, but the performance is still bad because the GPU is basically the equivalent of a 1060.

The idea that you can upscale to 4K and basically maintain the same performance just by virtue of having a bigger VRAM buffer is also... curious, at best.  That's not quite how it works.

I still think that machine will struggle to maintain 40 FPS on anything but minimal settings (which look AWFUL) in GW2, given my experience running a 3700X with a 5700 XT (not my current rig, but easy to swap the components back in to check - that CPU has comparable gaming performance).  That combination was struggling to maintain 60 FPS at 1080p with medium settings in GW2, outside of fairly barren cities.  It would almost certainly drop into the 30s-40s while doing rifts or meta events - down into the teens if ally effects are on or there are a lot of people on screen.

AM4 boards are dirt cheap and so are Ryzen 7 5700/5800 CPUs.  (Anything below a 3700X is going to be a downgrade for gaming.)  That RX GPU is very cheap, and almost certainly the best budget option available for someone who is cash-strapped.  I didn't know recommending budget hardware was a toxic upsell when someone lists those requirements.  The 7000 series CPU someone else mentioned costs as much as all the stuff I told him to get once you factor in the MOBO and cooler needed to keep it in check, kitten.

If he plans to upgrade the PC in the not-distant future, then he should not bother with 4K, stay on 1080p, and spec the new machine appropriately.  There is no point spending money on old components with a limited viability span.  My 5700 XT is in the closet.  I did not stick with it for 4K gaming because, unless I was willing to run literally everything at low settings, it performed terribly.  It doesn't make sense to go to 4K if you're going to be running everything on minimal settings.  It won't look better than 1080p.  You lose more than you gain from that.

Edited by Tren.5120

i5-9600K:  $229 (No MOBO Upgrade Required)

RX 5700 XT:  $199 (an RX 590 is $149 for a good brand/dual fan model... but it would be brainless to get that over the 5700 XT given the performance disparity)

New PSU:  ~$100 (Debatable:  Depends on what is currently in the machine)

$428-528 (Probably less, if used components are acquired)

Not Quite $1,000
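
(Quick sanity check on the totals above, with the PSU as the optional line item - ballpark prices only, not live listings:)

```python
# Ballpark budget math for the parts listed above; prices are the quoted figures.
parts = {"i5-9600K": 229, "RX 5700 XT": 199}
optional_psu = 100  # debatable, depends on the existing unit
print(sum(parts.values()))                 # 428 without a new PSU
print(sum(parts.values()) + optional_psu)  # 528 with a new PSU
```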

Edited by Tren.5120

The 1050 Ti and a lot of these older cards can drive 4K monitors, but that typically means streaming 4K multimedia. (Your display cable also affects this.)

When you talk about *gaming* at 4K, the card now has to render the 4K frames itself, and that also depends on the performance of the rest of the rendering pipeline, like the CPU.  At minimum, a GTX 1080 would be preferable for this over a 1050 Ti, but even that is pushing it.

That difference between streaming and rendering can confuse people.
 

Edited by Chaba.5410

3 hours ago, Tren.5120 said:

i5-9600K:  $229 (No MOBO Upgrade Required)

RX 5700 XT:  $199 (an RX 590 is $149 for a good brand/dual fan model... but it would be brainless to get that over the 5700 XT given the performance disparity)

New PSU:  ~$100 (Debatable:  Depends on what is currently in the machine)

$428-528 (Probably less, if used components are acquired)

Not Quite $1,000

A new cooler is probably needed, too. The 9600K's TDP is much higher than the 9100F's.

And, just as a personal opinion, for about $100-120 more you can get a far better 13th gen CPU with a new MOBO.


7 hours ago, Derdhal.6908 said:

A new cooler is probable needed, too. 9600K TDP is much higher than 9100F.

And, just as a personal opinion, for just about +100-120$ you can get a far better 13th gen CPU with a new MOBO.

I didn't look into that, since swapping a MOBO is a bit more involved (some aren't comfortable doing it). But yeah, it's really not that expensive now that we're in 2023 and CPUs have been pretty OP for a while. IMO, the GPU is ultimately the bigger question mark.

4K sounds good, but it is generally not worth it if the game looks butt ugly due to performance limitations at the hardware level 😞

But that's why I made suggestions at both the budget and higher pricing segments.  It is doable at fairly low cost, but whether it's worth crawling there versus saving that cash for a whole new system upgrade later is the bigger question.

PERSONALLY, I'd save for later and just stick with 1080p for the moment. 

I think an AIO is pretty standard for gaming rigs. Even a 120mm will outperform a stock cooling fan (I don't even think Intel bundles those, but AMD does with its stock Wraith coolers).

Edited by Tren.5120

Thanks for all the replies and recommendations.

My CPU is at about 50% load and my GPU at almost 100% load when playing GW2 at 4K, so I figure a GPU upgrade will be the most logical place to start. I do not think I am gonna upgrade the whole system right now.
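
(For reference, a minimal sketch of how one could log CPU vs GPU load side by side to confirm this kind of bottleneck, assuming an NVIDIA card with nvidia-smi on the PATH and the third-party psutil package installed:)

```python
# Log overall CPU load and GPU utilization side by side while playing.
# Assumes nvidia-smi is on the PATH and psutil is installed (pip install psutil).
import subprocess
import psutil

while True:
    cpu = psutil.cpu_percent(interval=1)  # average across all cores over 1 second
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(f"CPU {cpu:5.1f}% | GPU {gpu}%")  # GPU near 100% with CPU well below -> GPU-bound
```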

I am torn between an RX 5700 and an RTX 2060. They seem equally strong, but the RX 5700 seemingly had some problems running GW2, at least four years ago (before the game's DX11 upgrade?). Can anyone comment on whether one of these cards is better suited for GW2?

