Ryzen upgrade to 3700X - worth it? — Guild Wars 2 Forums

Ryzen upgrade to 3700X - worth it?

So I have a 2600X and was thinking about upgrading to a 3700X.
That's a big jump in single core performance, nearly 40%. Wondering if it would make enough of a difference in GW2 to justify the upgrade tho...
Did anyone do something similar? Would love any kind of input :3

Comments

  • Infusion.7149Infusion.7149 Member ✭✭✭✭
    edited January 31, 2020

    Have you seen this thread posted earlier : https://en-forum.guildwars2.com/discussion/82134/compare-ryzen-2600-ryzen-3600x-with-a-little-benchmark-test-with-d912

Depends on if you actually play anything other than GW2 I think, or if you intend to stream or encode videos. The single core improvement is not 40%, it's more in line with 10-15% before clockspeed differences. It's the multicore that gains more, due to the jump from 6 to 8 physical cores with a higher L3 cache size (improves latency-bound apps).

For example, Cinebench R20 single thread is 420 for the R5 2600X while it's 500 for the Ryzen 7 3700X = +19% due to higher clockspeeds; multicore jumps from 3028 to 4824, which is +59%.
For games such as Shadow of the Tomb Raider (DX12), 1% lows are around 72 FPS for the R7 3700X while the R5 2600X gets ~61 FPS, so the difference is ~18%.
For a DX12 title such as Division 2, 1% lows are around 86 for the R5 2600X and 107 FPS for the R7 3700X, meaning the difference is ~24% at 1080p with an RTX 2080 Ti.
In World War Z (Vulkan-based engine), 1% lows are around 132 for the R5 2600X while the R7 3700X gets 140, which means +6%.
In a game such as Battlefield V (DX11) you might see 107 FPS 1% lows for the R7 3700X versus 98 for the R5 2600X, which is about 9% more, and this is with an RTX 2080 Ti at 1080p, so if you have anything lower in GPU or a higher resolution the difference is smaller.
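All the uplifts above are just (new − old) / old; a quick sketch in Python using the numbers quoted in this post, so you can plug in your own benchmark figures:

```python
# Percentage uplift of the R7 3700X over the R5 2600X.
# Numbers are the (2600X, 3700X) pairs quoted above.
benchmarks = {
    "Cinebench R20 single":  (420, 500),
    "Cinebench R20 multi":   (3028, 4824),
    "SotTR 1% lows":         (61, 72),
    "Division 2 1% lows":    (86, 107),
    "World War Z 1% lows":   (132, 140),
    "Battlefield V 1% lows": (98, 107),
}

for name, (old, new) in benchmarks.items():
    uplift = (new - old) / old * 100  # percent improvement
    print(f"{name}: +{uplift:.0f}%")
```

Running it reproduces the +19% / +59% / ~18% / ~24% / +6% / ~9% figures above.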

Unless you use d9vk (which was rolled into dxvk) or d912pxy (however you spell that) I highly doubt any of the extra cores will be of much use, because DirectX 9 is inherently single-threaded in nature and was designed in an era when dual-core and quad-core were the main CPU types and Windows XP was the norm. Nvidia drivers are generally better with respect to multithreading DirectX 9 and DirectX 11, but as stated you really need d9vk to make full use of extra physical CPU cores.

    see https://software.intel.com/en-us/articles/understanding-directx-multithreaded-rendering-performance-by-experiments

  • Tony.8659Tony.8659 Member ✭✭

    @Friday.7864 said:
So I have a 2600X and was thinking about upgrading to a 3700X.
That's a big jump in single core performance, nearly 40%. Wondering if it would make enough of a difference in GW2 to justify the upgrade tho...
Did anyone do something similar? Would love any kind of input :3

Hello, I actually upgraded from a 2700X to a 3700X this past November. I can tell you yes, there is a difference. In some areas you gain 10-15 FPS, but when there is a world boss or in WvW your FPS will still take a hit. It's not as bad as when I had my 2700X, but it can still dip into the 20s, depending on how many people are in that one area with you. I'm very happy with the 3700X upgrade, well worth it. My specs are: ASRock X470, 3700X, 16 GB RAM 3200 MHz CL14, RTX 2070 Super, Alienware AW2518H 240 Hz.

  • BRNBRITO.9624BRNBRITO.9624 Member ✭✭
    edited February 2, 2020

The performance boost should be pretty noticeable, in the 10-20% range, though whether it's worth it is up to you.

Also I'd recommend not cheaping out on RAM if you're playing at 1080p; it can have a noticeable impact on min/avg FPS. Preferably check a video so you can get an idea of how impactful it is. I often see people getting Ryzen with 2400/2666 RAM just to save a few bucks.

Personally I upgraded from an i7 4790K (not OC'd) to a 3700X and it was night and day, though I have other uses which capped me pretty hard with a 4/8 CPU. I can't tell you how much of a % difference it made on min/avg, but it does feel way better; as long as I'm not in a zerg with 100 people while running the model limit on highest, the performance is great.

For streaming/recording I'd personally recommend Turing NVENC if available; it's roughly the same quality as x264 fast with pretty much zero impact on performance.

If you're interested I could show you the performance on a video/stream so you can get an idea. Note that I play at 1080p 60 FPS.
    Specs:
    B450M Steel Legend
    3700x
    3000CL15 RAM running @3600CL16
    1660 super

  • KrHome.1920KrHome.1920 Member ✭✭✭✭
    edited February 2, 2020

The poster describes much less microstuttering after the upgrade, which is (in that specific case!) not a result of the processor change but of the RAM upgrade from 2400 to 3200.

@Infusion.7149 said:
Unless you use d9vk (which was rolled into dxvk) or d912pxy (however you spell that) I highly doubt any of the extra cores will be of much use, because DirectX 9 is inherently single-threaded in nature and was designed in an era when dual-core and quad-core were the main CPU types and Windows XP was the norm. Nvidia drivers are generally better with respect to multithreading DirectX 9 and DirectX 11, but as stated you really need d9vk to make full use of extra physical CPU cores.

That's wrong, since the multicore capability of DX9 or DX11 depends on what workload you have. GW2 can make good use of 6 cores before the main thread becomes a problem limiting further multicore improvements.

And besides that, dxvk does nothing for your performance. It's a placebo: every single benchmark of it that exists does not show the exact same mass-player-count scenes (because you can't reproduce them), so you get different results, and they use different quality settings for the game that favor the dxvk run. So these comparisons are worthless.

dxvk is a wrapper, and a wrapper can never be faster than the native API because it adds additional translation layers which increase the workload for the CPU. This specific wrapper has its uses on Linux, because Linux does not have native DX9 support.

  • Letsplay.2401Letsplay.2401 Member
    edited February 8, 2020

For an MMO like GW2, single core performance is the most important. It's an old game engine running DX9. The new Ryzen 3000 series made a good jump in single core performance, so really any of those will be fine. But also (here is where a few people will not agree): overclock that CPU if you can. The Ryzen 3000 series do overclock some; most of them will do 4200-4300 MHz all-core. Just make sure you have proper CPU cooling before you add CPU voltage. We want performance, but with extra voltage comes extra heat; voltage is not the enemy, extra heat is.

Also, when using Ryzen 3000 CPUs, make sure you have a decent RAM kit installed to get the full performance from that CPU. RAM speed depends, but for good performance 3200 MHz and up is gold for the Ryzen 3000 CPUs; 3600 MHz is kind of optimal.
The lower the timings are, the better.
ALWAYS run dual channel on the Ryzen 3000 series CPUs; single channel will kill the performance a lot.
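The dual-channel point comes down to raw bandwidth: DDR4 moves 8 bytes per transfer per channel, so a rough back-of-the-envelope sketch (theoretical peak only, ignoring timings and real-world overhead) looks like this:

```python
# Theoretical peak DDR4 bandwidth: transfers/s * 8 bytes per transfer per channel.
# Rough numbers only; real-world throughput is lower, and timings matter too.
def peak_bandwidth_gb_s(mt_per_s: int, channels: int) -> float:
    return mt_per_s * 8 * channels / 1000  # MB/s -> GB/s

print(peak_bandwidth_gb_s(3200, 1))  # single channel 3200: 25.6 GB/s
print(peak_bandwidth_gb_s(3200, 2))  # dual channel 3200:   51.2 GB/s
print(peak_bandwidth_gb_s(3600, 2))  # dual channel 3600:   57.6 GB/s
```

Dual channel literally doubles the theoretical peak, which is why single channel hurts so much on Ryzen (and on APUs, where the iGPU shares the same bandwidth).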

    Some fun fact:
I read somewhere that someone asked if the Ryzen 3000 series APUs (CPU + iGPU on one chip), e.g. the Ryzen 5 3400G, would run GW2 at 1080p.
To me that's unheard of, running an MMO without a graphics card, nah..
So I bought a Ryzen 3 3200G (that's not Zen 2, but Zen+ on 12 nm, the 2000-series architecture).
The 3200G is basically a four-core Zen+ with Vega 8 graphics onboard, an APU.

Slapped that little sucker on a spare motherboard (MSI B450 Gaming Plus MAX),
installed a 3333 MHz HyperX 16 GB RAM kit (2x8 GB), went into the BIOS to set up default settings first, then later did some overclocking and tested how this cheap APU would run GW2.
Did it run GW2 at 1080p without a dedicated graphics card? You bet...
Lion's Arch - running around, low settings ofc: around 28-38 FPS
Open world PvE - 30-50ish FPS
World boss fight - 15-34ish FPS (see attached picture in link)

But was GW2 playable and enjoyable without a graphics card?
Somewhat yes, sometimes no.
Because that APU uses "normal" DDR4 system RAM, not dedicated GDDR6 or HBM graphics RAM, the graphics bandwidth suffers of course.
So the latency will be higher and loading graphics will take somewhat longer, but all in all it was a nice surprise playing on that cheap Ryzen 3 3200G APU.

In the end I wanted to max out that APU's full potential, so I slapped on a Noctua NH D12S cooler and added a GTX 1660 Ti I had lying around (I build computers for a living part time).
The 3200G overclocked to 4275 MHz (1.375 V vcore), DDR4-3400 (CL16), and the GTX 1660 Ti to a 2040 MHz boost.
To be honest, GW2 is running like a champ on that computer. Playing on medium/high settings, the game never drops below 25 FPS at worst in big world boss fights.
Open world PvE: anything between 65-100+ FPS.
Only four cores, yeah I know, but for a budget CPU it's awesome; overclock and you will have up to a 10% performance boost for free.

    We all love ZEN/Ryzen

EDIT: link was wrong
    https://valid.x86.fr/hurkc1