
Alright, maybe it is time for an engine upgrade.


Lawtider.9024


@"Dawdler.8521" said:That said, interestingly enough there is something weird going on in the graphics pipeline, seen that in the past (I dont do so much PvE now). Boss fights could chug along at 25 fps and then suddenly, out of the blue with little change in visible player activity, it went to smoother 50+... only to drop down again like 5s later as the fight continue.

The flaw in the graphics pipeline is that with DX9, input processing, GPU on/offloading, and IO all happen from within the graphics pipeline.

You can observe this when an asset takes a while to load and the map appears empty for a time, then suddenly huge swaths of it populate -- or at very low FPS (say, <12), where skills can fail to go off because the gap between when your OS registers the keystroke and when GW2 attempts to process it is enough to void the input buffer.

This is because DX9 is a synchronous runtime -- event processing and rendering cannot occur in separate threads; the context simply won't allow it.
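To picture it, here's a toy sketch (illustrative C++ only, not GW2's actual code; UpdateSimulation/RenderFrameD3D9 are hypothetical stand-ins) of that classic single-threaded loop, with input, game state, and rendering all serialized on one thread:

```cpp
#include <windows.h>

// Hypothetical stand-ins for the engine's per-frame work.
void UpdateSimulation() { /* advance game state */ }
void RenderFrameD3D9()  { /* record draw calls, then Present(); the thread
                             stalls here until the GPU takes the frame */ }

int RunGameLoop()
{
    MSG msg = {};
    for (;;)
    {
        // Input is only drained here, once per frame. At 12 FPS that's
        // ~83 ms between polls -- long enough for a keystroke to go stale
        // before the game ever sees it.
        while (PeekMessage(&msg, nullptr, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                return static_cast<int>(msg.wParam);
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        UpdateSimulation();
        RenderFrameD3D9();
    }
}
```

A slow render call delays the next input poll by exactly that much, which is the low-FPS skill-failure effect described above.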

Playing around with DXVK (the DirectX-to-Vulkan translation layer), most of these issues are resolved, outside of GW2's asset-loading problems (because in DX9, loading an asset and sending it to the GPU all occurs during the render pass, effectively "between" frames, and this can't really be circumvented given when the engine initiates IO).

(As an aside, I've also observed substantial performance improvements in other titles simply by switching from DX11 to DX12, the latter of which fully supports asynchronous rendering and IO operations, allowing much, much faster loading (and deferred loading) of asset data, as well as the rendering thereof. Keep in mind that this only happens for titles which fully support the target API; GW2 will only see nominal performance increases unless it is rewritten in a way that makes use of the new API features.)


@fluffdragon.1523 said: The flaw in the graphics pipeline is that with DX9, input processing, GPU on/offloading, and IO all happen from within the graphics pipeline. [...]
That's not what I mean. What I mean is that I've seen 50+ man boss moshpits (such as the shaman) chug along at "normal" fps (i.e. dropping into the 30s or lower due to all the player skill effects) and then suddenly, with no real reason and no apparent change in activity (the boss is still up and it's still a moshpit with skills all over the screen), it goes ultra smooth for around 5s. Then it instantly drops back to "normal" again.


@Dawdler.8521 said: That's not what I mean. What I mean is that I've seen 50+ man boss moshpits (such as the shaman) chug along at "normal" fps and then suddenly go ultra smooth for around 5s, then instantly drop back to "normal" again. [...]

Almost as if it was filling up some cache, and then it's handling more information than it can buffer again?


@"Infusion.7149" said:Well that's not really the fastest CPU in terms of IPC right now, the Ryzen 7nm+ 5800X / 5900X / 5950X are. (see https://www.techspot.com/article/2143-ryzen-5000-ipc-performance/ )

There's something called Amdahl's law, so even in a perfect scenario, unless most of the code is parallelized (we're talking over 90%), you lose per-core efficiency past 6 cores. (see https://www.techspot.com/article/998-cpu-performance-amdahls-law/ or the Wikipedia article.) The quick calculation below shows how hard the curve flattens.
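A minimal sketch, just plugging a generous 90% parallel fraction into the formula (an illustration, not a GW2 measurement):

```cpp
#include <cstdio>

int main()
{
    // Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
    // where p is the fraction of the work that parallelizes.
    const double p = 0.90;  // generous 90% parallel fraction
    const int cores[] = {1, 2, 4, 6, 8, 12, 16};
    for (int n : cores)
        std::printf("%2d cores -> %.2fx speedup\n",
                    n, 1.0 / ((1.0 - p) + p / n));
    // The output flattens hard: 6 cores give ~4.0x, 16 cores only ~6.4x,
    // and the ceiling is 10x no matter how many cores you add.
    return 0;
}
```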

Even in practice there are severe diminishing returns, as seen in this recent study of WoW's DX12 implementation: https://rk.edu.pl/en/analyzing-world-warcraft-multi-core-scaling/

As you can see, game FPS decreases nearly linearly with decreasing CPU clock frequency, with some gains in Dazar'alor. Those charts show that the game is still managed by the main thread working on one core, and only in some edge-case scenarios, when there is more GPU work than other logic, can it scale a bit better. Single-core frequency and efficiency (IPC) are king, while a stronger GPU comes into play only if you want better looks after you have provided the CPU power to achieve good FPS.

For a less MMO-type game, we have statements such as the following from Ubisoft:

WHAT IT DOES: Dynamic Texture Indexing helps us reduce CPU overhead by issuing fewer draw calls (a call to the graphics API to draw an object that will appear on screen). This is accomplished by having the GPU dynamically select the texture used in the shader, instead of binding it by using the CPU. The result is less pressure on the driver, and the freed CPU cycles can then translate into better CPU performance overall.

WHAT IT DOES: AsyncCompute is a hardware capability that allows us to execute tasks in parallel on the GPU, thus providing more tools and opportunities for better and improved optimization. Since the launch of Siege on consoles, we have been able to utilize AsyncCompute for console players to optimize graphics techniques such as Ambient Occlusion or Screen-Space Reflection. Graphics cards previously supported AsyncCompute, but the DX11 API did not allow us to utilize it. With Vulkan it is now possible to do so.

EXPECTED RESULT: With Vulkan and dynamic texture indexing, players who are CPU-bound should see better and more consistent frame rates.

What about a CPU-bound game, like one from Stardock?

In the most oversimplified sense, the biggest difference between the two new graphics stacks and DirectX 11 is that both Vulkan and DirectX 12 support multiple threads sending commands to the GPU simultaneously. GPU multitasking. Hooray. I.e. ID3D12CommandQueue::ExecuteCommandLists (send a bunch of commands and they get handled asynchronously). In DirectX 11, calls to the GPU are handled synchronously. You could end up with a lot of waiting after calling Present(). Don't get me wrong, DX11 is still way better than DirectX 9. In DX9, the main thread had to call the GPU.

The real world of game development

Which brings us back to the question: why didn't Stardock's new games stick with DirectX 11? And the answer is: the performance gain you get from Vulkan or DirectX 12 comes down to the type of game it is.

Case in point: Stardock has DirectX 12 and Vulkan versions of Star Control: Origins. The performance gain is about 20% over DirectX 11. The gain is relatively low because, well, it's Star Control. It's not a graphics-intensive game (except for certain particle effects on planets, which don't benefit much from the new stacks). So we have to weigh the cost of doubling or tripling our QA compatibility budget against a fairly nominal performance gain. And even now, we run into driver bugs on DirectX 12 and Vulkan that result in crashes or other problems that we just don't have the budget to investigate.

In the past there was the introduction of the 64-bit client, which, if I remember correctly, is the only drastically major client-side GW2 upgrade that we know of. That update (which was not the official client; it was labeled beta for the longest time) resulted in fewer crashes due to the 4GB VRAM + RAM limits. So there are changes that would benefit everyone, not just people with the latest hardware.

Given that both d912pxy and DXVK still have FPS drops, I would hazard a guess it comes down to the parallelization of the code that relies on networking, and also on client-side sequential calculations of everything from damage, conditions, range, LoS, etc. (everything not visual). The "servers" are actually Amazon AWS Elastic Compute (EC2) instances, presumably with the auto-scaling functionality.

You can test your connection to AWS servers via sites such as
and check reachability via Amazon directly

See also this statement from the Lead Engine Programmer for GW2, from 5 years ago:

GW2 does a lot of processing, and much of it is done on the main thread. That is also where its bottleneck tends to be: the main thread. There are conscious efforts to move things off the main thread and onto other threads (every now and then a patch goes out that does just this), but due to how multi-threading works it's a non-trivial thing that takes a lot of effort to do. In a perfect world, we could say "Hey main thread, give the other threads some stuff to do if you're too busy", but sadly this is not that world.

As for DX9 and 32bit: Moving off of DX9 wouldn't buy us a whole lot performance wise, as all interaction with DirectX is happening on the render thread, which is generally not the bottleneck. Moving from 32-bit to 64-bit also does not really buy us a lot performance-wise. There are some optimizations the compiler is able to do with 64-bit that it can't do otherwise, but the actual FPS gain is minimal at best.

And about crashing on Tequatl: here's one case where a 64-bit client could actually help. Many of the crashes happening on Tequatl (which are still quite few, mind you) are caused by memory fragmentation. The bigger memory address space of 64-bit apps could help prevent that. This becomes more of a problem the longer you keep your client running.
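(A toy illustration of why "moving things off the main thread" is non-trivial -- my own sketch, not ArenaNet's code: self-contained work offloads cleanly, but anything the renderer later reads still forces a sync point.)

```cpp
#include <future>
#include <vector>

struct Particle { float x, y, vx, vy; };

void UpdateFrame(std::vector<Particle>& particles, float dt)
{
    // Purely data-parallel work with no shared state: safe to offload.
    auto job = std::async(std::launch::async, [&particles, dt] {
        for (auto& p : particles) {
            p.x += p.vx * dt;
            p.y += p.vy * dt;
        }
    });

    // ...main thread handles input, network, and game logic here...

    // But rendering reads the particles, so the frame still has to
    // join before it can draw -- the sync point never goes away.
    job.get();
}
```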

So Stardock had an increase of 20% from DX11 to 12, and that's supposed to mean that GW2 won't have a noticeable increase? For a number of people, 20% is going to be absolutely noticeable, and I will be surprised if GW2 only experiences a 20% jump moving from DX9 to 12.


What everyone here wants is for the game to continue to exist and get updated for years to come. For that to happen, the game needs future-proofing. Even if it's a 20 to 30 to 40% jump in performance, a robust system to hide and minimize visual clutter, or even new lighting tech or a new renderer -- all these things would massively improve the feel and quality of the game.


@Lucio.4190 said:

Almost as if it was filling up some cache, and then it's handling more information than it can buffer again?

Not really; there is nothing to buffer or cache (you can't predict dynamic effects, and the players are already there). Skill lag doesn't behave like that at all, and since everything is done server-side it wouldn't really be a local processing issue unless something actually breaks (i.e. the game stops doing calculations on something, then suddenly starts again).

In terms of graphics, the only thing causing that massive a difference would be model count, but then again the game would hardly go "fuck I'm showing 50 players and your fps is tanking, now I'm only showing 25, lol jk here's 50 again, enjoy the low fps". And as I said, I never noticed anything visually change over those few seconds.

Bleh, I just find it to be weird behaviour. I never recorded it, nor have I cared about it for years.


Look, it's great that you've all put these hypotheses and theories down in the absence of any input from ANet, who should be the ones making excuses, but when a lot of us can run modern games at 4K with better performance than we get in a game over 10 years old, it's time to acknowledge that improvements are a necessity. I mean, you read about 30 fps on a close-to-top-of-the-line rig, and then you hear arguments that it's not the bestest best best rig. That's not moving goalposts, that's changing the sport.

I just got PoF & the other expansion and it's running appallingly. It's like I'm running a Voodoo 2. ANet should be ashamed.


@"Paul.4081" said:Look it's great that you've all put these hypothesis and theory down in the absence of any input from Anet who should be the ones making excuses but when a lot of us can run modern games at 4k and have better performance with them than we do on a game over 10 years old it's time to acknowledge that improvements are a necessity. I mean, when you're reading about 30fps on a close to top of the line rig then hearing arguments against saying that's not the bestest best best rig and that's great. That's not moving goalposts that's changing the sport.

I Just got PoF & the other xpac and it's running appallingly. It's like I'm running a Voodoo 2. Anet should be ashamed.

Do they need to put out a press release every time they make a change in the background? They already put out notices when they work on the network.

Which modern MMO games that rely on network I/O for 100+ people perform well for you? At the end of the day, you can't treat ArenaNet like a company making a single-player game.

It's also important to distinguish between a 30 FPS minimum in a CPU-bound scenario and 30 FPS everywhere. Do you really need 144 FPS for open world?

As an aggregate across all CPUs, the most common GPU on Steam (the GTX 1060 6GB) gets:

48 FPS rating on UserBenchmark at 1080p: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/153864.0.High.1080p.0
50.7 FPS rating on UserBenchmark at 1440p: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/153864.0.High.1440p.0

Others:

107 FPS i7-8700K + GTX 1080 Ti, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/251565.347462.High.1080p.0
95 FPS i7-6700K + GTX 970, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/14719.32631.High.1080p.0
93 FPS i7-7700K + GTX 1070, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/141989.178732.High.1080p.0
85.7 FPS i7-7700K + GTX 1080, 1080p "max": https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/137575.178732.0.1080p.0
85 FPS R5 2600X + GTX 1060 3GB, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/165268.466081.High.1080p.0
80 FPS i7-8700K + GTX 1050 Ti, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/188434.347462.High.1080p.0
63 FPS i5-4690K + GTX 760, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/7650.11612.High.1080p.0
68 FPS i5-3570 + GTX 1060, 1440p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/153864.793.0.1440p.0
59.3 FPS i5-6500 + GTX 770, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/7658.34775.High.1080p.0
50 FPS Ryzen 5 1600 + GTX 1050 Ti, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/0.258318.High.1080p.0
50 FPS i7-4790K + GTX 1070, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/141989.11601.High.1080p.0
50 FPS i5-6600K + GTX 1070, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/141989.32899.High.1080p.0
40 FPS i5-4460 + GTX 1050 Ti, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/188434.10022.High.0.0
40 FPS i5-2500K + GTX 1060 6GB, 1080p high: https://www.userbenchmark.com/PCGame/FPS-Estimates-Guild-Wars-2/3733/153864.46.High.1080p.0

If it runs like trash everywhere, you might not even have DX9 installed properly.

If you actually watch the GDC presentation, it is stated there were problems even porting the GW1 server to 64-bit Windows. What happened is that while testing worked, once it was put on virtual machines the paging caused CPU usage spikes undetectable by Windows monitoring, due to the polling rate. On the 32-bit systems it was designed for, running out of memory results in crashes instead. Can you imagine if they redid the whole engine instead of focusing on low-hanging fruit for the CPU bottlenecks? Most of the existing art assets would probably be unusable, because the in-house tools used to work with the existing engine would need to be redone.

That's not even including quotes such as these from Bill Freist (no longer at ArenaNet): https://www.guildwars2.com/en/news/bill-freist-talks-optimization-and-performance/

At present, Guild Wars 2 has already received a massive amount of optimizations. This means that players who have suffered from poor performance in the past should see an improvement to their frame rate. But there are still discrepancies between individual systems. As we investigate those, it leads us to discover interesting bottlenecks. For example, here's a chart showing how much of a difference a graphics driver version can make. [...] Keep in mind that having an adequate processor, graphics card, and sufficient system memory are all important to having a smooth gameplay experience.

It's known that it has used Umbra (CPU-side) culling as well: https://blog.umbra3d.com/about-us/press-releases/arenanet-powering-the-world-of-guild-wars-2-with-umbra-3-rendering-optimization

@"zealex.9410" said:So stardock had an increase of 20% from dx11 to 12 and thats supposed to mean that gw2 wont have a noricable increase? For a number of ppl 20% is gonna be absolutely noticable and i will be surpised if gw2 only experiences a 20% jump koving from dx9 to 12.There's a difference between a person putting up a dx9 to dx12 on github and a game company actually rewriting their code and having to support it. Case in point, look at the mac client. Also see the performance differential semi-tabulated a few posts ago between d912pxy and the actual 64 bit client.

The closest thing to a proper implementation is DXVK, since it works on multiple platforms and across multiple games, and is officially endorsed by Steam Play (through funding, not just "use this").

Instead of conjecture, people ought to specify:

  1. What CPU they're using and at what clocks (if you're using a 2GHz dual-core, 4-thread notebook chip, don't expect miracles)
  2. How much RAM they have and whether they're using a pagefile at all (because, as stated above, a 64-bit client doesn't crash from running out of RAM unless you don't use a pagefile)
  3. Whether they're using an SSD, whether there is a pagefile, and, if the pagefile is turned off, whether there is an out-of-memory error
  4. What ping they have
  5. What they are having issues with, within reason. Expecting a 60 FPS minimum with 100+ people on highest settings is not realistic when the game is CPU-bound for all physics, and for shadows/reflections if they're turned on. For a period, LS4 and PoF maps were performing poorly due to networking and not the client.
  6. If settings are turned up, the VRAM usage out of the total GPU VRAM, assuming GPU load is not always 100%
  7. How much of a change is actually due to the non-rendering parts (i.e. using d912pxy or DXVK to isolate which bottlenecks aren't related to DirectX)
  8. Whether CPU-bound settings such as model limit, shadows, and reflections are on max or low

@Kirin.7306 said: Isn't GW2 the least profitable of all the games the parent company has? Why would they waste money on a new engine for it? Not saying they shouldn't.

The release on Steam could be a reason. There will be lots of new players, but the question is whether they get an experience that makes them stay. In my opinion, a new engine and a complete upgrade would increase those chances.


@Kirin.7306 said: Isn't GW2 the least profitable of all the games the parent company has? Why would they waste money on a new engine for it? Not saying they shouldn't.

GW2 is the most profitable game NCSoft has outside Korea and has been since its release. According to the latest quarterly report (Q3 2020), NCSoft made 27,385 million KRW in NA/EU, of which Guild Wars 2 earned 18,818 million KRW, so Guild Wars 2 is making more than half the money of NCSoft West. As individual games go, Guild Wars 2 surpassed Aion in Q3 2019 (so last year), while in Q3 2020 Guild Wars 2 also surpassed Blade & Soul in total earnings, becoming NCSoft's third-best game (in revenue) behind the two Lineage games.


@zealex.9410 said: Speaking of other NCSoft titles, both L2 and BnS are getting engine upgrades. In the case of L2, I think the game is moving from Unreal 2.5 to Unreal 4, which is a huge leap in visual fidelity and performance.

It's not a direct engine upgrade; Blade & Soul Complete is a separate server (Frontier World) and requires a new client. Also, there's Blade & Soul 2, coming on mobile, not PC.

Lineage 2M (mobile) is using Unreal Engine 4, as did Lineage 2: Revolution. Lineage 2: Remastered hasn't had any official release announcement.

There's also the announced Aion 2, on mobile.

Everything is coming on mobile because that is where the money is to be made.

You have to realize Unreal Engine 4 requires a 5% royalty.


@"zealex.9410" said:Speaking of other ncsoft titles, both l2 and bns are getting engine upgrades. in the case of l2 i think the game is moving from unreal 2.5 to unreal 4 which is a huge leap in visual fidelity and performance.

Korean MMOs invest a lot in their graphics; they are built to attract audiences that are after highly detailed characters and environments. So for them it makes sense to upgrade their engines; plus, going from one version of Unreal Engine to another, there is a "porting" process available. And while I wouldn't call Guild Wars 2's graphics bad, graphics are not high on anyone's list of reasons to start playing it.


@Infusion.7149 said:

You have to realize Unreal Engine 4 requires a 5% royalty.

I don't get what this has to do with the opinion that GW2 needs an upgrade.

Microsoft provides guides for migrating DX9 to DX12 (not directly, but in steps), the newer Visual Studio provides helper libraries (I think the GW engine is written in C++), and there are also open-source helper libraries to assist in translating code from legacy D3D versions to D3D12.

This should have been started a long time ago, with the announcement of PoF. There is no excuse not to do it while the game is still very profitable.
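For example, with Microsoft's open-source d3dx12.h helper header (one of the libraries of the kind mentioned; this sketch assumes it's available in the include path), a resource transition that would otherwise take a dozen lines of raw struct-filling collapses to this:

```cpp
#include <d3d12.h>
#include "d3dx12.h"  // Microsoft's open-source D3D12 helper header

// Transition the back buffer from "presentable" to "render target".
void BeginFrame(ID3D12GraphicsCommandList* cmd, ID3D12Resource* backBuffer)
{
    const CD3DX12_RESOURCE_BARRIER barrier =
        CD3DX12_RESOURCE_BARRIER::Transition(
            backBuffer,
            D3D12_RESOURCE_STATE_PRESENT,
            D3D12_RESOURCE_STATE_RENDER_TARGET);
    cmd->ResourceBarrier(1, &barrier);
}
```

Helpers like this reduce the porting boilerplate, though of course they don't restructure an engine around the new API.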


@Infusion.7149 said: You have to realize Unreal Engine 4 requires a 5% royalty.

I'm quite certain that a 5% royalty is far less than the cost of an in-house engine team working on your game, especially if said engine team is large enough to make big changes to a game, like adding ray tracing or developing a DirectX 12 version of the engine. Meanwhile, you get all those upgrades for free by using Unreal Engine. In the end, that fee depends on how cutting-edge you want a game to be, as developing cutting-edge technology on your own would require a lot more money than a 5% royalty fee.


@maddoctor.2738 said: I'm quite certain that a 5% royalty is far less than the cost of an in-house engine team working on your game... [...]

The thing is, people aren't asking for an engine upgrade because of prettier graphics; GW2 looks OK for an MMO. People are asking for an engine upgrade looking for better performance, so they can play big events on a normal machine at decent FPS.


@anduriell.6280 said: The thing is, people aren't asking for an engine upgrade because of prettier graphics... [...]

Indeed, though in the (currently unlikely) event an engine upgrade does happen, a few graphical tweaks would be a good thing, as would an option to turn down other players' skill animations and particle effects.


@"anduriell.6280" said:I don't get what has to do this with the opinion GW2 needs an upgrade.

Microsoft provides guides to migrate Dx9 to DX12 (not directly but by steps), the new Visual Studio (i think GW engine is done in C++) provide helper libraries and there are also opensource helper libraries to help to translate the code from legacy D3DXX to D3D12.

This should have been started already long time ago with the announcement of PoF. There is no excuse not to implement this as the game is still very profitable.

If they upgrade the engine, it would likely need to branch into mobile in the long run, since that's where the money is, which likely means not DX12 (iOS is supported via MoltenVK on top of Metal; Vulkan also allows Google Stadia support). People are assuming it is a stock Unreal engine, so that it would be an easy port, when we know it is heavily modified. Porting over without any reimplementation of memory management would not result in any appreciable gains over the add-ons people use, and any in-house tools required to make content would need to be re-tooled (assuming that is even possible, considering some tools may not be in-house). See https://docs.unrealengine.com/en-US/SharingAndReleasing/Mobile/Android/VulkanMobileRenderer/index.html and https://docs.unrealengine.com/en-US/SharingAndReleasing/Mobile/iOS/DeviceCompatibility/index.html

Leveraging the same consumer base is not going to bring in much more money to fund something that massive, which is why a Steam launch would need to be a massive success. The performance difference between Vulkan and DX12 is such that either way there would probably need to be massive changes if the features that actually matter, namely async compute, are implemented.

Also, as an example, Basemark GPU scores (Vulkan vs DX12):

RTX 3090: 21.5K Vulkan vs 20.3K DX12
RTX 3080: 18K vs 17.1K
RX 6900: 17.3K vs 18.2K
RX 6800: 13.8K vs 14.6K
RTX 3070: 13.4K vs 12.7K
RTX 3060 Ti: 12K vs ~11.4K
RTX 2080: ~11.4K vs ~10.2K
GTX 1080 Ti: 10.7K vs 10K
RX 5700 XT: 8.4K vs 8.7K
RTX 2060: 7.7K vs 7K
RX 5600 XT: 6.4K vs ~6.6K
RX Vega 56: 6.2K vs 6.5K
GTX 1070: 6.3K vs 5.9K
GTX 1650 Super: ~4.9K vs 4.2K

Similarly, nobody has actually posted concrete evidence that the d912pxy / DXVK workaround has eliminated all frame-drop problems, even if it has improved performance. It goes deeper than just the render thread. We know there has been work to improve multithreading, but some things are inherently sequential. Per Amdahl's law above, if only 50% of the code is parallelized, core counts past 2 yield negligible gains; with 75% parallelized, only up to about 4 cores help, even on a new graphics API.

@maddoctor.2738 said: I'm quite certain that a 5% royalty is far less than the cost of an in-house engine team working on your game... [...]

People are asking ArenaNet to upgrade the engine, dump money into it, and then possibly pay a 5% royalty, because they feel it isn't performing well enough. Many MMOs use older Unreal Engine versions precisely because there is no royalty.



  • 2 months later...

@Dawdler.8521 said: That's not what I mean. What I mean is that I've seen 50+ man boss moshpits chug along at "normal" fps, suddenly go ultra smooth for around 5s, then instantly drop back to "normal" again. [...]

I'll only go back to playing after the engine is changed.


@maddoctor.2738 said: Korean MMOs invest a lot in their graphics; they are built to attract audiences that are after highly detailed characters and environments. [...]

I think this is an important point that isn't mentioned enough. Many other games (in particular the Korean ones) that successfully undergo remastering have an audience that generally places much higher importance on appearance, and they do not often combine massive zergs with a huge diversity of eye-stabbing effects.

And guess what? Even in titles that are hugely defined by just how nice they look, the devs often don't even try to address performance issues with zergs. Case in point: BDO is sometimes held up as an example of a game that benefited hugely from a remastering. It certainly does look very nice, and arguably that nice look makes up a huge portion of what defines the game itself. However, even in a game so strongly defined by its sexy appearance and smooth performance, there is absolutely no effort to optimize zerg performance. The accepted answer is: "want to play large scale? accept potato." BDO literally has a potato setting for large-scale PvP. In fact, it has 3 separate developer-coded potato presets for that express purpose.

GW2 has zergs not only in WvW but all over PvE as well. Frankly, the fact that some of us (myself included) can keep massive events at ~25 FPS with max graphics and relatively minimal stuttering is something of a miracle in my eyes.

I don't play a huge variety of games right now, so I'm not sure if I'm missing examples of a game that looks at least as good as GW2 and performs consistently and markedly better in the FPS-killing events. I'd be happy to hear about some, but I suspect those titles would still differ significantly from GW2 in baseline graphical quality and in how central zerg events are.


Upgrading the engine could be nice for bringing advanced graphics technology to the client, but I have to wonder whether it could also bring advancements to content development. If they incorporate new rendering methodologies or techniques, could they generally improve their content pipeline? If so, then I think engine modernization would be a fruitful goal. Even if not, DX9 can only be supported for so long, right? Can you really justify making new content for a DX9 game in the long term?


  • 1 month later...
On 12/26/2020 at 6:33 AM, The Greyhawk.9107 said:

Questionable whether Anet could pull this off and still put out at least some content. Not strictly against this but I'll not hold my breath.



The FPS problem is related to this game's network code. To solve it, ANet has to find a workaround to simplify and reduce the information flow being sent to the server and to clients in the same event/area. Then the render thread will get updates faster and be permitted to draw more, thus increasing overall FPS during crowded events.

If DirectX or the render thread were the root cause of low FPS (as many of you think), then you would not get 200 FPS on max settings when alone either. It could not pull that off at all.

Edit: While DirectX / the render thread can be problematic, let me rephrase: it's not the root cause of the FPS loss.


8 hours ago, fatihso.7258 said:

If DirectX or the render thread were the root cause of low FPS (as many of you think), then you would not get 200 FPS on max settings when alone either. It could not pull that off at all.

What? That doesn't follow at all. If the current rendering system doesn't handle certain things efficiently, it can of course still be the problem even when many of those things aren't on screen with you at once.
