
When is it time to update from DX9 to something later?


Towelie.9504


Higher versions of DX offer improvements in frame rate, higher-quality shadows, antialiasing, and overall image quality, and make better use of the GPU.

We see even older games than GW2 get updates (WoW: BFA brought in DX12), and they really don't need to, since the fps in that game isn't even terrible. In GW2 it often is, particularly during meta raids and especially in WvW.

Is there any roadmap at this time to update from DX9? It's over 15 years old now.

Sorry, no idea where else to post this question; this seemed like the best fit.


@Ayrilana.1396 said: Upgrading won't have much of an impact on performance.

To what degree? The current performance isn't due to CPU or GPU bottlenecks; it is clearly about DX9. You can't have players with 8700Ks and 1080 Tis getting 30 fps at best while drawing only ~100 players on screen.

DX9 has an extremely low draw call limit compared to DX11 and DX12. DX9's API was well known to be extremely inefficient and heavy, which is why DX10 came out so quickly.

Further evidence: Camelot Unchained has already demonstrated 80+ FPS on a 1070 while drawing almost 400 players casting continuously on the same screen, and that engine is still in alpha testing.

It is clearly the engine being the bottleneck.


Ah, gotcha, thank you for the update. So if threading is the source of the problem, why has there been very little focus on improving it? At any given time I see 30+ threads in use by GW2, but I am assuming they aren't being used for rendering and that's being handled entirely on the main thread?

Either way, raising the draw call limit and lowering the congestion on the API should yield the dramatic improvements I would expect from moving DX9 to DX11/12. You go from a draw call limit of 9,000 or so to millions, do you not? In addition, the API's overhead is lower, so less CPU time is wasted just driving it.
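
To make the draw-call point concrete, here is a minimal sketch of the two submission patterns, assuming an already-initialized device/context and buffers; `Object`, `device9`, and `context11` are stand-in names for illustration, not GW2's actual code:

```cpp
#include <d3d9.h>
#include <d3d11.h>
#include <vector>

struct Object { D3DMATRIX worldMatrix; UINT vertexCount; UINT triCount; };

// Assumed to be created and initialized elsewhere; illustrative only.
extern IDirect3DDevice9*    device9;
extern ID3D11DeviceContext* context11;

void drawDX9Style(const std::vector<Object>& visibleObjects) {
    // One state change plus one draw call per object: CPU cost grows
    // linearly with object count, which is what caps the draw budget.
    for (const Object& obj : visibleObjects) {
        device9->SetTransform(D3DTS_WORLD, &obj.worldMatrix);
        device9->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                      obj.vertexCount, 0, obj.triCount);
    }
}

void drawDX11Instanced(UINT indexCountPerObject, UINT instanceCount) {
    // With instancing, per-object transforms live in a buffer the GPU reads;
    // the CPU pays one round of API overhead for thousands of objects.
    context11->DrawIndexedInstanced(indexCountPerObject, instanceCount, 0, 0, 0);
}
```

Of course, an engine written around the first pattern doesn't get the second one for free, which is the crux of this whole thread.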


@"Inculpatus cedo.9234" said:Perhaps, reading the Lead Engine Programmer's statements could offer some insight into why changing to DX10,11,12 would not improve FPS, etc.

I think I got the main gist of it here:

Which brings us to GW2. GW2 does a lot of processing, and much of it is done on the main thread. That is also where its bottleneck tends to be: the main thread. There are conscious efforts to move things off the main thread and onto other threads (every now and then a patch goes out that does just this), but due to how multi-threading works it's a non-trivial thing that takes a lot of effort to do. In a perfect world, we could say "Hey main thread, give the other threads some stuff to do if you're too busy," but sadly this is not that world.

As for DX9 and 32-bit: moving off of DX9 wouldn't buy us a whole lot performance-wise, as all interaction with DirectX happens on the render thread, which is generally not the bottleneck. Moving from 32-bit to 64-bit also does not really buy us a lot performance-wise. There are some optimizations the compiler can do with 64-bit that it can't do otherwise, but the actual FPS gain is minimal at best.

But please let me know if I am incorrect.

It seems like there's a threading issue in GW2's engine; not that the problem isn't solvable, but that it's very hard to solve. It's a six-year-old game, though, and this engine was reused heavily from GW1, wasn't it? When is it going to be time to solve this, given that I would expect the next expansion not to be the last?

I would say that if a lot of the interaction is on the main thread, which is doing a lot of the rendering legwork, wouldn't optimizing the API that thread uses and increasing the number of draws being sent to the GPU improve this substantially? GPUs right now are barely even utilized: my 1070 sits around 35% even in massive zerg fights where my FPS drops below 15. There's not much people can do hardware-wise to solve this other than get higher-end CPUs and overclock them, and even then your 1070/1080 from two years ago is still barely utilized. I'd much rather be bottlenecked at the GPU processing what it is given than at the CPU sending work to the GPU, since bottlenecking on the CPU hurts performance for the entire machine rather than just the GPU struggling (which also makes lowering graphics quality settings more beneficial for older machines; right now older machines and newer high-end machines are in the same boat!).

I don't imagine Anet has actually tested with DX10/11/12 to measure the improvement, and even if the performance gains are "minimal", that would still be a better improvement than what we've received for years now, wouldn't it? DX9 can't even use some of the newer display driver models properly (and by newer I mean years old, not brand new).

Another obvious benefit of DX10/11/12 is prettier graphics. Don't get me wrong, this game is gorgeous given how old it is, but why stop there, especially if we're possibly expecting an underwater expansion at some point, where prettier shadows, better lighting, etc. would be a very nice QoL update. Is it really necessary for players to use third-party renderers to make the game look modern, graphics-wise?

Also, I'm not trying to come off like I'm bashing ArenaNet or any of its developers. I know this is a challenging task, but my source of frustration is that graphics lag is a very real problem felt daily by a lot of players, and there's been very little momentum toward solving it, especially when there are MMOs out there that (seemingly) outperform GW2 at even more demanding loads.


@Towelie.9504 said: Higher versions of DX offer improvements in frame rate, higher-quality shadows, antialiasing, and overall image quality, and make better use of the GPU.

We see even older games than GW2 get updates (WoW: BFA brought in DX12), and they really don't need to, since the fps in that game isn't even terrible. In GW2 it often is, particularly during meta raids and especially in WvW.

Is there any roadmap at this time to update from DX9? It's over 15 years old now.

Sorry, no idea where else to post this question; this seemed like the best fit.

There have been plenty of discussions before; do a search on the forum. The primary bottleneck in this game is the CPU; a newer rendering engine will do jack .... to improve your fps.

Migrating to a new engine is too much work because there's already decades' worth of complex code.

The only solution is to build a brand-new engine from the ground up with GW3.


Will a multicore monster like a 32-core Ryzen Threadripper (with 64 threads) have any advantage over an outdated 4-core/8-thread Intel CPU?

I find the CPU matters more than the GPU when playing GW2, especially in crowded zones like Lion's Arch. You can put everything on LOW and still get hiccups because of an old CPU.


@milomilome.6837 said: Will a multicore monster like a 32-core Ryzen Threadripper (with 64 threads) have any advantage over an outdated 4-core/8-thread Intel CPU?

I find the CPU matters more than the GPU when playing GW2, especially in crowded zones like Lion's Arch. You can put everything on LOW and still get hiccups because of an old CPU.

To get the best performance out of GW2, you want a CPU with high per-core performance. Intel is generally superior to AMD in this regard.

For example, my old i5-4670K (4 cores, no hyper-threading), even at base clocks, is superior to an AMD Ryzen 7 2700X (8 cores, 16 threads) in performance per thread.

So for GW2, even old Intels are preferable.


@dace.8019 said:

@"milomilome.6837" said:will a multicore monster like 32 core Ryzen Threadripper (with 64 threads) have any advantage over 4 core 8 thread outdated intel cpu ?

i find CPU matters more than GPU when playing GW2 , especially in crowded zones like Lion Arch .. you can put everything LOW and still hiccuped because you got old cpu

To get the best performance out of GW2, you want a CPU with a high per-core performance. Generally, Intels are superior to AMD in this regard.

For example, my old i5-4670K (4 cores, no hyper-threading), even at base clocks, is superior to a AMD Ryzen 7 2700X (8 cores, 16 virtual cores) in performance per thread.

So for GW2, even old Intels are preferable.


The CPU's SPEED is what matters most, not the number of threads....

GW2 requires a very fast processor (CPU), preferably overclocked, to maximize the number of instructions processed. Because Intel can process more instructions per thread/core per cycle (IPC) than AMD, Intel tends to be faster per clock tick. HT (Hyper-Threading) doesn't help: Hyper-Threading makes use of unused clock cycles on a CPU core, which speeds up multicore loads, but GW2's single MAIN thread doesn't really benefit; if it has to wait for anything, it loses speed.
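
A back-of-envelope sketch of that IPC × clock point; the numbers below are invented for illustration, not benchmarks of any real CPU:

```cpp
#include <cstdio>

// Per-thread throughput is roughly IPC x clock. The figures here are
// made up to illustrate the relationship, not measured values.
int main() {
    double ipcA = 1.3, clockA = 4.5;  // hypothetical high-IPC, high-clock quad core
    double ipcB = 1.1, clockB = 3.7;  // hypothetical many-core chip at a lower clock

    // GW2's main thread runs on one core, so this product, not core count,
    // is what predicts its frame rate.
    printf("relative single-thread throughput: %.2f vs %.2f\n",
           ipcA * clockA, ipcB * clockB);
    return 0;
}
```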

A larger core count generally lowers a CPU's base clock; this holds true for most processors. The fact that Threadripper (12-32 cores with 24-64 threads) uses 2 or 4 connected CPU dies spreads heat and allows a somewhat higher base clock, but that doesn't solve the problem; you can see that lower-end models run faster. And since core count is mostly irrelevant for GW2, we get back to IPC: looking at single-core performance, Intel still outperforms AMD.

Large core counts also do not help when the game uses one main thread and 2-4 threads at most. You can run GW2 perfectly on any 4-core processor (even a hex-core is overkill), so speeding it up means getting a faster CPU. The CPU makes a real difference: boosting my CPU's clock from 3.2 to 4.0 GHz makes a 25% difference (4.0 / 3.2 = 1.25). I have a 3.2 GHz CPU with a 3.8 GHz boost; running a permanent 4.0 GHz is not a significant OC, tbh, but I have a good mobo and a HUGE air cooler, AND I pay more for electricity.

You can lower CPU usage by reducing your field of view, lowering the number of characters shown, and lowering shadows and detail. Zooming in already boosts FPS. Some older tricks also help: disabling unnecessary programs and utilities matters when you have few cores, disabling Hyper-Threading actually makes a minor difference sometimes, and overclocking RAM, if possible, also helps a bit (not much, though).


Replacing DX is not an answer... it could be a very small part of a solution.

DirectX is mostly a map of which devices, and which routes to those devices, are available, so coders don't have to write specific code for specific hardware. The DirectX package only provides easier-to-use commands for hardware access; it translates software commands into instructions for each specific device. Swapping in a different DirectX package does not improve the original code, because the commands used in GW2 were written with DX9 in mind. Rescripting the whole engine for DX10/11/12 is mostly figuring out how the commands should be (re)implemented, and basically requires a complete rewrite of all the code as it is now.

An example (not accurate, and not based on the actual code, but to give an idea of the problem):

GW2 (original engine) with DX9: say you had an old API with a command to sum 2 variables and you needed to sum 4. You would do sum a+b, sum c+d, sum (a+b)+(c+d); passed to DX9, that is 3 commands, and you'd get your value in your device (GPU).

Now just drop in a DX11 package:

GW2 (original engine) with DX11: if the DX11 package can do it with 1 command for all 4, "sumall (a+b+c+d)", that's nice... But the game engine still holds the original code doing sum a+b, sum c+d, sum (a+b)+(c+d), which would be passed to DX11 and still execute the same 3 commands, and you would gain nothing in the end.

That's because the original game engine is not rewritten to benefit from the ADDED instructions in DX11 that were not yet available in DX9; you end up with the same value in your device after the same number of calculations.
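
The same idea as runnable plain C++; `sum2` and `sum4` are stand-ins for API commands, not real DirectX calls:

```cpp
#include <cstdio>

// Stand-ins mirroring the hypothetical example above.
int sum2(int a, int b) { return a + b; }                     // the only "command" the old API had
int sum4(int a, int b, int c, int d) { return a + b + c + d; } // a cheaper "command" the new API adds

int main() {
    int a = 1, b = 2, c = 3, d = 4;

    // Engine code written with only the old command in mind: three calls,
    // three rounds of API overhead. Merely linking a newer API changes nothing;
    // the engine still issues the same three calls.
    int total = sum2(sum2(a, b), sum2(c, d));

    // Only after rewriting the engine does it use the one-call path:
    int total2 = sum4(a, b, c, d);

    printf("%d %d\n", total, total2); // same result; the savings are in overhead, not the answer
    return 0;
}
```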

Rewriting the engine.....

GW2 (remade) with ??? could use a scalable core engine (say 4 cores, with support up to 8), with whatever options the programmers want to consider; handing most of the 3D rendering to the GPU would be a huge advantage. It would require tens (or hundreds) of people and many, many hours to rewrite the engine, debug it, and reimpose GW2 on top of it. This could take years. However, it would open up other options: maybe Vulkan or OpenGL could be considered, even ray tracing could be implemented. BUT it requires a full rewrite of ALL of Guild Wars 2's engine to maximize the gains from the newer commands. It would be very nice for consumers and the game, and the gain for NCSoft and ArenaNet would be a modern, up-to-date engine. I doubt, however, that they could or would ever fund this.

However, the basis of the GW2 engine code we are using now was written for AGP 4x and 8x graphics solutions, with the occasional PCI-E board with roughly 256 MB of VRAM, nearly 15 years ago, when DX8 was still considered pretty advanced.


@Towelie.9504 said:

@"Inculpatus cedo.9234" said:Perhaps, reading the Lead Engine Programmer's statements could offer some insight into why changing to DX10,11,12 would not improve FPS, etc.

I think I got the main gist of it here:

Which brings us to GW2. GW2 does a lot of processing, and much of it is done on the main thread. That is also where its bottleneck tends to be: The main thread. There are conscious efforts in moving things off the main thread and onto other threads (every now and then a patch goes out that does just this), but due to how multi-threading works it's a non-trivial thing that take a lot of effort to do. In a perfect world, we could say "Hey main thread, give the other threads some stuff to do if you're too busy", but sadly this is not that world.

As for DX9 and 32bit: Moving off of DX9 wouldn't buy us a whole lot performance wise, as all interaction with DirectX is happening on the render thread, which is generally not the bottleneck. Moving from 32-bit to 64-bit also does not really buy us a lot performance-wise. There are some optimizations the compiler is able to do with 64-bit that it can't do otherwise, but the actual FPS gain is minimal at best.

But please let me know if I am incorrect.

It seems like there's a threading issue in GW2's engine. Not that the problem isn't solvable but that it's very hard to solve. It's a 6 year old game though? And this engine was reused a lot from GW1 wasn't it? When is it going to be time to solve this given that I would expect the next expansion not to be the last?

I would say if a lot of the interaction is on the main thread which is doing a lot of the rendering legwork, wouldn't optimizing the API that main thread is using
and
increasing the amount of draws being sent to the GPU to get it to do improve this substantially? (since GPUs right now are barely even utilized? My 1070 sits around 35% even in massive zerg fights where my FPS drops below 15.) There's nothing hardware wise people can do much to even solve this other than get higher end CPUs and overclock them and even then your 1070/1080 from 2 years ago are still barely utilized. I'd much rather have a bottleneck issue at the GPU processing what it is getting than on the CPU sending stuff to the GPU as you're hurting the performance for the entire machine by bottle necking on the CPU rather than just the GPU struggling (which makes lowering graphic qualitity items even more beneficial for older machines. Right now older machines and newer high end machines are in the same boat!)

I wouldn't imagine Anet has actually tested with DX10/DX11/12 to see its improvements, and even if the performance improvement are "minimal" that would be better improvement than what we've received for years now wouldn't it? DX9 can't even use some of the new display drivers properly (and when I say new, I mean years old new, not brand new).

Other obvious benefits of using DX10/11/12 is prettier graphics. Don't get me wrong this game is
gorgeous
given how old it is but why stop there especially if we're expecting an underwater expansion possibly at some point where drawing prettier shadows, better lighting etc would be a very nice QoL update. Is it really necessary for players to have to use third-party renderers to get the game to look modern graphic wise?

Also not trying to come off like I am bashing ArenaNet or any of its developers in any way. I know this is a challenging task but my source of frustration is that graphics lag is a very real problem that is felt daily to a lot of players and there's been very little inertia generated to solving it especially when there are MMOs out there that are out performing at even much more demanding loads (seemingly)

I don't think you fully grasp the magnitude of the difference between an API and an engine architecture. The issue isn't the API; it's the circa-1990s monolithic process architecture that most engines from that era are built around... a time long before dual cores existed on anything other than a high-end server. There's also a big reason DX12/Mantle/Vulkan and other "multithreaded renderer" APIs are extremely rare in practice, and often show equivalent, if not worse, performance than engines optimized for previous APIs... real-time graphics engines do NOT like being async.

In order to get any real advantage from multithreading, your process has to be structured around something like a job system, and it has to tolerate asynchronous execution. In previous generations, nearly everything followed a sequential processing model, where any instruction that depended on data from a previous instruction was guaranteed to have that step completed. In a multithreaded model, things have a very high chance of being done out of order, especially when tasks have different job lengths, and you're not guaranteed that tasks 1-4 are finished by the time task 5 is loaded and ready to execute. The higher the interdependence on data between threads, the higher the probability a given thread will have to pause and wait for that data to show up, as in the sketch below.
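
A toy illustration of that ordering problem, using std::async as a stand-in for an engine job system; the task lengths are invented:

```cpp
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

// Simulates a job of a given length; prints when it completes.
int simulateTask(int id, int ms) {
    std::this_thread::sleep_for(std::chrono::milliseconds(ms));
    printf("task %d done after %d ms\n", id, ms);
    return id;
}

int main() {
    // Tasks 1-4 run concurrently with different job lengths, so they
    // complete out of order (task 2 typically finishes first).
    auto t1 = std::async(std::launch::async, simulateTask, 1, 30);
    auto t2 = std::async(std::launch::async, simulateTask, 2, 5);
    auto t3 = std::async(std::launch::async, simulateTask, 3, 20);
    auto t4 = std::async(std::launch::async, simulateTask, 4, 10);

    // "Task 5" depends on all of them; each .get() blocks until that result
    // exists, so the slowest task (t1) dictates when task 5 can start.
    int inputs = t1.get() + t2.get() + t3.get() + t4.get();
    printf("task 5 starting, inputs ready (checksum %d)\n", inputs);
    return 0;
}
```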

A database system can easily get away with multithreading, because the kind of job it does doesn't require a long list of instructions to produce its output, making it a perfect candidate for that environment: if you have 1000 retrieve requests, each is a 1-for-1 output, limited only by the number of queues the server supports. That doesn't work well for 3D rendering (games especially), because you need ALL the data to accurately track all the objects and display them. Missing data means missing objects, and the visual nature of real-time full-motion video means rapid changes and/or inconsistencies catch our attention. For instance, if a player model were flickering in and out each frame, you would definitely consider that a problem. Now imagine if skills missed randomly 50% of the time because the attack or the target was skipping frames.

The underlying cause of this is the highly time-sensitive nature of tick rates and, by extension, frame rates. A game has 16.67 ms per frame to do everything it needs to in order to meet the 60 FPS minimum everyone now expects of every game; the higher the desired frame rate, the shorter the time window. Now consider that 16 ms is roughly a third of what we consider a good ping (~50 ms) to a game server; so realistically, a full rendering loop involving the server would only manage 30 FPS at best, due to how long it takes to get data for every frame. While this is a completely unrealistic hypothetical, it exemplifies what a delay does to a system that demands 100% data integrity to do its job, but also demands low latency for performance.

Even with a job system that breaks tasks down so multiple threads can work on them at the same time, if any one thread in a set takes longer than the others, the main thread has to wait for that ONE thread to finish before moving on to the next step. And it can't simply skip that step while it waits, because that data is extremely likely to be important to the next step in the process. Right now, most games try to work around this bottleneck with "time travel": in short, most games lag a few frames behind reality, or a system speculates on behalf of missing data and tries to mask corrections down the line. But all that really buys you is wiggle room for jitter in latency from task to task, or frame to frame. Staggering your target times and tasking helps improve hardware utilization, but doesn't change the fact that you're still mostly running tasks in sequential order.

And because games have a lot of dependencies, need pretty much all of the data to render correctly, and do a LOT of things that take different amounts of time to execute, even if you structured the system to be parallel, you would still spend inordinate amounts of time waiting for one thing to finish before another can start. DX12 and Vulkan only provide a framework in which you can process tasks in parallel; your engine has to be designed to hand out multiple tasks at the same time while still making sure each task has the data it needs to run... and that's hard to do when you need the results of previous tasks to even BEGIN working on later ones. MMOs are among the worst in the industry for this, because the genre heavily weights bottlenecks toward the game-state side of the engine, which is extremely dependent on server processing throughput, game-state update frequency, network throughput, and the engine's own data management throughput. The GPU isn't even touched until all of this overhead is completed and the game state is used to lay out objects/effects for the rendering pipeline.


No doubt, and I totally agree it isn't just a DX9-to-DX11/12 update; it's engine rewrites as well. Still, I think there would clearly be some benefit to moving to an API with less built-in overhead. The less time the CPU takes to send work to the GPU, the more scheduling time it has to start sending the next batch, right? With that, performance gains.

There's no doubt there would have to be engine-side changes to utilize DX11 (properly), but even the base engine should be able to use it, and by using a less heavyweight API you also let DX11/12 work with modern display driver models rather than the legacy ones that have been around since Windows XP, which just compounds the issue when using DX9.

I guess the thing I am pointing at is the core issue more so than the technical solution to it: if it's an engine rewrite, updating the DX API, etc., why aren't these being actively done? I would imagine the next expansion in October will not be the last, and the expansion after that may not be the last either. How much longer does this problem need to exist before there is real momentum to solve it? We see a game older than GW1 and GW2 do these updates without even needing to (WoW), and we see new games coming out with engines that outperform GW2 by orders of magnitude while still in alpha (Camelot Unchained). What actions are needed to begin some improvement while this game is expected to be around for another 4-5 years? We say this will take years; this game has been out for years. Why put this off when it's a problem facing numerous players every single day?

I would imagine any work done >now< would have the benefit of being reusable by a future major game like GW3. If that doesn't happen, wouldn't we just see GW3 mostly reusing the engine from GW2, which mostly reused the engine from GW1? I am assuming this is purely because rewriting the engine to be more optimal means more money for engine developers, and the net benefit doesn't represent enough dollar signs?


The other elephant in the room that everyone forgets is that ArenaNet wants their game to run decently on potatoes... which it currently does; i.e., I could throw open my 15-year-old Toshiba laptop with Win7 tomorrow, start up GW2, and play it without any problems (if I wanted to; not that I would, but I could). That's the other reason for not making any major changes to the engine: they want it to run on as many computers as possible, even those old dinosaurs.


@Zaklex.6308 said: The other elephant in the room that everyone forgets is that ArenaNet wants their game to run decently on potatoes... which it currently does; i.e., I could throw open my 15-year-old Toshiba laptop with Win7 tomorrow, start up GW2, and play it without any problems (if I wanted to; not that I would, but I could). That's the other reason for not making any major changes to the engine: they want it to run on as many computers as possible, even those old dinosaurs.

While that is true, most games that support DX10/11/12 remain backwards compatible with Grandma's Windows XP machine as well. I thought the 32-bit client was eventually going to be dropped at some point anyhow? It doesn't seem like they are overly concerned about removing support for extremely old systems, and I would hope that isn't a large priority for them (I bet if they surveyed their player base's system stats, those Grandma's-Windows-XP systems would be a very small share).


@Zaklex.6308 said: The other elephant in the room that everyone forgets is that ArenaNet wants their game to run decently on potatoes... which it currently does; i.e., I could throw open my 15-year-old Toshiba laptop with Win7 tomorrow, start up GW2, and play it without any problems (if I wanted to; not that I would, but I could). That's the other reason for not making any major changes to the engine: they want it to run on as many computers as possible, even those old dinosaurs.

That strikes me as mostly irrelevant. People who are stuck on single-core CPUs or DX9 video cards and yet still feel inclined to play GW2 at minimum settings would be a tiny minority of the player base, and likely contribute little to Anet's revenue streams. The kind of engine technology being discussed here is so old that nearly every hardware profile presently in use, even machines long relegated to potato status, would stand to benefit from a software upgrade.


@Leablo.2651 said: That strikes me as mostly irrelevant. People who are stuck on single-core CPUs or DX9 video cards and yet still feel inclined to play GW2 at minimum settings would be a tiny minority of the player base, and likely contribute little to Anet's revenue streams.

You'd be surprised at how many people are still running XP.

@Leablo.2651 said: The kind of engine technology being discussed here is so old that nearly every hardware profile presently in use, even machines long relegated to potato status, would stand to benefit from a software upgrade.

Hardly anyone is saying there's no benefit. The question is: how much benefit, and at what cost? ANet (who is more likely to know than any of us) says limited benefit and major cost, which usually results in a no-brainer business decision to put it off.


@Illconceived Was Na.9781 said:

@Leablo.2651 said: That strikes me as mostly irrelevant. People who are stuck on single-core CPUs or DX9 video cards and yet still feel inclined to play GW2 at minimum settings would be a tiny minority of the player base, and likely contribute little to Anet's revenue streams.

@Illconceived Was Na.9781 said: You'd be surprised at how many people are still running XP.

It's 0.2% according to the Steam survey. If you have a better source for gaming profiles, feel free to share it.

@Leablo.2651 said: The kind of engine technology being discussed here is so old that nearly every hardware profile presently in use, even machines long relegated to potato status, would stand to benefit from a software upgrade.

@Illconceived Was Na.9781 said: Hardly anyone is saying there's no benefit. The question is: how much benefit, and at what cost? ANet (who is more likely to know than any of us) says limited benefit and major cost, which usually results in a no-brainer business decision to put it off.

No, that wasn't the question.


@Leablo.2651 said:

@Leablo.2651 said: The kind of engine technology being discussed here is so old that nearly every hardware profile presently in use, even machines long relegated to potato status, would stand to benefit from a software upgrade.

@Illconceived Was Na.9781 said: Hardly anyone is saying there's no benefit. The question is: how much benefit, and at what cost? ANet (who is more likely to know than any of us) says limited benefit and major cost, which usually results in a no-brainer business decision to put it off.

No, that wasn't the question.

What question do you think a business like ANet should be asking, then? As customers, we reasonably expect games to keep up with new tech to run smoother, prettier, faster. But a business has to ask how to pay for it.


@Towelie.9504 said:

@Zaklex.6308 said: The other elephant in the room that everyone forgets is that ArenaNet wants their game to run decently on potatoes... which it currently does; i.e., I could throw open my 15-year-old Toshiba laptop with Win7 tomorrow, start up GW2, and play it without any problems (if I wanted to; not that I would, but I could). That's the other reason for not making any major changes to the engine: they want it to run on as many computers as possible, even those old dinosaurs.

While that is true, most games that support DX10/11/12 remain backwards compatible with Grandma's Windows XP machine as well. I thought the 32-bit client was eventually going to be dropped at some point anyhow? It doesn't seem like they are overly concerned about removing support for extremely old systems, and I would hope that isn't a large priority for them (I bet if they surveyed their player base's system stats, those Grandma's-Windows-XP systems would be a very small share).

I thought they stopped supporting the 32-bit client a year or so ago; do you have any other info?


@Illconceived Was Na.9781 said:

@Leablo.2651 said: That strikes me as mostly irrelevant. People who are stuck on single-core CPUs or DX9 video cards and yet still feel inclined to play GW2 at minimum settings would be a tiny minority of the player base, and likely contribute little to Anet's revenue streams.

You'd be surprised at how many people are still running XP.

@Leablo.2651 said: The kind of engine technology being discussed here is so old that nearly every hardware profile presently in use, even machines long relegated to potato status, would stand to benefit from a software upgrade.

Hardly anyone is saying there's no benefit. The question is: how much benefit, and at what cost? ANet (who is more likely to know than any of us) says limited benefit and major cost, which usually results in a no-brainer business decision to put it off.

Guild Wars 2 doesn't officially support Windows XP (or Vista) anymore, nor single-core CPUs. They changed the minimum system requirements with Path of Fire:

Windows 7 (64-bit only) or later
Intel Core 2 Quad 2.4 GHz, Core i3, AMD Athlon 64 X2
NVIDIA GeForce 8800GTS, AMD Radeon HD 2900 XT
4 GB RAM
50 GB available HDD space
Broadband Internet connection
Keyboard and mouse


@Zaklex.6308 said: The other elephant in the room that everyone forgets is that ArenaNet wants their game to run decently on potatoes... which it currently does; i.e., I could throw open my 15-year-old Toshiba laptop with Win7 tomorrow, start up GW2, and play it without any problems (if I wanted to; not that I would, but I could). That's the other reason for not making any major changes to the engine: they want it to run on as many computers as possible, even those old dinosaurs.

Keep in mind that the lowest-requirement cards, the NVIDIA GeForce 8800GTS and the AMD Radeon HD 2900 XT, are both DirectX 10 cards released in 2006/2007. So the game effectively no longer supports a DirectX 9 GPU. Obviously the DX10 cards are also much faster in DX9 games than previous-gen cards, but it does look odd to support only DX10+ cards while shipping a DX9 game.


@"Towelie.9504" said:No doubt and I totally agree it isn't just a DX9 to DX11/12 update, it's engine re-writes as well though I think there would be clearly some benefits of going to an API with less overhead built in. Less time it takes for the CPU to send junk to the GPU means more scheduling time to begin on sending more junk to the GPU right? With that, performance gains.

It's no doubt there would have to be changes on the engine side as well to utilize DX11 (properly) but even the base engine should be able to utilize it, and by utilizing a less beefy API you're also letting that DX11/12 to utilize something than old legacy display drivers that have been around since Windows XP which just compounds the issue when using DX9.

I guess the thing that I am pointing at is the core issue moreso than the technical solution to the core issue: If it's an engine rewrite, updating the DX api, etc why aren't these being actively done? I would imagine the next expansion in October will not be the last, and the expansion after that may not be the last as well. How much longer does this problem need to exist before it actually has some inertia to begin to solve it? We see a game older than GW1 and GW2 do these updates and it doesn't even need to (WoW) and we see new games coming out with engines that outperform GW2 by orders of magnitude with only an alpha engine (Camelot Unchained). What are the actions needed to begin some improvement while this game is expected to still be around for another 4-5 years? We say this will take years, this game has been out for years. Why hold off this stuff when it's a problem that is facing numerous players every single day?

I would imagine any work done >now< would have benefits of being reused by a future major game like GW3. If that doesn't happen wouldn't we just be seeing GW3 re-using the same engine mostly from GW2 which was re-using the same engine mostly from GW1? I am assuming this is purely because re-writing the engine to be more optimal means more money needed for engine developers and the net benefit of doing this doesn't represent enough dollar signs?

I think I failed to convey how big a refactoring project changing engine architecture is... It's essentially building a whole new graphics engine and pipeline process from scratch: porting the entire main thread from a largely sequential ordering/assumption model to a thread-safe model (which adds a LOT of overhead in code behavior), reorganizing the data structures to support multi-access and protection flagging, and whatever else is needed to improve the coders' memory management standards to avoid snafus. A toy illustration of what "thread-safe" costs is sketched below.
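
A hedged sketch of what that port means in code; the names are hypothetical, not from GW2's engine. State that sequential code could touch directly now pays for synchronization on every access:

```cpp
#include <cstdint>
#include <mutex>
#include <unordered_map>

// Hypothetical game-state record; stands in for whatever the engine tracks.
struct CharacterState {
    float   x = 0, y = 0, z = 0;
    int32_t health = 100;
};

class ThreadSafeCharacterTable {
public:
    // Every access now pays for a lock (and risks contention stalls),
    // where the old single-threaded code was a bare map lookup.
    void setHealth(uint64_t id, int32_t hp) {
        std::lock_guard<std::mutex> guard(mutex_);
        table_[id].health = hp;
    }
    int32_t getHealth(uint64_t id) {
        std::lock_guard<std::mutex> guard(mutex_);
        auto it = table_.find(id);
        return it != table_.end() ? it->second.health : 0;
    }
private:
    std::mutex mutex_;
    std::unordered_map<uint64_t, CharacterState> table_;
};
```

Multiply that by every structure the main thread touches, and you get a sense of the scope.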

Doing that to a live system, while maintaining backwards compatibility with existing code until it can be properly phased out, is borderline idiotic unless the environment absolutely demands it. Normally an engine redesign of that magnitude would be done on a new title, specifically to avoid having to support legacy code and the nightmare scenario that typically comes with it. If they were to start something like this, it would be for GW3. The only reason WoW bothers to update its engine in a live environment is that their current business model doesn't want to risk the fallout of trying to migrate its player base to "World of Warcraft 2". (And for the record, that takes a whole reddit novel's worth of explanation to put into proper context.) And in case I wasn't clear: they're trying to extend that game's life span to AVOID making a sequel, even though they have just about every other logical reason to do so.

Another mistake you're making is comparing against an alpha engine across a gap of two or more generations, while ignoring what makes those engines different, their workloads, and the amount of legacy the older engines are still carrying. In-development engines are also not good litmus tests for final products, since most games lose performance as development proceeds and they get loaded down with the actual "game". I'm not being nitpicky about this either; it's an apples-and-hand-grenades comparison when you're trying to use it as an example of what something else should be like.

To make a construction analogy: if you can recycle a building and do minor renovations, that is the cheaper, faster option. But if the building is just unsuitable for your needs, you have two options:

  1. Try to force it to fit your needs by retrofitting while maintaining the existing infrastructure. This process gets more expensive and delicate the more foundational the changes needed to meet your requirements. It's also less efficient, because there will always be something you have to keep, since it's inherent or crucial to the existing building, and thus best left in place but unused rather than risk tampering with it.
  2. Demolish the building and build a new one. Also expensive and time-consuming, but often less so than the above, and far safer as long as you can afford the investment. It also lets you correct the design flaws of the first building, or use an entirely new design altogether.

So it's not "upgrading" the graphics engine and main thread to be better... it's more like building a new graphics pipeline and main thread, and then porting GW2 into it.

