
Alright, maybe it is time for an engine upgrade.


Lawtider.9024


@Astralporing.1957 said:

@Wuffy.9732 said: all anet needs to do is rebuild the graphics library and port/load everything with Dx12.

Needs to do towards what goal? Remember, according to what we've heard from Anet before, the main bottleneck is outside the render thread.

That depends on the situation, as d912pxy proves that such a change can have a very positive impact on performance. When the bottleneck is indeed outside the render thread then moving to DirectX 12 won't help, just like moving to DirectX 12 doesn't help World of Warcraft in various situations. However, giving up on performance improvements that can indeed help the average performance of the game, because they wouldn't help in the worst possible situations, is really puzzling.


@Ashen.2907 said:

@"Astralporing.1957" said:

I have 5-year-old hardware and overall am running it at 30+ fps at max details. Easily.

4 years old for me but otherwise, same.

Just out of curiosity, are you satisfied with that? Wouldn't it be better if you could run 60+ FPS easily?

BTW, i'm not making a point or anything, just curious if people are "fine" with such FPS or what.


@Veprovina.4876 said:

@"Astralporing.1957" said:

I have 5-year-old hardware and overall am running it at 30+ fps at max details. Easily.

4 years old for me but otherwise, same.

Just out of curiosity, are you satisfied with that? Wouldn't it be better if you could run 60+ FPS easily?

BTW, i'm not making a point or anything, just curious if people are "fine" with such FPS or what.

By saying i am running the game at 30+ fps at max details easily, i meant that i generally have no problem with graphics whatsoever.

@maddoctor.2738 said:

@Wuffy.9732 said: all anet needs to do is rebuild the graphics library and port/load everything with Dx12.

Needs to do towards what goal? Remember, according to what we've heard from Anet before, the main bottleneck is outside the render thread.

That depends on the situation, as d912pxy proves that such a change can have a very positive impact on performance.

I don't think that 5 fps more on the low end and ~10 fps more on the high end (in places where it even helps, because it doesn't help everywhere) can be called a very positive impact. Those improvements are too small to notice with the naked eye; you can only see them by using some fps meter and seeing the actual numbers. Sure, it is better than without it, but i wouldn't call it a major improvement. Now, doing something about the main thread or server-client communication would help a lot.

@Astralporing.1957 said:

I have 5-year-old hardware and overall am running it at 30+ fps at max details. Easily.

4 years old for me but otherwise, same.

Just out of curiosity, are you satisfied with that? Wouldn't it be better if you could run 60+ FPS easily?

BTW, i'm not making a point or anything, just curious if people are "fine" with such FPS or what.

By saying i am running the game at 30+ fps at max details easily, i meant that i generally have no problem with graphics whatsoever.

Ah, you mean 30 FPS is the minimum it drops to? Or is it more like a stable 30 FPS?


@Veprovina.4876 said:

@"Astralporing.1957" said:

I have 5-year-old hardware and overall am running it at 30+ fps at max details. Easily.

4 years old for me but otherwise, same.

Just out of curiosity, are you satisfied with that? Wouldn't it be better if you could run 60+ FPS easily?

BTW, i'm not making a point or anything, just curious if people are "fine" with such FPS or what.

30+, works for me. I guess the question I would ask in return is, how much money would you spend to add 10, 20, or 30 FPS in GW2?

edit: and 30 FPS is the minimum that I see.


@Ashen.2907 said:

@"Astralporing.1957" said:

I have 5-year-old hardware and overall am running it at 30+ fps at max details. Easily.

4 years old for me but otherwise, same.

Just out of curiosity, are you satisfied with that? Wouldn't it be better if you could run 60+ FPS easily?

BTW, i'm not making a point or anything, just curious if people are "fine" with such FPS or what.

30+, works for me. I guess the question I would ask in return is, how much money would you spend to add 10, 20, or 30 FPS in GW2?

edit: and 30 FPS is the minimum that I see.

I see, thanks for responding.

I don't think the question is how much "i" would spend, but rather what Anet can do for all players. Because, according to this thread, no matter how much i spend, it'll never be stable on any hardware.

I'm also kinda ok with 30-ish FPS, but that doesn't mean i wouldn't like the game to be more optimised so i can get more. And for a while i did: they did some optimisation upgrades and WvW was running "smoothly" at medium at around 25 FPS. Usually it runs at 15 FPS or lower for me on all low.

So if optimisations would bring my FPS to a stable 30, that means other players with better hardware would start seeing a stable 60+ FPS without dips.

I think that's a good goal for Anet to work towards for future-proofing the game. But anyway, i'm rambling, i got my answer, thanks for responding. :smile: Cheers!


@Blocki.4931 said: Literally just don't try to run it at max settings. That's all that is required. It's perfectly playable, it works.

Man, it's almost 2021; everyone should be able to run this game at max settings with no issues. The game is from 2012, and the engine is older. There are no excuses. Anet really needs to do something about it.


ArenaNet should follow the path other successful MMOs have taken and have two executable files, one that runs the old DirectX version and one that uses DirectX 11 or 12.

Making DirectX 9, which is a ~20-year-old technology, the ONLY choice has already started turning off a lot of would-be customers.

Right now MMOs running DirectX 11 or 12 are seeing 3x the FPS of GW2 on the same hardware.


@Blocki.4931 said: Literally just don't try to run it at max settings. That's all that is required. It's perfectly playable, it works.

If you use the d3d12 wrapper, you can easily max out almost every setting if your rig is decent enough (read: not a 2011 potato). Any 4-core CPU at 3+ GHz should do.

Player count and higher shadow settings are a very odd exception. These are usable as long as you're not in a map meta, or smack in the middle of a holiday event.


@XenoSpyro.1780 said:

@Blocki.4931 said: Literally just don't try to run it at max settings. That's all that is required. It's perfectly playable, it works.

If you use the d3d12 wrapper, you can easily max out almost every setting if your rig is decent enough (read: not a 2011 potato). Any 4-core CPU at 3+ GHz should do.

Player count and higher shadow settings are a very odd exception. These are usable as long as you're not in a map meta, or smack in the middle of a holiday event.

Those are basically the only settings that make any difference normally, without the wrapper. And it's hardly surprising. Take the average bling a player wears and do it 50-100x. Even if each additional player only reduced fps by 1, well... And regarding shadows, I don't even know a single game that does dynamic shadowing without taking fps hits, regardless of it being a DX9 title in 2010 or a DX12 title in 2020. I run that on blob (maintains character depth over terrain while having negligible fps impact).

I remember toying with the model limit during the Dragonfall event, with the cinematic flyby over the end cliff where all the players teleport. Having it high was choking the fps to death. Reducing it to medium made it smooth without any particular visual difference.


@XenoSpyro.1780 said:

@"Blocki.4931" said:Literally just don't try to run it at max settings.That's all that is required. It's perfectly playable, it works.If you use the d3d12 wrapper, you can easily max out almost every setting if your rig is decent enough (read: not a 2011 potato). Any 4 core with 3+ ghz should do.

Player count and higher shadow settings are a very odd exception. These are usable as long as you're not in a map meta, or smack in the middle of a holiday event.

Even at its inception, the d912pxy benchmarks suggest that some things are still heavily biased towards the CPU.

1080ti + Ryzen 7 2700x (2 generations old)
  • Svanir Shaman core map meta boss: 21.5-32.2 min (d912pxy), average 31.1-49.3 (d912pxy), 1% low 21.6-29.8 (d912pxy)
  • Fire Elemental core map meta boss: 27.3-31.5 min (d912pxy), average 30.1-35 (d912pxy), 1% low 22.9-25.1 (d912pxy)
  • Lion's Arch FPS with dx9 and d912pxy: 29.5-31.2 min (d912pxy), average 52.8-60.2, 1% low 26.6-27.9 (d912pxy)
  • Urban Battleground fractal, 5 player limit: 37-42.2 min (d912pxy), average 63 FPS, 1% low 34.6-38.8 (d912pxy)
  • Volcanic fractal: 56.6-60.2 min (d912pxy), average 81.9-110 (d912pxy), 1% low 56.9-63.1 (d912pxy)

This user was recording in WvW on EBG and seems to hit ~26 FPS or so as a minimum with the DX12 mod and Windows optimizations.

i7-8700K (6 core / 12 thread) @ 5.2GHz, RX 5700 XT. Note that for AMD GPUs, DXVK has proven to be more reliable.

Also note that they have the model limit on medium, which is generally a bad idea for WvW.


Is it possible that the framerate drops when there are lots of players on the screen at the same time just because that would save lots of traffic? Thinking about how much bandwidth would be required if it was trying to deliver 60+ fps to all 80 players. Or does it work like that at all?

Also thinking: if new hardware is built to use dx12, would an older driver or dx version not know how to use the full power of the hardware?

Just throwing out some thoughts, however crazy (and maybe very wrong) they may sound.


@Lucio.4190 said: Is it possible that the framerate drops when there are lots of players on the screen at the same time just because that would save lots of traffic? Thinking about how much bandwidth would be required if it was trying to deliver 60+ fps to all 80 players. Or does it work like that at all?

Also thinking: if new hardware is built to use dx12, would an older driver or dx version not know how to use the full power of the hardware?

Just throwing out some thoughts, however crazy (and maybe very wrong) they may sound.

There's definitely something going on on their end, server side. When they started optimising something, whatever it was they did, it made the game run smoother for me. It also affected lag. And i noticed some correlation between FPS drops and lag as well, but honestly it could have been just in my head. No idea.


@"Lucio.4190" said:Is it possible that the framerate drops when there's lot of players on the screen at the same time, just because that would save lots of traffic? Thinking at how much bandwidth would be required if it was trying to deliver 60+ fps to all 80 players?Or does it work like that at all?

Also thinking if new hardware is build to use dx12, when an older driver or dx is used that wouldn't know how to use the full power of the hardware?

Just throwing out some thoughts, how crazy (and maybe very wrong) they may sound.

Network lag causes rubber-banding (if your settings menu shows spikes of 250+ms, that's a quarter of a second, on par with a fast weapon skill). Network connectivity is just part of it, and I would assume the code is tuned to update in intervals (a server-based "tick": for Overwatch or CS:GO it's 60 ticks/second, Fortnite is 30 ticks/second, Apex Legends is 20 ticks/second) such that any positioning data in between is interpolated. Common network code will dictate which objects are in draw distance. At 60 FPS you have 16ms per frame and at 30 FPS 32ms per frame, not accounting for input lag.
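Purely as an illustration of that interpolation idea (a minimal sketch, not ArenaNet's actual netcode; the tick interval and the linear blend are assumptions), the per-frame math looks something like this:

    #include <algorithm>

    struct Vec3 { float x, y, z; };

    // Blend between the last two position snapshots received from the server.
    // tickIntervalMs: assumed server update interval (e.g. 40ms for 25 ticks/s).
    // elapsedMs: time since the newest snapshot arrived.
    Vec3 interpolate(const Vec3& prev, const Vec3& curr, float elapsedMs, float tickIntervalMs)
    {
        float t = std::clamp(elapsedMs / tickIntervalMs, 0.0f, 1.0f); // 0 = old snapshot, 1 = new one
        return { prev.x + (curr.x - prev.x) * t,
                 prev.y + (curr.y - prev.y) * t,
                 prev.z + (curr.z - prev.z) * t };
    }

    // A 60 FPS client (16ms frames) calls this every frame, so one pair of snapshots
    // from a 25 ticks/s server gets smoothed across roughly 2-3 rendered frames.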

The only GPUs with outright DX9 issues were Navi-based AMD Radeons; people were using DXVK and d912pxy to get around that. Everything from the "Kepler" GTX 600 series and up should support DX12, even if it isn't the full feature set.

If you have a desktop Intel i5 quad core or better from 2012 onwards (all AMD Ryzens included), frame rates should be at least double digits: typically above the mid 20s on unoptimized world bosses and/or 50v50 WvW on EBG, and 40+ minimums elsewhere.

If you don't turn on standard enemy models in WvW, then you are likely to run into "asset load" setbacks, such as having to fetch data on appearance (including race, gender, backpacks), class, and equipped weapon sets.


@"Lucio.4190" said:Is it possible that the framerate drops when there's lot of players on the screen at the same time, just because that would save lots of traffic? Thinking at how much bandwidth would be required if it was trying to deliver 60+ fps to all 80 players?Or does it work like that at all?

Also thinking if new hardware is build to use dx12, when an older driver or dx is used that wouldn't know how to use the full power of the hardware?

Just throwing out some thoughts, how crazy (and maybe very wrong) they may sound.Yes, but most definetly no. Bandwidth has little do with it compared to actually rendering it.

Now, I'm still a noob graphics programmer (I mean I made an OpenGL clock once that accidentally ran backwards :() but here is the thing: each of those 80 players you see can wear anything. You've got 6 different armor slots the game has to put together, and on top of that any other character effects and custom looks. Then, once you have all that, which will bring down fps quite a lot... you still have a metric ton of dynamic effects to handle. Because each of those 80 players can do anything on screen. 80 players in combat with each other will be much, much heavier to render than 80 people standing picking their navels. That's why boss fights are a mess.

"Normal" games usually get around the first part because modern graphics cards are very good at instancing geometry, ie duplicating it. You can have a singleplayer game with 100 enemies on screen that is "fast" relatively speaking... because its only 3 or so unique models. Same way DICE showcased thousands of high polygon soldiers running on screen long ago in a tech demo... they where identical.

That said, interestingly enough, there is something weird going on in the graphics pipeline; I've seen it in the past (I don't do so much PvE now). Boss fights could chug along at 25 fps and then suddenly, out of the blue with little change in visible player activity, jump to a smoother 50+... only to drop down again like 5s later as the fight continued.


Thanks for the explanation everyone. That makes sense. I also thought that all that graphics information is sent in 720p or 1080p with all the bling over (hopefully) 100+ Mbps to all other players, synchronized with the ANet server that keeps the communication with everyone.

Is fps always related to the ms? I might have got this wrong, but I thought the ms is how fast the server responds to your computer, and that you can still have a slow connection (or an overloaded network).

What if some of the players have a graphics adapter or an internet connection that can't handle all the information but still participate in the fight? Their computers still need to synchronize with everyone? I was thinking about old times, when an underpowered computer could cause lag for everyone. As if the worst connection decides the quality for everyone? Is that a possibility?


@Lucio.4190 said: Thanks for the explanation everyone. That makes sense. I also thought that all that graphics information is sent in 720p or 1080p with all the bling over (hopefully) 100+ Mbps to all other players, synchronized with the ANet server that keeps the communication with everyone.

Is fps always related to the ms? I might have got this wrong, but I thought the ms is how fast the server responds to your computer, and that you can still have a slow connection (or an overloaded network).

What if some of the players have a graphics adapter or an internet connection that can't handle all the information but still participate in the fight? Their computers still need to synchronize with everyone? I was thinking about old times, when an underpowered computer could cause lag for everyone. As if the worst connection decides the quality for everyone? Is that a possibility?

That hasn't been an issue since the very early 90s.


@phokus.8934 said:

@Lucio.4190 said: Thanks for the explanation everyone. That makes sense. I also thought that all that graphics information is sent in 720p or 1080p with all the bling over (hopefully) 100+ Mbps to all other players, synchronized with the ANet server that keeps the communication with everyone.

Is fps always related to the ms? I might have got this wrong, but I thought the ms is how fast the server responds to your computer, and that you can still have a slow connection (or an overloaded network).

What if some of the players have a graphics adapter or an internet connection that can't handle all the information but still participate in the fight? Their computers still need to synchronize with everyone? I was thinking about old times, when an underpowered computer could cause lag for everyone. As if the worst connection decides the quality for everyone? Is that a possibility?

That hasn't been an issue since the very early 90s.

IPX/SPX and Token Ring... good times. Not sure the early 90s had a real MMO. I believe it was Battle.net where online games started to grow, and EverQuest was late 90s, but I shouldn't go into details.

I tried to find a new angle on the problem, to see if there's more to it than just the engine. But I do agree that the engine will need optimizing, no matter what. I don't know how graphics information is sent over the network, or how it is compressed and shared with all clients.

Thinking about how @Dawdler.8521 explained that information is usually duplicated to save resources: in this case, all players have their own unique gear. That is a lot of information that can't be duplicated, and every player acts with their own pattern. This is where it gets complicated for me, how the graphics adapter handles all the unique information and how it's used to minimize the load on the network. I wonder if there's a difference from how video streaming works, where the picture drops quality to keep the flow? Or maybe that doesn't matter as long as the connection is more than 2 Mbps?


@"Lucio.4190" said:Thanks for the explanation everyone. That makes sense.I also thought that all that graphics information is sent in 720p or 1080p with all the bling over (hopefully) 100+ Mbps to all other players, synchronized with the ANet server that will keep the communication with everyone.

Is fps always related to the ms? I might have got this wrong, but I thought the ms is how fast the server responds to your computer and that you can still have a slow connection (or an overloaded network)?

What if some of the players have a graphics adapter or an internet connection that can't handle all the information, but still participates in the fight? That computer still needs to synchronize with everyone?Was thinking about old times, when an under dimensioned computer could cause a lag for everyone. As if the worst connection decides the quality for everyone?Is that a possibility?The answer was yes to the amount of players visually rendered, no to it having to do with the network. You keep mixing the two.

In very basic terms, the network only sends simple things, such as that a player has a helmet with id 233, with color id 82, and it's set to visible. The client then has to actually render that, which is complicated math, graphics card pipeline geometry mumbo jumbo magic and whatnot. There is a reason why DX9 is a programming layer on top of the GPU code. You wouldn't exactly want to code apps in assembler either; we've got C++ for that.
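Something along these lines, purely hypothetical (the struct, field names and sizes are made up for illustration, not GW2's real wire format):

    #include <cstdint>

    // A gear-update message like this is only a handful of bytes on the wire...
    struct EquipUpdate {
        uint32_t playerId; // which character changed
        uint8_t  slot;     // e.g. 0 = helmet
        uint16_t itemId;   // e.g. 233
        uint16_t dyeId;    // e.g. 82
        bool     visible;  // helmet toggled on or off
    };

    // ...while the client has to turn itemId into meshes and textures, skin them to
    // the character's skeleton and push it all through the render pipeline every frame.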

720p vs 1080p has to do with how many pixels need to be rendered (i.e. the size of the framebuffer) and is nearly irrelevant unless you go 4-8K. VRAM on the card is important for that. We haven't been limited by it for a looong time, no 16MB Voodoo cards anymore lol. It has nothing to do with the network; you'd need to send exactly as much data on players regardless of resolution.
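Rough numbers, assuming a plain 32-bit color buffer and nothing else (no MSAA, depth or extra render targets):

    #include <cstdio>
    #include <utility>

    int main()
    {
        const int bytesPerPixel = 4; // RGBA8
        // Color buffer alone: 1280x720 ~ 3.5 MB, 1920x1080 ~ 7.9 MB, 3840x2160 ~ 31.6 MB.
        for (auto [w, h] : { std::pair{1280, 720}, std::pair{1920, 1080}, std::pair{3840, 2160} })
            std::printf("%dx%d -> %.1f MB\n", w, h, w * h * bytesPerPixel / (1024.0 * 1024.0));
        return 0;
    }

Even at 4K that's a few tens of MB against gigabytes of VRAM on a modern card, which is the point above: resolution barely matters, and it changes nothing about what goes over the network.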

FPS has nothing to do with ms (unless it's ms per frame, but I assume you meant ping).

Regarding the "underpowered computer", that's lag alone. If your "fast" client isn't getting data from the server fast enough because another client isn't sending data fast enough, you get network lag, not low fps. The server would probably prioritize the critical datasets (like location, player skills, etc.). GW2 has ways around it, like using standard models and client-side prediction.

Either way, just musing randomly. Only Anet knows the details of the engine.


@Dawdler.8521 said:

@Lucio.4190 said: Thanks for the explanation everyone. That makes sense. I also thought that all that graphics information is sent in 720p or 1080p with all the bling over (hopefully) 100+ Mbps to all other players, synchronized with the ANet server that keeps the communication with everyone.

Is fps always related to the ms? I might have got this wrong, but I thought the ms is how fast the server responds to your computer, and that you can still have a slow connection (or an overloaded network).

What if some of the players have a graphics adapter or an internet connection that can't handle all the information but still participate in the fight? Their computers still need to synchronize with everyone? I was thinking about old times, when an underpowered computer could cause lag for everyone. As if the worst connection decides the quality for everyone? Is that a possibility?

The answer was yes to the amount of players visually rendered, no to it having anything to do with the network. You keep mixing the two.

In very basic terms, the network only sends simple things, such as that a player has a helmet with id 233, with color id 82, and it's set to visible. The client then has to actually render that, which is complicated math, graphics card pipeline geometry mumbo jumbo magic and whatnot. There is a reason why DX9 is a programming layer on top of the GPU code. You wouldn't exactly want to code apps in assembler either; we've got C++ for that.

720p vs 1080p has to do with how many pixels need to be rendered (i.e. the size of the framebuffer) and is nearly irrelevant unless you go 4-8K. VRAM on the card is important for that. We haven't been limited by it for a looong time, no 16MB Voodoo cards anymore lol. It has nothing to do with the network; you'd need to send exactly as much data on players regardless of resolution.

FPS has nothing to do with ms (unless it's ms per frame, but I assume you meant ping).

Thanks again for the explanation and your patience in explaining it. I'm starting to get a better understanding of it now. Sorry for mixing things up.

