@Brimstone Jack.3462 said:
1. My apologies. Legitimately not my intention to misgender anyone.
2. If your system runs the game perfectly, why are you looking for performance-boosting adds?
I wasn't. I was mainly curious. It won't run with my current add-ons anyway (one being GW2 Hook).
In crowded places and places that load new assets, there's a HUGE difference.
Ah, okay. How much of an FPS boost are we talking about, approximately? (I am currently getting between 30 and 60 FPS during crowded meta events, above 100 otherwise.)
Well, perhaps the GW2 Hook creator will eventually update their DLL or the D912pxy creator will implement ReShade features.
@Mack.3045 said:
For most users the d912pxy works out of the box after install and some config file tweaks to suit hardware. This would be a 10-15 min initial investment in time.
Good to know, thanks.
Gains depend on individual hardware set-ups.
For me in some areas I net an extra 30-50+ fps, better visuals and a better experience overall.
That's quite a lot! Now I wish it worked with GW2 Hook. Not that I'm ever having slide shows with my current hardware, but the more FPS the better!
Am I getting this right? This tool does not keep the textures/objects permanently, but loads them anew every time I start the game?
And why aren't the textures of my characters in the Character Selection Screen loaded from the beginning? I have to enter a map, then return to the CSS in order to see my characters loaded properly.
You need to edit your config
Read pso cache =1
Save pso cache =1
Over a few games the shader cache will be built and you won't get the pop ins 😉
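To illustrate why the pop-in goes away: a PSO (pipeline state object) cache is basically memoization keyed on the shader/render-state combination, so the expensive compile only ever happens the first time a combination is seen. This is just a conceptual Python sketch, not d912pxy's actual code or config format; the field names are made up.

```python
import hashlib
import time

class PsoCache:
    """Toy model of a pipeline-state-object cache: compile once, reuse forever."""

    def __init__(self):
        self._cache = {}  # key -> "compiled" PSO

    @staticmethod
    def _key(vertex_shader: str, pixel_shader: str, render_state: dict) -> str:
        # Any stable hash of everything that affects pipeline compilation will do.
        blob = vertex_shader + pixel_shader + repr(sorted(render_state.items()))
        return hashlib.sha1(blob.encode()).hexdigest()

    def get(self, vertex_shader, pixel_shader, render_state):
        key = self._key(vertex_shader, pixel_shader, render_state)
        pso = self._cache.get(key)
        if pso is None:
            # First sighting: pay the compile cost (this is the hitch/pop-in you see).
            time.sleep(0.05)  # stand-in for an expensive driver compile
            pso = f"pso-{key[:8]}"
            self._cache[key] = pso
        return pso

cache = PsoCache()
state = {"blend": "alpha", "depth_test": True}
cache.get("vs_terrain", "ps_terrain", state)  # slow the first time
cache.get("vs_terrain", "ps_terrain", state)  # instant afterwards
```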
I see, thanks. Well, I went back to Gw2hook as I cannot play this game in its original blurry state. Why there are no graphical optimization options within the game itself is beyond me.
P.S. If I used the latest ReShade instead of Gw2hook, would those be compatible with each other?
Yes, I recommend you use GShade. Just remember to select DX12 as the API option during the installation process. You may need to copy the dxgi .dll into the GW2 64-bit bin folder for it to work: https://gposers.com/gshade/
I have a question, since d912pxy creates new shaders for a lot of materials (or all?), wouldn't it be possible to "delete" a shader or two, or replace them with "blanks" so we can have a more particle free game? For example, remove the shader that applies the "storm" effect when you are in places like Thunderhead Keep, or the Frozen Maw and others. Those little snowflakes kill performance and since Anet doesn't want to help the game's performance, perhaps the developer of d912pxy can. If we replace the created shader with an empty one (one that does nothing) it should remove the effect from the screen and provide a massive performance boost.
Hi, good question. No, unfortunately not, as all of those effects form part of the game's native code. Megai2 could give you a more succinct answer than me in technical terms if you jump on Discord and chat with him.
Can anyone help me find the UImask.fx for ReShade and explain to me how to use it properly for GW2, since GW2 Hook is completely outdated and causes graphical issues with fog and occasional game crashes with the latest GPU drivers?
Thanks in advance.
I'd jump on the reshade discord channel and check there
So you are telling me that everyone on here who is using ReShade is doing so without a "Skip UI" option? The text labels are so blurry, it's unbearable to the eye.
Can't speak for anyone else, but I'm not using it. Text is fine for me on my 1440p monitor. I use DPX, Adaptive Sharpen and MXAO via GShade, plus the AMD image sharpening filter at 80%.
Now the text is crisp, but the windows and icons are way too sharp and the chat's text colors aren't 100% authentic (as they are affected by my HDR and Bloom settings).
How do I properly use the "Skip UI" effect (UImask.fx) for GShade in GW2? Where do I enter the mask's image name so it will be loaded by the UImask.fx?
@Mack.3045 said:
There is a way to remove bloom using a custom HLSL shader with the d912pxy.
Or you can deactivate the in-game Post-Processing and use GShade's blooming HDR shader instead (which gives you an option to adjust the bloom intensity). The only issue with that is - as mentioned before - that it heavily affects the UI and also loading screens.
@Mack.3045 said:
I checked and unfortunately there is no way to Skip UI at this point.
So there is no way to make use of the existing UImask.fx shader using a custom made image for GW2, because there is nowhere to link to said image within GShade? (I wonder how the creator of GW2hook did it...)
I'd discuss it with the experts on the Gshade discord channel - they are very helpful and they might have a solution !
You mean just directly asking one of the people online? Because last time I looked, they don't have a Discord channel where you can drop questions, nor do they have a forum.
You seem resourceful - I'm sure you'll work it out. There's also the GW2 development community channel; there is a ReShade group there as well.
If you join the d912pxy Discord channel you'll see the ReShade project listed under #PROJECTS.
(Nevermind, I figured out the cause for the glitch I mentioned in this post. It was the remnant of a shader I was no longer using - it somehow still appeared in the .ini, oops!)
Sneaky shaders!
Using Windows 7. When I update to the newest version of d912pxy (Release v2.3.1), and activate it, I get the following error message on game start:
---------------------------
Guild Wars 2: Gw2-64.exe - System Error
---------------------------
The program can't start because api-ms-win-core-libraryloader-l1-2-0.dll is missing from your computer. Try reinstalling the program to fix this problem.
---------------------------
OK
---------------------------
The previous Version v2.3 works fine. Any clues?
I would log the issue on GITHUB or have a chat with Megai on the d912pxy discord channel. He'll know how to fix it
Hi guys. Not sure where I should drop my question, so I'll try here (where else?).
I'm running D912PXY + some other addons through GW2 Addon Manager, as well as GShade. All this works perfectly.
My problem is that I'm unable to game capture or window capture GW2.exe with OBS Studio. The capture screen is blank, then sometimes OBS crashes. The only way is screen capture, which is annoying. Can anyone successfully capture GW2.exe with d912pxy + GShade through OBS?
Hi, that would be an issue with OBS specifically and how it hooks into what's being rendered. I'd jump on the d912pxy discord channel and ask there. Otherwise ask the guys who author the OBS software.
Good luck!
OBS has several conflict issues. The one you are experiencing might just be another one: https://obsproject.com/wiki/Known-Conflicts
@Mack.3045 said:
I'd jump on the reshade discord channel and check there
So you are telling me that everyone on here who is using ReShade is doing so without a "Skip UI" option? The text labels are so blurry, it's unbearable to the eye.
This is a bit late, but I made my own UIMask .png file and replaced the blank one inside the "reshade-shader/textures" folder. You can use a program like GIMP (the GNU Image Manipulation Program) to import a screenshot from the game and then paint over the areas used by your static UI elements in white, and the rest of the image in black (I may have the colours reversed). It was tricky, but with some GIMP tutorials I found online I got it done.
Then, while in game, you set "UIMask-Top" above the active effects you don't want to affect your UI, and "UIMask-Bottom" after said effects.
https://github.com/crosire/reshade-shaders/blob/master/Shaders/UIMask.fx
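If you'd rather generate the mask than paint it by hand, here is a rough Python/Pillow sketch of the same idea. The rectangle coordinates below are made-up placeholders for wherever your own UI elements sit, the screenshot filename is just an example, and whether white or black marks the UI should be checked against the UIMask.fx settings (it can be inverted):

```python
from PIL import Image, ImageDraw

# A screenshot taken at your exact game resolution; the filename is an example.
screenshot = Image.open("gw2_screenshot.png")
width, height = screenshot.size

# Start with an all-black mask (areas the effects are allowed to touch).
mask = Image.new("L", (width, height), 0)
draw = ImageDraw.Draw(mask)

# Paint the static UI regions white so effects skip them.
# These rectangles are hypothetical: measure your own chat box, skill bar, minimap, etc.
ui_regions = [
    (0, height - 220, 520, height),                               # chat box, bottom left
    (width // 2 - 400, height - 110, width // 2 + 400, height),   # skill bar, bottom centre
    (width - 360, 0, width, 360),                                 # minimap, top right
]
for box in ui_regions:
    draw.rectangle(box, fill=255)

# Save it into the textures folder mentioned above, replacing the blank UIMask.png.
mask.save("UIMask.png")
```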
Hello guys. I solved my problem!
OBS uses the BitBlt capture method for game capture and the default window capture. This method doesn't handle hardware-accelerated apps. The same happens when you try to capture Chromium or Discord, for example.
You have to capture the game with window capture and select the Windows Graphics Capture (WGC) method. The downside is that WGC adds a yellow border to your game window (not to the stream, though) and always records the cursor position. This is by design by Windows for security reasons.
So I had this installed before, but it was an old version and I want to update to the newest one. However, the zip file I download doesn't include an installer. How do I install the latest release? Windows Defender didn't remove it either; it just isn't included.
Hello, I just downloaded a fresh zip of 2.4 and unpacked it. The installer file is there. Not sure what is happening on your end!
I don't see the install.exe file in the new 2.4 build either, even before unpacking it (so it's for sure not deleted by Defender).
Edit: NVM - by mistake, I downloaded the source code instead of the release...
Hi all. For anyone wanting to use ReShade with the latest d912pxy 2.4.1 build without issues, I recommend using the latest GShade, available here: https://t.co/40fsp8fjEz?amp=1
Just select the DX12 option when installing. Enjoy!
So the client hooks up to DX9, which hooks up to an interface that hooks up to the DX12 libraries, and from there you do the reverse just for the client to display the graphics?
DX9 is backwards compatible, and it is assumed that old software runs fast on new software, right?
Hi
Good question!
I'll give an in-depth technical explanation to answer your question and explain the mechanisms used to go from DX9 > translation layer > DX12 render pipeline when using the d912pxy with GW2.
d912pxy and the D3D12 translation layer.
What does this do?
This translation layer provides the following high-level constructs or components (and more) for the GW2 engine to implement in the rendering pipeline.
Resource binding
The D3D12 resource binding model is quite different from D3D9 and prior. Rather than having a flat array of resources set on the pipeline which map 1:1 with shader registers, D3D12 takes a more flexible approach which is also closer to modern hardware. The translation layer takes care of figuring out which registers a shader needs, managing root signatures, populating descriptor heaps/tables, and setting up null descriptors for unbound resources.
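As a mental model of that binding difference, here is a purely illustrative Python sketch (not the layer's actual data structures): D3D9-style code binds into flat register slots, and the layer repacks whatever the shader actually reads into a descriptor table, plugging a null descriptor into any slot the game never bound.

```python
NULL_DESCRIPTOR = object()  # stand-in for a D3D12 null descriptor

def build_descriptor_table(shader_registers_used, bound_slots):
    """Repack flat D3D9-style register slots into a contiguous descriptor table.

    shader_registers_used: registers the shader actually reads, e.g. [0, 1, 3]
    bound_slots: dict of register -> resource the game currently has set
    """
    table = []
    for reg in shader_registers_used:
        # Anything the game never bound still needs a valid (null) descriptor in D3D12.
        table.append(bound_slots.get(reg, NULL_DESCRIPTOR))
    return table

# The game set textures on registers 0 and 3; the shader also samples register 1,
# so the middle slot of the table ends up holding the null descriptor.
bound = {0: "diffuse_tex", 3: "lightmap_tex"}
table = build_descriptor_table([0, 1, 3], bound)
```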
Resource renaming
D3D9 and older have a concept of DISCARD CPU access patterns, where the CPU populates a resource, instructs the GPU to read from it, and then immediately populates new contents without waiting for the GPU to read the old ones. This pattern is typically implemented via a pattern called "renaming", where new memory is allocated during the DISCARD operation, and all future references to that resource in the API will point to the new memory rather than the old. The translation layer provides a separation of a resource from its "identity," which enables cheap swapping of the underlying memory of a resource for that of another one without having to recreate views or rebind them. It also provides easy access to rename operations (allocate new memory with the same properties as the current, and swap their identities).
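A rough sketch of the rename idea in Python (again, just a model of the concept, not d912pxy's implementation): the resource object the game holds on to keeps its identity, while a DISCARD swaps in fresh backing memory and the old memory stays alive until the GPU is done with it.

```python
class RenamableBuffer:
    """A resource whose identity is separate from its backing memory."""

    def __init__(self, size):
        self.size = size
        self.backing = bytearray(size)   # memory the GPU is (or will be) reading
        self._in_flight = []             # old allocations the GPU may still reference

    def discard_and_write(self, data):
        # D3D9 DISCARD semantics: don't wait for the GPU, just "rename" the resource.
        self._in_flight.append(self.backing)   # keep old memory alive for the GPU
        self.backing = bytearray(self.size)    # fresh memory becomes the new identity
        self.backing[:len(data)] = data

    def on_gpu_finished(self):
        # Once a fence says the GPU is done, the old allocations can be recycled.
        self._in_flight.clear()

buf = RenamableBuffer(256)
buf.discard_and_write(b"frame 1 vertices")
buf.discard_and_write(b"frame 2 vertices")  # no stall, even if the GPU still reads frame 1
```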
Resource sub-allocation, pooling, and deferred destruction
D3D9-style apps can destroy objects immediately after instructing the GPU to do something with them. D3D12 requires applications to hold on to memory and GPU objects until the GPU has finished accessing them. Additionally, D3D9 apps suffer no penalty from allocating small resources (e.g. 16-byte buffers), where D3D12 apps must recognize that such small allocations are infeasible and should be sub-allocated from larger resources. Furthermore, constantly creating and destroying resources is a common pattern in D3D9, but in D3D12 this can quickly become expensive. The translation layer handles all of these abstractions seamlessly.
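Conceptually (a hypothetical Python sketch, not the real allocator), the two halves look like this: tiny buffers are carved out of one big allocation instead of each being its own object, and anything "destroyed" only actually dies once the GPU fence has passed the frame that last used it.

```python
class UploadArena:
    """Sub-allocate many tiny buffers out of one big allocation."""

    def __init__(self, size=1 << 20):
        self.memory = bytearray(size)
        self.offset = 0

    def alloc(self, nbytes, align=256):
        start = (self.offset + align - 1) // align * align
        assert start + nbytes <= len(self.memory), "arena full"
        self.offset = start + nbytes
        return start  # a 16-byte constant buffer is just an offset, not its own allocation


class DeferredDeleter:
    """Hold 'destroyed' objects until the GPU fence passes the frame that used them."""

    def __init__(self):
        self.pending = []  # (fence_value_when_safe, object)

    def destroy(self, obj, current_fence):
        self.pending.append((current_fence, obj))

    def collect(self, completed_fence):
        self.pending = [(f, o) for (f, o) in self.pending if f > completed_fence]

arena = UploadArena()
cb_offset = arena.alloc(16)                    # D3D9-style tiny constant buffer, sub-allocated
deleter = DeferredDeleter()
deleter.destroy("old_texture", current_fence=42)
deleter.collect(completed_fence=41)            # fence 42 not reached yet -> texture still alive
deleter.collect(completed_fence=42)            # now it is safe to really free it
```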
Batching and threading
Since D3D9 patterns generally require applications to record all graphics commands on a single thread, there are often other CPU cores that are idle. To improve utilization, the translation layer provides a batching layer which can sit on top of the immediate context, moving the majority of work to a second thread so it can be parallelized. It also provides threadpool-based helpers for offloading PSO compilation to worker threads (d912pxy 2.4.1 uses a configurable PSO cache amount plus native DX12 caching at the driver level). Combining these means that compilations can be kicked off at draw time on the application thread, and only the batching thread needs to wait for them to be completed. Meanwhile, other PSO compilations are starting or completing, minimizing the wall-clock time spent compiling shaders.
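A bare-bones Python sketch of that batching idea (illustrative only, not how d912pxy is written): the game thread just records commands and fires off PSO compiles, while a second thread replays the commands and is the only one that ever waits on a compile.

```python
import queue
import threading
from concurrent.futures import ThreadPoolExecutor

commands = queue.Queue()                  # recorded by the game/API thread
compiler = ThreadPoolExecutor(max_workers=4)

def compile_pso(key):
    # Stand-in for an expensive driver-side pipeline compilation.
    return f"compiled-{key}"

def record_draw(pso_key):
    # Called on the game thread: kick off the compile, but never wait for it here.
    future = compiler.submit(compile_pso, pso_key)
    commands.put(("draw", future))

def batch_thread():
    while True:
        cmd, payload = commands.get()
        if cmd == "quit":
            break
        # Only this thread ever blocks on a compile, keeping the game thread free.
        pso = payload.result()
        print("submitting draw with", pso)  # the real layer would issue DX12 calls here

worker = threading.Thread(target=batch_thread, daemon=True)
worker.start()
record_draw("terrain")
record_draw("water")
commands.put(("quit", None))
worker.join()
compiler.shutdown()
```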
Residency management (memory management)
This layer incorporates the open-source residency management library to improve utilization on low-memory systems.
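The residency idea, reduced to a toy Python sketch (the real residency library is considerably more sophisticated): track when each resource was last used and, under memory pressure, evict the least-recently-used ones until the working set fits the budget again.

```python
class ResidencyManager:
    """Toy LRU residency model: evict least-recently-used resources over budget."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = {}   # name -> (size, last_used_frame)

    def make_resident(self, name, size, frame):
        self.resident[name] = (size, frame)
        self._trim()

    def touch(self, name, frame):
        size, _ = self.resident[name]
        self.resident[name] = (size, frame)

    def _trim(self):
        used = sum(size for size, _ in self.resident.values())
        # Evict the coldest resources until we are back under budget.
        for name in sorted(self.resident, key=lambda n: self.resident[n][1]):
            if used <= self.budget:
                break
            size, _ = self.resident.pop(name)
            used -= size

mgr = ResidencyManager(budget_bytes=100)
mgr.make_resident("terrain_tex", 60, frame=1)
mgr.make_resident("character_tex", 50, frame=2)  # over budget -> terrain_tex gets evicted
```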
The other component to consider here is user hardware.
If someone is already CPU/GPU bound in DX9 natively with GW2, then I wouldn't expect the d912pxy to help.
The end user also needs a GPU that is capable of rendering with the DX12 API.
If there are untapped CPU/RAM/GPU/VRAM resources available, then definitely expect an uplift in performance. Ultimately, try it for yourself and see.
I like the part that it goes multicore when rendering.
And by your explanation, there's a lot going on in DX12, and your interface/translation layer handles it all meticulously.
But say the client goes DX9 -> interface/translation layer -> DX12; will the graphics rendering go from DX12 back to the interface/translation layer and finally to the GW2 client?
No, the rendering pipeline is: GW2 client > game .dat > DX9 > d912pxy > DX12 API > your monitor.
That's pretty much the flow that I wanted to see.
Comments
Gshade link for those interested
Compatible out of the box with GW2 and the d912pxy
https://gposers.com/gshade/
Edit: Thank you so much ! GShade solved the problem I had with ReShade (re: blurry UI text).
P.S. Is there a way to deactivate the in-game bloom while Post-Processing (in-game) is active?
Awesome sauce, glad to hear!
There is a way to remove bloom using a custom HLSL shader with the d912pxy. Jump on the d912pxy Discord channel and ask the guys there.
Posted: https://github.com/megai2/d912pxy/issues/318
Megai has patched this fix for you.
https://ci.appveyor.com/project/megai2/d912pxy/build/artifacts
Yes, yes! I already got it installed and running.
Updated original post to reflect pre-release version 2.4 and change notes.
I've been using 2.4.1 since release - works a treat. Thanks for the technical info Mack !
Hi, I want to install this but I'm worried about maybe getting a timeout or possibly banned by using it. Is it safe to use?
Completely safe to use. 👌