Monitor advice


Hi

So my system is currently an older one: i7-3770K, 8 GB RAM, and a GTX 970 OC. I am currently using an older 42" TV at 1080p, 60 Hz.

I have been looking at some new monitors and tvs with higher refresh rates and resolutions.

Since I cannot upgrade my computer ($), I was thinking of upgrading my monitor to a 1440p or 4K 120 Hz one. Is my GTX 970 too slow? Is it worth it? If anyone has a similar setup, any help would be appreciated. With Xmas coming there will surely be sales as well.

The computer is too slow. At this point, a 3770 is 9 years old, and GW2 leans on the CPU more than the graphics card. A 4K monitor has 4 times as many pixels to move about as a 1080p monitor (you could play at 1080p, but then what is the point of buying a 4K monitor?). You should get some tool that shows what frame rate the game is currently running at - I suspect right now you are always below 60 fps, so a faster monitor won't do you any good either.
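
For reference, here are the pixel counts (a quick Python sketch of the arithmetic, nothing game-specific):

```python
# Pixels pushed per frame at common resolutions
res = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

for name, pixels in res.items():
    print(f"{name}: {pixels:,} pixels ({pixels / res['1080p']:.1f}x 1080p)")

# 1080p: 2,073,600 pixels (1.0x 1080p)
# 1440p: 3,686,400 pixels (1.8x 1080p)
# 4K:    8,294,400 pixels (4.0x 1080p)
```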

I play on an i5 2500K / RX 580 rig and my monitor is 3440x1440 ultrawide. I play on medium settings. In an empty space with no detail, I can hit 100 fps. In a city with lots of detail like the Grove, I get about 40 fps. In Lion's Arch during the Halloween event, I was at about 17 to 22 fps. In 3v3, I get a good 40 to 50 in combat; outside of combat I'm at 60 fps. In Conquest 5v5, I get about 20 to 30 fps. IDK about WvW; my guess is I'd get about 5 to 10 fps in a big team fight. Hope this helps you.

The CPU has nothing to do with the resolution, so it is irrelevant to the question of whether your PC can handle 1440p or 2160p.

Your graphics card is too slow for maximum details at 2160p/60 on the newer maps in the game. For 1440p/60 it is fast enough.

Whether your CPU can compute the game logic at 60 fps so it doesn't bottleneck your graphics card is another story. More often than not, it can't.

Nevertheless, 120/144 Hz will improve your gaming experience, as it lowers your input lag even when you play at lower framerates, because your monitor can display a new frame every 8 ms (120 Hz) or 7 ms (144 Hz) instead of every 16 ms. And then there is adaptive sync, which eliminates all the vsync tradeoffs. I play the game on a 144 Hz monitor at 60 fps (frame limiter) with adaptive sync (FreeSync/G-Sync) and it is miles better than playing at 60 fps/60 Hz with vsync. The input lag difference between these two setups in GW2 is about 100 ms. That's huge.
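
To make the numbers concrete, here is the refresh interval at each rate (a minimal Python sketch, just the arithmetic behind the 16/8/7 ms figures above):

```python
# Interval between refresh cycles at common refresh rates
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> a new frame can be shown every {1000 / hz:.1f} ms")

#  60 Hz -> every 16.7 ms
# 120 Hz -> every  8.3 ms
# 144 Hz -> every  6.9 ms
```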

Hmmm. The i7-3770K and 970 combo doesn't have much of a bottleneck. I didn't really think about input lag; 100 ms is a lot. I haven't had the opportunity to use FreeSync or G-Sync, but looking at the math it should be significantly better than 60 fps/60 Hz. It should lessen the GPU load as well. I have used DSR, and 1440p even with supersampling was playable, but the font scaling ruined it for me.

I was thinking this one would help with performance as well as make the game prettier, all for a reasonable price.

https://www.bestbuy.ca/en-ca/product/asus-34-1440p-uwqhd-75hz-4ms-gtg-va-lcd-freesync-gaming-monitor-vp348qgl/14882032

Since you can't upgrade your GPU, I suggest not going for 4K monitors; it won't work well.

But... if you get a cheap 2k one like this one: https://pcmonitors.info/reviews/aoc-q3279vwfd8/

...or similar ones, you can get a clearer experience that's better than the one you have now (since it's 2k), and a cheap solution until you can upgrade your graphics card, at which point you could get a 144 Hz monitor to go with it.

Think of it as an "in-between" monitor if you want something better now but can't afford a whole new system. That way, you can save for a new system and game on a slightly better monitor in the meantime. The upgrade might be marginal, though, so if you can, definitely get a better system first, then a better monitor.

Because on your current graphics card, anything above 75 Hz will be a waste; I doubt you'll be able to get such high frame rates, especially if you go above 1080p.

The input lag, vsync, and FreeSync all get somewhat complicated. If you are able to sustain 60 fps, your lag is around 17 ms (1000 / 60). If your fps is 20, it is only drawing one frame every 50 ms (1000 / 20). FreeSync doesn't really change that - what it changes is that it can draw that frame whenever it is ready, instead of waiting for the next refresh cycle (which worst case at 60 Hz is 17 ms).

I updated from an i7-3770 with a GTX 970 to an i7-9700 with a 1070 a couple years ago - performance is tremendously better. With the i7-3770, big fights would start dropping to <5 fps. Now, while I may get some drops, at least things are still quite playable.

If you want to get a new monitor for other reasons (it doubles as a TV set and you want a 4K TV), then you could certainly go for it and just keep playing GW2 at 1080p. But otherwise, definitely update that 9 year old CPU.
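
To illustrate the difference, here is a rough model in Python (real display timing has more moving parts, so treat the numbers as ballpark):

```python
# Rough model: how long until a finished frame appears on screen
fps = 20                      # game is rendering 20 fps
render_ms = 1000 / fps        # 50 ms to produce one frame
refresh_ms = 1000 / 60        # 16.7 ms per cycle on a fixed 60 Hz display

# Fixed refresh + vsync: a frame that just misses a refresh waits a full cycle
worst_fixed = render_ms + refresh_ms   # ~66.7 ms
# FreeSync/G-Sync: the frame is scanned out as soon as it is ready
adaptive = render_ms                   # 50 ms

print(f"fixed 60 Hz worst case: {worst_fixed:.1f} ms, adaptive sync: {adaptive:.1f} ms")
```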

You probably won't notice details beyond 1080p on your average 20 inch or so monitor anyway. The higher resolutions are for large-format TVs, which normally have very poor pixel density and need higher resolutions to not look blocky.
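
Pixel density is easy to check yourself. A minimal Python sketch using the standard diagonal-based PPI formula (the display sizes below are just examples):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: pixel diagonal divided by the screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'42" 1080p TV:      {ppi(1920, 1080, 42):.0f} PPI')  # ~52, quite coarse
print(f'24" 1080p monitor: {ppi(1920, 1080, 24):.0f} PPI')  # ~92
print(f'42" 4K TV:         {ppi(3840, 2160, 42):.0f} PPI')  # ~105
```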

Try not to buy into gamer jargon rhetoric; most of it has no technical basis, and perception is often a placebo.

If you want better details, use ReShade instead.

Already use ReShade. Gaussian blur can do wonders for old textures :) I am a bit of a graphics nerd :)

I think a new system will be coming in the new year. Now the big question lol: AMD or Intel (Nvidia). The last time I used an AMD card was a 6700 (I think) that I was able to flash the BIOS on to turn it into a 6800. My old X6 CPU was cool, but multithreading wasn't as prevalent as it is now.

Kinda thinking of going AMD this time. Seems the single core performance issues have been rectified. Hopefully the AMD GPU driver software has improved.

@Solvar.7953 said:

> If you are able to sustain 60 fps, your lag is around 17 ms (1000 / 60). [...] FreeSync doesn't really change that - what it changes is that it can draw that frame whenever it is ready, instead of waiting for the next refresh cycle.

I was talking about vsync, and vsync uses a render-ahead queue of a few frames. That's where one part of the input lag comes from.

Additionally, you lower the input lag drastically when you limit the fps below your GPU limit, as this eliminates lag originating in the game's rendering pipeline between the CPU and GPU. At a 100% GPU limit, the GPU cannot keep up with the frame delivery by the CPU, which causes lag. But on the other side, a CPU limit causes bad frame pacing. So if you limit the fps, you get the best of both worlds: neither the CPU nor the GPU is at 100% load, and you get the lowest input lag possible along with great frame pacing.
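
For what it's worth, the core of a frame limiter is tiny. A minimal sketch of the idea in Python (`render_frame` is a hypothetical stand-in for the game's render call, and real limiters busy-wait for better precision than `time.sleep`):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~16.7 ms per frame

def limited_loop(render_frame) -> None:
    """Caps the frame rate so neither CPU nor GPU sits at 100% load,
    keeping the render-ahead queue short and input lag low."""
    while True:
        start = time.perf_counter()
        render_frame()                        # hypothetical render call
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)              # idle out the rest of the budget
```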

The typical input lag increase of triple-buffered vsync at 60 fps/Hz is between 60 and 100 milliseconds, depending on the specific game, and GW2 is at the upper end of that spectrum. Anyone can test it themselves in-game by moving the camera around with vsync on and vsync off. The difference is huge.

And an adaptive sync monitor is basically a must-buy for everyone who wants to arrive at 21st century video gaming standards. A fixed refresh rate monitor causes either tearing and stuttering (vsync off) or lag and stuttering (vsync on). I feel bad for console players at this point - they get an ugly stutter for every single dropped frame due to fixed refresh rate output.

3 months later...

Necroing my old post. I caved and have a shiny new i7-9700 with an RTX 2080 Super.

I am thinking a 2k monitor now, at 144 Hz. Since I have been at 1080p since time immemorial, my kinda big question is: will the 2k resolution help "fix" some of the bad textures in GW2?

Any other advice on ReShade settings and presets would be more than appreciated as well!

A monitor is usually something that survives several PC systems, so it is worth getting a decent one.

Some things to consider:

  1. GW2 will rarely hit the high refresh rate you are looking for. A high-refresh-rate monitor may still be worth the money as future-proofing or if you play FPS games.
  2. IPS and VA panels have good color reproduction and viewing angles. Avoid TN panels if you care about image quality.
  3. Ultrawide screens of 3440x1440 and better are awesome to experience the beautiful landscapes of Tyria.

@"Stormcrow.7513" said:my kinda big question is will the 2k resolution help "fix" some of the bad textures in gw2.The game creates the textures, not the monitor. You can run a game from 1999 in 4K and the textures will still be poor.

Besides that, the texture quality of Guild Wars 2 is not even high enough to use the full potential of a 1080p monitor. The game runs fine on graphics cards with 2 GB of video memory. To put that into perspective:

To get an overall texture quality that exceeds the resolution of a 1080p display, you need a good 6 to 8 GB of video memory. True 4K textures would require 16 GB of VRAM. Not a single game currently has textures that good, so the high resolution only buys you pixel clarity (less flickering of small details and a sharper look for shader effects), not texture quality.
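
The VRAM numbers are easy to sanity-check with rough arithmetic. A Python sketch for uncompressed RGBA textures (real games use block compression, which shrinks these several-fold, so treat it as an upper bound):

```python
def texture_mib(side_px: int, bytes_per_pixel: int = 4) -> float:
    """Size of a square texture in MiB, with ~33% extra for mipmaps."""
    return side_px * side_px * bytes_per_pixel * (4 / 3) / 2**20

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_mib(side):.0f} MiB")

# 1024x1024: ~5 MiB
# 2048x2048: ~21 MiB
# 4096x4096: ~85 MiB  -> a few hundred unique 4K textures fill 16 GB fast
```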

4 weeks later...

Thanks for the responses guys.

GW2 does have some bad textures, yes, but the question was: will those bad textures look better on a 4K display? I know I will probably never get those frames in GW2, so future-proofing is in order. I use supersampling and a mixture of Gaussian blur as well as a few sharpness injectors to give the illusion of finer textures, or pixel clarity I suppose.

Anyone playing the game at 4k currently that has any opinion or advice?

Thanks again all for the comments and help :)

@Stormcrow.7513 said:

> will those bad textures look better on a 4K display? [...] Anyone playing the game at 4K currently that has any opinion or advice?

I've hardly ever seen bad textures in GW2. A high-res monitor like 4K will let you see far better texture detail further away. But you also need a powerful GPU to render the game at 4K at acceptable frame rates; a GTX 970 will struggle at that resolution.
