Check prices on Amazon below
AMD Ryzen 5800X3D: geni.us/LBUXw
AMD Ryzen 7900X3D: geni.us/wHvf
AMD Ryzen 7950X3D: geni.us/MAJurv
Intel 13600K: geni.us/n81NOb
RTX 4090: geni.us/UEXBx7i
RTX 4080: geni.us/6mmd2Q
RTX 4070 Ti: geni.us/omnm
A deep-dive on PC latency. A quest for the lowest input-lag PC currently possible.
need a new wallpaper? optimum.store
0:00 - Input lag intro
1:34 - GPU choice
4:34 - Nvidia Reflex
6:00 - CPU choice
7:49 - AMD 7950X3D tuning
9:27 - Intel 13900K tuning
10:06 - 4090 vs. 7900 XTX
10:50 - Recommendations
Video gear
Camera: geni.us/5YfMuy
Primary Lens: geni.us/pWnoPBr
/ optimum
/ optimumtechyt
/ only_optimum
As an Amazon Associate I earn from qualifying purchases.
Very interesting indeed; I haven't seen similar analysis elsewhere, and it's great that it includes older hardware, which in reality is what the majority of us are still using. Impressive how the now-old 5800X3D keeps up. Great work, keep it up
LTT recently did a similar "pay to win (low latency) PC vs budget PC" video, but it was actually a bad video. Only Linus played the games, and he didn't focus at all and tried to make fun rather than analyze.
@@ugur3527 I wouldn't take anything Linus says or does seriously.
@@griffin1366 For real, in his recent Sony 1000XM5 headphones review he straight up didn't read the instructions and then complained that the features didn't activate for his incorrect inputs
I think his MW2 numbers are wrong, or at the very least misleading. The 7xxx3D chips are showing 30% faster than the 5800X3D in actual gameplay in MW2 at competitive settings.
"now old".... dude, it's just about 10 months old.💀
Not entirely related to the content of the video but your graphs & visualizations have become among the cleanest and most elegant out of almost all other similar YT channels. Very clean and beautiful to look at, impressive work!
In games like Valorant and CSGO it would also be interesting to see the 0.1% lows. I noticed that when games stutter, sometimes the eye can notice it, but it's too short to be reflected in 1%, but the 0.1% catches it.
Exactly! The 0.1 percent really makes a difference. It's always the situations where the most is happening that decide whether you win or lose.
@@reinulf656 Haha, that reminds me that I never was able to find a conclusive answer about what exact hardware is needed for 2k 240hz FPS gaming to have it completely smooth (basically 0.1% of over 240 fps). Did some quick maths to make sure that my 0.1% is correct: a noticeable micro-stutter (or a FPS drop in general) that lasts one third of a second can happen once every 6 minutes of your gameplay for the 1% low to ignore it. 10 micro stutters per hour might not be too much for some, but I think even if they don't happen in important moments, they still might annoy you and throw you off.
@@_APV_ Exactly, that's what matters most in competitive titles; even Warzone has terrible 0.1% lows with decent 1% lows, and it completely ruins gameplay
@@_APV_ You're gonna need to learn how to manually tune your RAM's subtimings to have a chance at that. RAM latency tuning overwhelmingly benefits the slowest frames: 5% lows a lot, 1% lows even more, and 0.1% lows the most. It is by far the biggest practical benefit you get from overclocking any current-gen part.
@@Frozoken Cool, never heard of that phenomenon, thanks for sharing. You mean even an OC on the CPU and GPU will improve 0.1% lows more than it will the average? I'll play with that when I finally build myself a PC.
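The micro-stutter arithmetic in this thread can be illustrated with a toy model (the 240fps hour and the ten 333ms stutter frames are assumed numbers from the comment above, not measurements):

```python
# Toy model: one hour at a steady 240 fps, except for 10 micro-stutters,
# each modeled as a single ~333 ms frame. Shows why 0.1% lows catch
# stutter that 1% lows average away.
normal_ms = 1000 / 240          # ~4.17 ms normal frame time
stutter_ms = 1000 / 3           # ~333 ms stutter frame
frames = [normal_ms] * (240 * 3600 - 10) + [stutter_ms] * 10

worst = sorted(frames, reverse=True)  # slowest frames first

def low_fps(percent):
    # average of the slowest `percent` of frames, expressed as fps
    n = max(1, int(len(frames) * percent / 100))
    return 1000 / (sum(worst[:n]) / n)

print(f"1%   low: {low_fps(1):6.1f} fps")   # barely moves (~220 fps)
print(f"0.1% low: {low_fps(0.1):6.1f} fps") # clearly drops (~125 fps)
```

Ten short stutters dilute to almost nothing inside the slowest 1% of 864,000 frames, but dominate the slowest 0.1%, which matches what the thread describes.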
Really dope exploration into latency, loved the b roll 🔥
You got all of these specs and your internet is 1000 ms ping 😑
Subscribed. Top tier content creation . Thank you!
Nice video as always ! Thank you !
wonderful work dude, thanks so much!
Nice job. Excellent synopsis. thumbs up
Very informative. Thank you!
So I never comment on videos, but here's my first exception. I really enjoy your stuff; it's well put together and explained. On this vid, I would have liked to know what OS you ran the tests on, as there were some issues with utilizing E-cores and CPU scheduling on Win11 that have perhaps been resolved by now? But more importantly, I feel like the other half of the latency story is frame consistency (frame pacing/timing) and the annoying stuttering it can produce. I would have loved to see some frametime observations between these hardware combinations as well, even if it's just "the combinations make zero difference". But thanks for the content, and keep up the good work!
Thanks for clearing the concepts!
Great work, thank you for all the testing
Another great review, but the comparisons and results are excellent, I also appreciate your efforts, thank you 🥳👋
Awesome video as always, keep up the good work! Also, will you ever review gaming laptops on your channel? Those are becoming really interesting nowadays.
Thank you so much for doing these tests.
Great video as always 🔥
Would be nice to also test the newer OLEDs to verify the input lag claims and compare it to higher refresh rate monitors
Pretty sure it's a load of bs. I saw someone say they measured the lag and got like 3ms and said it was terrible and he could feel it... No human is noticing that
Those claims of 0.03ms aren't input lag, btw, if that's what you're referring to; it's basically just how fast the pixels can change colour, massively reducing ghosting and motion blur.
The monitors aren't as accurate as LDAT / other tools but are within margin of error.
The main issue is that you can have a 40ms input lag difference by switching the GPU. If you go from LCD to OLED you might save 5 to 10ms at most
Hardware Unboxed does test that and the claims are, of course, accurate (on the same refresh rate), but going from 2ms input lag to 0.3ms isn't really something you can feel.
We can thank the YouTuber Battlenonsense for the Nvidia Reflex feature; it was he who found the added latency when the GPU is at 100%, and he also let Nvidia know about it.
He didn't find it, he just had enough pull that NVIDIA actually acknowledged him.
Personally noticed this back in 2014 when I first built and got into pc gaming. Even made multiple forum threads about it, and according to most of the "experts" on there, it was something that was not supposed to happen.
My team knew everything about framebuffers / prerender-limit / flipqueue-size around 2005. We kept it silent.
Great info, great video. Love the content.
good job bro always wondered about this stuff
I love these testing videos. Its something that I would do. Thanks for making these videos
very informative. thank you for your dedicated research!
Thanks for the video! It's fun to see the 5800X3D still kicking ass and taking names with the next generation X3D chips.
Super informative, very good point ! Would be very interesting to include the 7800X3D when it is out.
I love how you always surprise me with interesting videos that I didn’t know I wanted.
great video, nice infos! very compact
You nailed it. I'm a very experienced computer user, and I never activated Reflex or other tricky things just because we can't measure the impact and you never know if you've made it worse than "native". Imagine the 99% of people who don't even know what that setting is.
Best video regarding pc latency to date. Really good stuff!
Very good video :) I have never even thought about this. I would be interested in how much you can improve latency by capping the framerate to lower GPU usage, and at which point it gets you no latency gains due to fewer frames
I think it'd be very interesting if the data was normalized for framerate... maybe capping the fps in game, or doing math with the data, after the fact.
Yeah it's pretty much just seeing the effect framerate has on the input lag which makes using all these different gpu kinda useless outside of looking pretty on a chart.
@@Touma134 It's for comparison with commonly used GPUs. If it just showed "this is how the 4090 and 7900 XTX perform" then that wouldn't mean anything to 99% of people, because there would be no reference point.
Great video. Can you summarize your ram timings for Ryzen?
Thanks for the interesting info there :)
Man I’ve been waiting for your water cooled 4090 formd t1 build. Can’t wait to see that.
typo in timestamps, recommendations. brilliant video as always
having the lowest latency possible is so immersive in simracing it's underrated . at 360km/h every 10ms of lag is 1 meter . great video
Autistic take
@@zappyFPS 😁
Get an OLED; the 240Hz ones from Asus at 1440p 27" have a 0.03ms response time, against IPS at 1.00ms.
@@winnb4968 Lmao, that's just marketing. That 240Hz Asus will be closer to 4-6ms, and the OLED will be below 1ms, somewhere between 0.15-0.5ms.
@@winnb4968 That's not the input lag. But when I'm upgrading my screen setup I'm thinking about 48 or 55 inch OLED TVs; they are 0.1ms GTG
Incredible content, love seeing testing surrounding esports titles that frankly no one else is doing, at least at remotely this kind of quality and thoroughness!
Awesome video. Would be great if you could include some 1080p low settings results also in the future (Ow2!) :)
True
Thank you for that very informational video 👍 👏
One thing I would like to see, that is missing here, is what would happen to your latency if you CAP your FPS slightly below your 1% lows.
I am still learning and would like to know more. Why would you do this? What do you think the results would be?
Similar results to reflex, but marginally more latency because of capped fps. But it would be more consistent
I am glad someone else mentioned this.
Capping fps to slightly less than your average gives you the lowest input lag.
I was wondering this too.
Would be interesting to see any network latency tuning that helps. Disable RSS benchmarks... etc
Thank you for measuring input latency on esport games I get tired of seeing bigger bar better when it’s 100’s of fps. Input lag and 1% lows are the two most important metrics when talking esport gaming
Nice vid bro!
Awesome vid man. ty.
For the record, ULL is disabled when Reflex is enabled in any given title. In case you were thinking they worked together. He kind of made it look that way for some reason.
Aware, just a good idea to have them both enabled when the game doesn’t support reflex 👍
@@optimumtech What would you say is the fastest ram for the 5800x3D?
@@PerfectFitGE8 I have one. I think they run up to about 3800MHz. My kit is 3600MHz, which worked fine until I started using 4 DIMMs and had stability issues, so now I run it at 3400MHz. RAM speed doesn't matter so much on X3D because of the extra cache; anything above 3000 should be fine, I think :)
@@PerfectFitGE8 The G.Skill F4-4000C14D-32GTZN or the cheaper 3600C14 version of it. Set Infinity Fabric and RAM 1:1 to 3733
@@PerfectFitGE8 3600CL14 or 3200CL14/3600CL16 are the best kits. Remember to set the Infinity Fabric to half of the MT/s of the kit.
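The 1:1 rule in these replies is just the kit's MT/s divided by two, since DDR transfers twice per memory-clock cycle (a quick sketch; the kit speeds are the examples mentioned above):

```python
# 1:1 ("coupled") mode on DDR4 Ryzen: FCLK = memory MT/s / 2,
# because DDR moves two transfers per memory-clock cycle.
for kit_mts in (3200, 3600, 3733):   # example kit speeds from the thread
    fclk_mhz = kit_mts // 2
    print(f"DDR4-{kit_mts} -> set Infinity Fabric (FCLK) to {fclk_mhz} MHz")
```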
Nice video, great content you are putting out there. Keep up the great work
Very good. Thank you!
Nice video, though one test which I think would be interesting is to limit the FPS of the 4090 and 7900 XTX to the same frame rate (as high as the lower one can produce) and see how the frame times compare.
He could, but it would be pointless for real-world statistics. Nobody is going to willingly limit the amount of FPS they can get when they're spending thousands on the hardware. It would be cool to compare, but ultimately useless
@@marcnhunter I limit my FPS in games like Overwatch and CSGO to 162 or so FPS so my GPU isn't needlessly wasting energy. Running 600fps is pointless. Also, capping your FPS reduces input lag: frame capping eliminates the render queue being sent to the GPU. Capping to your refresh rate minus 3-5fps + G-Sync is the best for latency. Consistent latency = better muscle memory and aim tracking.
Funny how I willingly limit the amount of FPS I can get while spending thousands on hardware so that I have competitive consistency in my latency@@marcnhunter
Very good review, and very good graphics to represent the collected data. I'd be interested in a video on AMD non-cache and cache CPUs, showing how memory speed and timings affect performance. And the main question is whether the 3D cache CPUs benefit from faster memory. Thanks!
Would love to have an analysis of RAM as well when it comes to RAM speed AND latency, for example DDR4 at CL15 vs DDR5 at CL36
And 7200 RAM speed and beyond for intel.
Not much of a difference at all tbh
Great data!
These graphs are great and very easy to understand, what software are you using?
Amazing video!
Great video, would love to know what monitor stand you are using!
With Process Lasso you can let the game run on the V-Cache CCD while running the background tasks on the other CCD, for optimal performance on the 7950X3D. Huge difference
Interesting with this different take on performance.
fantastic job
Hey Optimum Tech, first of all thanks for your work, I enjoy every video! One question: does a 7800X3D perform similarly to a 7950X3D, or even better, in latency in Overwatch at 1440p? For example, 4.3ms system latency or better? 🙂
Very interesting video! I wonder how Intel Arc performs on latency.
In the 4090 vs 7900XTX test, was reflex enabled on the 4090?
This test but everything on low and all the GPUs capped to like 144fps would be interesting
comment for the algo. superb video! plz make more itx builds!
Didn't know about enabling low latency mode, thanks!
For games that don't support NVIDIA's reflex + boost, and if you are GPU-bound.
Thank you! Loved this video. Something that is bugging me is, why do you think we didn't see the same scaling w/ the increase in FPS related to the CPU that we did from the GPU?
You see the same pattern in most gaming benchmarks. The CPU has the easier task, which is quickly handing over the job of image generation to the GPU. The CPU's work takes less time even on a weaker CPU; the GPU's work is the bigger burden and takes longer even on high-end GPUs. So maybe the CPU takes 1.5ms for its job while the GPU takes 3ms. Graphics-intensive applications simply rely a lot more on the GPU, and especially in these performance tests, where the GPU is deliberately maxed out (to compare max FPS capacity), the GPU will always be the bottleneck. Of course, many games are optimized to utilize the GPU as much as possible, as this is more efficient for rendering graphics. If these tests set the framerate to 60 FPS and then measured the lag, you would see different results: you lose the benefit of higher FPS, and the better GPU's benefit would then look more similar to the better CPU's benefit. Another reason is surely that for most applications, single-core performance is the most important CPU factor, and it changes only slightly between CPU generations; most progress in CPU performance comes from adding more cores. Since Pentium 4 days (near 4GHz), single-core clocks simply did not develop as they had up until then (we are at 5.5GHz in the high end now). Since graphics-intensive applications push a lot of data through the CPU, we see the X3D processors with 96 MB of L3 cache win a lot: the additional cache speeds up the CPU's job. So there is only so much for the CPU to do, and it is probably not maxed out in such tests; the best way to go faster is better single-core performance or more cache, both of which are scarce. The GPU is being maxed out, and is hence the most important factor.
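A crude way to picture the reply above: per-frame latency is roughly the sum of the CPU and GPU stage times, plus one whole GPU frame-time for every frame already waiting in the render queue. The numbers below are the illustrative 1.5ms/3ms figures from that comment, not measurements:

```python
# Toy render-pipeline model (illustrative numbers, not measurements):
# latency ~= CPU time + GPU time + (frames queued ahead) * GPU time.
def frame_latency_ms(cpu_ms, gpu_ms, queued_frames):
    return cpu_ms + gpu_ms + queued_frames * gpu_ms

# GPU at 100% with a full render queue (2 frames queued ahead):
print(frame_latency_ms(cpu_ms=1.5, gpu_ms=3.0, queued_frames=2))  # 10.5
# Same hardware with the queue drained (fps cap / Reflex-style pacing):
print(frame_latency_ms(cpu_ms=1.5, gpu_ms=3.0, queued_frames=0))  # 4.5
```

This also ties back to the fps-cap threads above: a cap mostly works by keeping `queued_frames` near zero, not by making either stage faster.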
It would be incredibly informative if you could also make something covering software settings/tweaks for the least input lag on the same specs: NVIDIA settings and/or Windows tweaks and custom Windows builds
Me with 650 ms reaction time: I must get these and I'll dominate.
Add 3 beers to this. Turn on ranked game in Counter Strike and annoy everyone that they have an aimbot :D
650ms? Are you in your 80's?
I'm slowish, and I get around 280ms; you must be lower than 650, maybe if you're counting ping too, which would make sense!
Would be nice to do a comparison with a frame rate cap. This basically just shows that more FPS gives less latency, which is fair. But are there differences if we're capped to, for example, 225fps in Apex?
Optimum, you do measuring of performances that no one else does and you are one of the only pro gamer youtubers that is enough into tech to put the effort in of this level. Insanity. Love your content, dedication and minimalist style.
One thing I'm a bit confused about is whether it's the newer architecture or the higher framerate of newer CPUs & GPUs that causes lower input lag. For instance, if the 1060 and the 3090 Ti were capped at the same fps (say 60), would the 3090 Ti still have lower input latency? If it doesn't, you could argue that lowering all settings and playing at the lowest resolution possible should decrease input latency, and that switching hardware, or even comparing hardware, is not necessary if frame rate is the main factor.
Yes, thank you, this is what I was looking for
I love these tests! Would it be possible to check 1080p gaming with games on Low/Esports settings to see how input lag is affected at very high framerates and gpu % usage?
Yes it is possible. Go ahead and do it yourself.
@@thetranya3589 Bruh, you expect us to have these GPU's to test?
@@thetranya3589 🤡
Yeah we need the 1080p tests
What about the 7900X3D? I know it's a bit slower than its 7950X3D counterpart, but it would be nice to have data from that too
Very impressive. Can you make a video about power consumption? And the RTX 4090 hasn't melted yet?
Not only do I like your content, I also like that, unlike other similar creators, we can actually tell you're in good shape. Keep up the great vids man.
Good stuff cheers.
I'm just wondering, was SAM enabled on the 7900 XTX during its latency test?
What is your primary monitor for gaming?
Many motherboards support PBO for the 5800X3D since the latest BIOS update; it would be great if you included results with it next time.
Just add 2-3% and there you have the results with CO. No need to work people to death for every irrelevant, minuscule detail you can think of. (If you're still in doubt, I have the 5800X3D myself and can confirm: CO does help keep the chip at a locked 4.45GHz (max) instead of the usual 4.1-4.2GHz; in gaming that translates to just small, minuscule gains of less than 5%)
@@ok.1590 Tried BCLK OC? Might get like 3-4% more.
Yes! Exactly what i did on my B550-E gaming
@@ok.1590 I think he is having fun gathering all this data.
Mine does not, but I was able to use PBOTuner2 with CO -30, and it is working flawlessly after I set up automation of the exe via the Task Scheduler.
God bless you my man
Hard-hitting, critical information that no one else is covering. Superb.
Did you run these tests on Valorant with the different GPUs?
This should've included RAM optimization and how different RAM speeds and overclocks affect PC latency! That was the one thing I was hoping to see in this video
I'm interested in why you don't test everything on the lowest settings. In an environment where all we're interested in is latency / frames, why use medium / high settings? Especially for esports titles, e.g. Overwatch. I and most others who care about latency in esports titles are using pretty much the lowest of lows, so I feel like it would only make more sense
If you're really concerned about latency you should get one of those good old CRT monitors you got rid of in the early 2000s. Not that I really recommend doing so, but this will probably give you the biggest leap in decreasing the lag, as those displays have absolutely zero input lag.
Oled is expensive, but it's also really good for input lag without looking like crap.
We have 500Hz monitors today (some of them even deliver). That will be significantly better than even a 120Hz CRT with 0 input lag, just because you save about 7ms per image from the GPU to it being on the display.
The power consumption of the 7900 xtx is less than the 4090 btw
What are the best mouse skates for the Logitech Superlight? I play Rust and I am ordering the Saturn Pro and an Artisan Zero mouse pad, plz help
You are the GOAT, undoubtedly the best tech channel
It makes more sense to set all settings to low. I see you tested some games on medium / high settings. Competitive gamers who care about input latency will always put everything on low and play at 1080p or lower, and are therefore mostly CPU limited. They are rarely, almost never, GPU limited like in your tests.
You should talk more about RAM latency and memory overclocking. You could easily push the latency down with good, overclocked memory.
So lower latency is directly correlated with high fps. And can DDR5 with good timings improve on DDR4's latency?
And now pair it with a true esports screen, not 1440p but 1080p, either the Zowie XL2566K or the new 480/500Hz screens, set the games to COMPETITIVE settings (mostly low details for the highest fps, effects disabled except the ones that give you an advantage), and you will see that any mid-range GPU is then enough and the CPU/RAM/display matter more. The GPU latency is simple: whenever the GPU render queue is more than 90 percent full, the input lag skyrockets. So reducing details and resolution results in huge savings on GPUs, which are now ridiculously expensive. Also, most esports titles are CPU limited and not GPU limited: CSGO, League of Legends, Dota, partially Fortnite, Rainbow Six, PUBG are all CPU bottlenecked (some only at the recommended competitive settings, some at all settings), and the GPU doesn't help much unless you have some really garbage-tier GPU. I'm playing these games on a budget 280Hz Asus VG249QM, and I'm planning to upgrade only my CPU in the coming months if the (not) coming Source 2 update doesn't bring me more fps in some CSGO maps where my CPU struggles to deliver a stable 280fps minimum. Btw, you didn't try locking fps at some sensible amount; this does miracles for GPU latency, as the render queue is never spammed with infinite frame requests.
Doesn't Reflex for Nvidia and Anti-Lag for AMD make the fps cap obsolete? I'm pretty sure that's what he's saying, and what he presented in earlier videos on input lag as well.
Damn, even then 1080p is too high lol. I mean... look at the res CSGO players play at!
This
We need 1080p benchmarks
Hi Optimum Tech, can I ask what type of WiFi antenna you use? Is the Pulse W5029 good, or do you have other recommendations?
How did you analyse the latency of the 7900 XTX? Does the Reflex analyser work with AMD GPUs?
gods work 👌
I used to have 200ms latency on my old laptop. I really, really hated it, but didn't even notice the latency till I got something better.
Could you also test whether an all-core OC setup is really better for system latency than, for example, a 7700X PBO -30 setup? In my opinion it's one of the biggest PC myths ever.
So the highest end components deliver the best results. Mindblowing
The point was to see if there was any difference between AMD and Nvidia, and between AMD and Intel. Everyone knows the higher the fps, the lower the latency.
Most underrated thing! It makes me mad how much people overlook this
How did you benchmark OW2 exactly? I just ran some benchmarks using PresentMon and my 1% lows are absolutely identical (0.31fps diff) 🤔 13700K at 5.7GHz P-core with HT off and a 50 ring ratio, vs 5.6GHz P-core with HT on, 4.4GHz E-core and a 48 ring ratio. Using 6400CL32
What about the lowest power consumption while keeping the framerate above 60fps at 1440p or 4K?