I'm 100% sure that if the majority of people in here claiming they can see the difference were actually tested, they'd fail. Something like:
- refresh rates: 60 Hz, 120 Hz, 144 Hz, 165 Hz, 200 Hz
- multiple game scenes and clips:
  - varying FPS ranging from 29 to 320 fps
  - quiet and busy (not much stuff happening vs. a lot of stuff happening)
  - slow and fast camera or background movements
Take the Cartesian product of those dimensions and play each participant a random subset of the combinations, maybe 20 or so (see the sketch below).
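A minimal sketch in Python of how such a randomized trial could be assembled. The refresh rates and scene/camera conditions come from the list above; the specific FPS sample points are my own illustrative picks from the 29-320 range:

```python
import itertools
import random

# Test dimensions (values from the list above; FPS points are
# illustrative samples from the 29-320 fps range)
refresh_rates = [60, 120, 144, 165, 200]   # Hz
fps_values = [29, 60, 120, 240, 320]       # fps
scene_busyness = ["quiet", "busy"]
camera_motion = ["slow", "fast"]

# Cartesian product of every possible condition
all_conditions = list(itertools.product(
    refresh_rates, fps_values, scene_busyness, camera_motion
))

# Each participant gets a random subset, e.g. 20 trials
trials = random.sample(all_conditions, k=20)
for hz, fps, scene, motion in trials:
    print(f"Show clip: {hz} Hz display, {fps} fps, {scene} scene, {motion} camera")
```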
It's just like screen resolution. If you sit at arm's length or further from your screen (which you should) and increase its resolution, everything becomes smaller (icons, text, images). That means you'll have to scale them up to get back to the same size they were at the lower resolution.
Also, at a certain distance, you become unable to spot details below a certain size --> you physically will not be able to see the difference between 1080p, 2K, and 4K from that distance. It's called visual acuity. I bet you, if you ran a similar test as above with video resolution, screen resolution, screen size, and distance from the screen, the majority would do much worse than they think they can.
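To make the acuity point concrete, here's a rough back-of-the-envelope check in Python. It uses the standard ~1 arcminute resolution figure for 20/20 vision; the 27-inch screen size and ~700 mm viewing distance are illustrative assumptions, not from the post:

```python
import math

# 20/20 vision resolves roughly 1 arcminute of detail (standard figure)
ARCMINUTE_RAD = math.radians(1 / 60)

def smallest_resolvable_mm(distance_mm: float) -> float:
    """Smallest detail (mm) a 20/20 eye can resolve at a given distance."""
    return distance_mm * math.tan(ARCMINUTE_RAD)

def pixel_size_mm(diagonal_in: float, width_px: int, height_px: int) -> float:
    """Physical size of one pixel for a screen of the given diagonal."""
    diagonal_mm = diagonal_in * 25.4
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_mm / diagonal_px

distance = 700  # ~arm's length in mm (illustrative)
for name, w, h in [("1080p", 1920, 1080), ("2K", 2560, 1440), ("4K", 3840, 2160)]:
    px = pixel_size_mm(27, w, h)  # 27-inch screen (illustrative)
    visible = px >= smallest_resolvable_mm(distance)
    print(f"{name}: pixel {px:.3f} mm -> "
          f"{'resolvable' if visible else 'below acuity limit'} at {distance} mm")
```

On those numbers, a 4K pixel on a 27-inch screen (~0.156 mm) is already smaller than what a 20/20 eye can resolve at arm's length (~0.204 mm), while 1080p and 2K pixels are not, which is exactly the effect described above.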
It's mostly marketing and "bigger number = better" thinking.