Can someone please explain why CRT is 0 blur and 0 latency when it literally draws each pixel one by one with an electron beam scanning across the screen line by line?
Because it draws those "pixels" as the signal reaches the monitor. When half of a frame has been transmitted to a CRT monitor, it's basically halfway done making it visible.
An LCD monitor needs to wait for the entire frame to arrive before it can be processed and then made visible.
Sometimes the monitor will even wait for several frames to arrive before processing them, which enables some temporal processing. Putting a monitor in gaming mode disables (some of) this.
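To make the difference concrete, here's a toy timing model of when a pixel partway down the screen becomes visible under each scheme. The model and the numbers are simplifying assumptions for illustration, not measurements:

```python
FRAME_TIME_MS = 1000 / 60  # 60 Hz refresh -> ~16.7 ms per frame

def crt_pixel_latency(y_fraction: float) -> float:
    """CRT: a pixel lights up the moment its part of the signal arrives,
    so its latency just scales with its position in the scan."""
    return y_fraction * FRAME_TIME_MS

def buffered_lcd_pixel_latency(y_fraction: float) -> float:
    """Naive buffered-LCD model from above: wait a full frame time for
    the whole frame to arrive, then scan it out over the next frame."""
    return FRAME_TIME_MS + y_fraction * FRAME_TIME_MS

for y in (0.0, 0.5, 1.0):  # top, middle, bottom of the screen
    print(f"y={y:.1f}: CRT {crt_pixel_latency(y):4.1f} ms, "
          f"buffered LCD {buffered_lcd_pixel_latency(y):4.1f} ms")
```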
If that's how TFTs worked, we wouldn't have vsync settings in games.
No? Afaik vsync prevents the GPU from sending half-drawn frames to the monitor, not the monitor from displaying them. The tearing happens in the GPU buffer. Edit: read the edit below.
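Here's a toy sketch of where the tear comes from, assuming the simplest possible double-buffered setup; it's purely illustrative, not a real graphics API:

```python
# The "display" scans the front buffer line by line while the "GPU"
# may swap buffers at any time.
front = ["old"] * 4  # 4 scanlines still holding the previous frame
back = ["new"] * 4   # the freshly rendered frame

screen = []
for line in range(4):
    if line == 2:  # without vsync, the swap can land mid-scanout
        front, back = back, front
    screen.append(front[line])

print(screen)  # ['old', 'old', 'new', 'new'] -> a torn image
```

With vsync, the swap waits for the vertical blank, so every scanout reads one complete frame and the tear never reaches the screen.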
Though I'm not sure how valid the part about latency is. In the worst-case scenario (the transfer of a frame taking the whole previous frame time), the latency of an LCD can only be double that of a CRT at the same refresh rate, which 120+ Hz already compensates for. And as for the inherent latency of the screen, most gaming LCD monitors have less than 5 ms of input lag, while a CRT on average takes half the frame time to display a pixel, so 8 ms at 60 Hz.
Edit: thought this over again. On a CRT those two happen simultaneously, so the total latency is 8 ms + pixel response time (which I don't know the value of). On LCDs, the transfer time should be (video stream bandwidth / cable bandwidth) × frame time, and that runs in series with the time to display the frame, which is frame time / 2 + pixel response time. Which could exceed the CRT's latency.
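Plugging example numbers into that model (the bandwidth ratio and response times below are made-up values; only the formulas come from the reasoning above):

```python
frame_time = 1000 / 60  # ms per frame at 60 Hz
crt_response = 0.1      # ms, assumed phosphor response time
lcd_response = 3.0      # ms, assumed panel response time
bw_ratio = 0.8          # assumed video stream bandwidth / cable bandwidth

crt_latency = frame_time / 2 + crt_response
lcd_latency = bw_ratio * frame_time + frame_time / 2 + lcd_response

print(f"CRT: {crt_latency:.1f} ms, LCD: {lcd_latency:.1f} ms")
# CRT: 8.4 ms, LCD: 24.7 ms -- under these assumptions the LCD does
# exceed the CRT, which is the point of the edit above.
```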
BUT I took the input lag number from my monitor's RTINGS page, and looking into how they measure it, it seems to include both the transfer time and frame time / 2, and it's somehow still below 5 ms? That's weird to me, since for that the transfer would either need to happen within <1 ms (impossible), or the entire premise was wrong and LCDs do start drawing before the entire frame reaches them.
Although I'm pretty sure that's still not the cause of tearing, which happens because a frame is progressively rendered and written to the buffer, not because it's progressively transferred or displayed.