[–] frezik@midwest.social 23 points 5 months ago* (last edited 5 months ago) (2 children)

Of course there are buffers. Once RAM got cheap enough to hold a buffer representing the whole screen, everyone did that. That was in the late '80s/early '90s.

There are some really bad misconceptions about how latency works on screens.
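To make that concrete, here's a minimal sketch of what such a framebuffer looked like to software. The address and mode are the real VGA mode 13h conventions (320x200, 8-bit palette, memory-mapped at 0xA0000), but a DOS-era flat-memory environment is assumed:

```c
/* Sketch: a memory-mapped framebuffer in the style of VGA mode 13h.
 * Assumes a DOS-era environment where 0xA0000 is directly addressable. */
#include <stdint.h>

#define FB_WIDTH  320
#define FB_HEIGHT 200

static volatile uint8_t *const framebuffer =
    (volatile uint8_t *)0xA0000; /* VGA graphics memory window */

/* Set one pixel. The video hardware scans this memory out to the
 * CRT continuously, so the change appears on the next refresh. */
void put_pixel(int x, int y, uint8_t color)
{
    framebuffer[y * FB_WIDTH + x] = color;
}
```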

[–] HackerJoe@sh.itjust.works 8 points 5 months ago

Those buffers are on the graphics adapter, not in the CRT.
You can update the framebuffer faster than the CRT can draw it; that's when you get tearing. Same VSync then as now.
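Roughly what that VSync wait looked like in the VGA era, as a sketch: the status port and retrace bit are real VGA details (input status register 1 at 0x3DA, bit 3 = vertical retrace), but inb() and draw_frame() are assumed helpers, not standard C:

```c
/* Sketch: wait for vertical retrace before touching the framebuffer,
 * so a half-written frame is never scanned out. */
#include <stdint.h>

extern uint8_t inb(uint16_t port); /* assumed port-I/O helper */
extern void draw_frame(void);      /* hypothetical: writes the framebuffer */

#define VGA_STATUS_1 0x3DA  /* VGA input status register 1 */
#define VRETRACE_BIT 0x08   /* bit 3: vertical retrace in progress */

void render_loop(void)
{
    for (;;) {
        /* Without this wait, draw_frame() races the scan-out and the
         * CRT shows parts of two different frames at once: tearing. */
        while (!(inb(VGA_STATUS_1) & VRETRACE_BIT))
            ; /* busy-wait until the beam is off-screen */
        draw_frame();
    }
}
```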

[–] __dev@lemmy.world 2 points 5 months ago (1 children)

CRTs (apart from some exceptions) did not have a display buffer. The analog display signal directly controls the output of each electron gun in the CRT, with no digital processing in between. The computer on the other end does have display buffers, just as computers do now; eliminating extra buffers (like those added by modern monitors) does reduce latency.
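As a concrete illustration that the CRT just follows the analog timing, here's a small standalone calculation using the standard 640x480@60 VGA numbers (25.175 MHz pixel clock, 800 clocks per line including blanking, 525 lines per frame):

```c
/* Sketch: derive line and frame times purely from the signal timing,
 * which is all the CRT ever sees. */
#include <stdio.h>

int main(void)
{
    const double pixel_clock_hz = 25175000.0; /* 640x480@60 VGA */
    const int clocks_per_line = 800;          /* active + blanking */
    const int lines_per_frame = 525;

    double line_time_us  = clocks_per_line / pixel_clock_hz * 1e6;
    double frame_time_ms = line_time_us * lines_per_frame / 1000.0;

    /* prints roughly: line: 31.78 us, frame: 16.68 ms (59.94 Hz) */
    printf("line: %.2f us, frame: %.2f ms (%.2f Hz)\n",
           line_time_us, frame_time_ms, 1000.0 / frame_time_ms);
    return 0;
}
```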

[–] frezik@midwest.social -1 points 5 months ago

Doesn't matter. Once there's a buffer, either the buffer must be complete before it's displayed (adding latency), or you display it mid-update and get screen tearing. It wasn't like racing the beam.
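For illustration, a sketch of the "buffer must be complete" path: classic double buffering with a page flip at vblank. The hardware hooks here are hypothetical, standing in for whatever the adapter actually exposes:

```c
/* Sketch: render a full frame off-screen, then flip during vblank so
 * the scan-out never shows a half-drawn frame. The cost is that a
 * finished frame waits in the buffer until the next retrace. */
#include <stdint.h>

#define FB_PIXELS (320 * 200)

static uint8_t buffers[2][FB_PIXELS];
static int back = 0; /* index of the buffer we draw into */

extern void render_scene(uint8_t *fb);     /* hypothetical */
extern void wait_for_vblank(void);         /* hypothetical */
extern void set_scanout_base(uint8_t *fb); /* hypothetical: page flip */

void frame(void)
{
    render_scene(buffers[back]);     /* draw the complete frame first */
    wait_for_vblank();               /* beam is off-screen */
    set_scanout_base(buffers[back]); /* flip: show the finished buffer */
    back ^= 1;                       /* old front buffer becomes the canvas */
}
```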