My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.

[–] jsdz@lemmy.ml 3 points 1 year ago* (last edited 1 year ago) (1 children)

It comes directly from television. Early home PCs used televisions as displays, and by the 1980s TVs were generally capable of 60 fps (or 50 for regions that used PAL), so that's what the computers generated. Everyone got used to it. And of course, as everyone else has said, you don't want to add extra latency in games by falling short of that basic standard.
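
As a rough back-of-the-envelope sketch of the latency point (in Python, not from the original comment): the frame time alone puts a floor on how quickly a game can show the result of an input, and halving the frame rate doubles that floor.

```python
# Illustrative only: frame time is the worst-case wait for the next
# frame to reflect an input, before any other pipeline latency is added.
for fps in (30, 60, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps} fps -> {frame_time_ms:.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```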

[–] SwingingTheLamp@midwest.social 4 points 1 year ago (1 children)

Technically, NTSC video does 60 fields per second (PAL does 50), because the video signal is interlaced. That is, the beam sweeps from top to bottom of the screen 60 times per second, but it only draws half of the horizontal scan lines per sweep, alternating between the odd-line field and the even-line field. That's why we considered "full motion video" to be 30 frames per second back in the day. The alternating fields did make movement appear smoother, but the clarity wasn't great.
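
A toy sketch of that field/frame relationship (illustrative numbers only, assuming roughly 480 visible lines; not part of the original comment):

```python
# Each vertical sweep draws one field (every other scan line),
# so two sweeps are needed to build one complete frame.
FIELDS_PER_SECOND = 60                    # NTSC vertical sweeps per second
LINES_PER_FRAME = 480                     # approximate visible lines
LINES_PER_FIELD = LINES_PER_FRAME // 2    # odd lines one sweep, even lines the next

full_frames_per_second = FIELDS_PER_SECOND / 2
print(f"{FIELDS_PER_SECOND} fields/s of {LINES_PER_FIELD} lines "
      f"= {full_frames_per_second:.0f} complete {LINES_PER_FRAME}-line frames/s")
```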

VGA originally doubled NTSC's 15.75 kHz horizontal scan rate to 31.5 kHz, making the beam fast enough to draw all of the lines in a single vertical sweep, so it could display 60 full frames per second at a 60 Hz refresh rate. Prior to that, a lot of games were just 30fps, because interlaced video tended to flicker on bitmapped graphics.
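
A quick check of that scan-rate arithmetic (nominal figures, ignoring NTSC's actual 59.94 Hz; this sketch is not from the original comment):

```python
# Lines drawn per vertical sweep = horizontal sweeps per second / vertical sweeps per second.
ntsc_hsync_hz = 15_750   # NTSC horizontal scan rate
vga_hsync_hz = 31_500    # VGA roughly doubles it
vertical_hz = 60         # vertical sweeps per second in both cases

print("NTSC:", ntsc_hsync_hz / vertical_hz, "lines per sweep")  # 262.5 -> half of a 525-line frame
print("VGA: ", vga_hsync_hz / vertical_hz, "lines per sweep")   # 525.0 -> a whole frame every sweep
```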

[–] jsdz@lemmy.ml 1 points 1 year ago* (last edited 1 year ago)

VGA might've done that to get better resolution at 60 Hz, but I'm pretty sure earlier systems, including CGA and the Amiga, did 60 fps non-interlaced video at lower resolutions. The Amiga, at least, also had a higher-resolution interlaced video mode, but it was mostly used for displaying impressive-looking static images.