My background is in telecommunications (the technical side of video production), so I know that 30fps is (or was?) considered the standard for a lot of video. TV and movies don’t seem choppy when I watch them, so why does doubling the frame rate seem to matter so much when it comes to games? Reviewers mention it constantly, and I don’t understand why.
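
For reference, the raw arithmetic of that doubling (a quick sketch, nothing game-specific): frame rate is frames per second, so going from 30 to 60 halves the time each frame spends on screen.

```python
# Per-frame time at the two rates mentioned above (illustrative only).
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```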

jsdz@lemmy.ml:

VGA might've done that to get better resolution at 60 Hz, but I'm pretty sure earlier systems, including CGA and the Amiga, did 60 fps non-interlaced video at lower resolutions. The Amiga, at least, also had a higher-resolution interlaced video mode, but it was mostly used for displaying impressive-looking static images.
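
Rough sketch of what interlacing means in this context (the line count is made up for readability, not the Amiga's real numbers): each refresh only scans every other line, so a complete frame takes two refreshes.

```python
# Interlaced vs. progressive scanning at a nominal 60 Hz refresh (illustrative only).
lines = 6

even_field = list(range(0, lines, 2))  # refresh 1 draws lines 0, 2, 4
odd_field = list(range(1, lines, 2))   # refresh 2 draws lines 1, 3, 5

print("interlaced, refresh 1:", even_field)
print("interlaced, refresh 2:", odd_field)
print("progressive, every refresh:", list(range(lines)))
# Interlaced: a full frame needs two refreshes (60 Hz fields -> 30 complete frames/s),
# which is part of why such modes suited static images better than fast motion.
```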