zarenki@lemmy.ml · 2 points · 6 months ago

Unix time is far less universal in computing than you might hope. A few exceptions I'm aware of:

  • Most real-time clock hardware stores the date and time as separate binary-coded decimal fields, one byte each for the month, day, hour, minute, and second, and usually a two-digit BCD year as well (which gives a year 2100 limit; see the decoding sketch after this list).
  • Python's datetime, the Win32 SYSTEMTIME structure, Java's LocalDateTime, and MySQL's DATETIME similarly keep the year, month, day, etc. as separate fields.
  • NTFS stores a 64-bit number representing time elapsed since the year 1601 in 100-nanosecond resolution for things like file creation time.
  • NTP uses an epoch of midnight 1900-01-01, with unsigned seconds elapsed and an unusual base-2 fractional part.
  • GPS uses an epoch of midnight 1980-01-06, with a week number and the time within the week as separate values (a rough conversion sketch for these last three epoch-based formats follows the list).
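
To make the first bullet concrete, here's a minimal Python sketch of decoding packed-BCD RTC registers into a calendar date. The register order and the fixed 20xx century are assumptions for illustration; real chips differ in register layout and masks.

```python
# Sketch: decode packed-BCD RTC registers into a calendar date.
# Register order is illustrative; actual chips vary.

def bcd_to_int(b: int) -> int:
    """Convert one packed-BCD byte (e.g. 0x59) to an integer (59)."""
    return (b >> 4) * 10 + (b & 0x0F)

def decode_rtc(regs: bytes) -> tuple:
    """Assume regs = [seconds, minutes, hours, day, month, year] in BCD."""
    sec, minute, hour, day, month, year = (bcd_to_int(r) for r in regs)
    # Only two BCD digits for the year, hence the 2000-2099 range
    # and the year 2100 rollover mentioned above.
    return (2000 + year, month, day, hour, minute, sec)

print(decode_rtc(bytes([0x30, 0x59, 0x23, 0x17, 0x04, 0x24])))
# -> (2024, 4, 17, 23, 59, 30)
```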
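
And a rough sketch of how the three fixed-epoch formats relate to Unix time. The constants are the standard epoch offsets; leap seconds are ignored throughout, which a real GPS conversion in particular cannot afford to do.

```python
# Sketch: convert a Unix timestamp to the other fixed-epoch formats.
# Leap seconds are ignored (GPS time does not apply them, so real
# conversions need an offset table on top of this).

UNIX_TO_NTFS_EPOCH = 11_644_473_600   # seconds from 1601-01-01 to 1970-01-01
UNIX_TO_NTP_EPOCH  = 2_208_988_800    # seconds from 1900-01-01 to 1970-01-01
GPS_EPOCH_UNIX     = 315_964_800      # Unix timestamp of 1980-01-06 00:00:00

def to_ntfs_filetime(unix: float) -> int:
    # 64-bit count of 100-nanosecond intervals since 1601
    return int((unix + UNIX_TO_NTFS_EPOCH) * 10_000_000)

def to_ntp(unix: float) -> tuple:
    # 32-bit seconds since 1900 plus a 32-bit base-2 fraction
    ntp = unix + UNIX_TO_NTP_EPOCH
    seconds = int(ntp)
    fraction = int((ntp - seconds) * 2**32)
    return seconds & 0xFFFFFFFF, fraction   # the 32-bit mask is the 2036 rollover

def to_gps(unix: float) -> tuple:
    # GPS week number plus seconds into the week (leap seconds ignored)
    elapsed = unix - GPS_EPOCH_UNIX
    return int(elapsed // 604_800), elapsed % 604_800

print(to_ntfs_filetime(0))   # 116444736000000000
print(to_ntp(0))             # (2208988800, 0)
print(to_gps(315_964_800))   # (0, 0.0)
```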

Converting between time formats is a common source of bugs, and each format overflows in a different way. Depending on the format, a time value might overflow in the year 2036, 2038, 2070, 2100, 2156, or 9999.
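
A quick check of where two of those years come from (assuming UTC and ignoring leap seconds):

```python
from datetime import datetime, timedelta, timezone

unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
ntp_epoch  = datetime(1900, 1, 1, tzinfo=timezone.utc)

# Signed 32-bit time_t: last representable second.
print(unix_epoch + timedelta(seconds=2**31 - 1))  # 2038-01-19 03:14:07+00:00
# 32-bit NTP seconds field: end of era 0.
print(ntp_epoch + timedelta(seconds=2**32 - 1))   # 2036-02-07 06:28:15+00:00
```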

Also, Unix time is often paired with a separate nanoseconds component for higher resolution, as in C's struct timespec and in modern *nix filesystems like ext4, XFS, Btrfs, and ZFS.
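
In Python terms, that seconds-plus-nanoseconds split looks roughly like this (time.time_ns() and st_mtime_ns are the stdlib's view of it):

```python
import os
import time

# Integer nanoseconds since the Unix epoch, split into
# timespec-style (tv_sec, tv_nsec) fields.
ns = time.time_ns()
sec, nsec = divmod(ns, 1_000_000_000)
print(sec, nsec)

# Filesystems with nanosecond timestamps expose them per file.
st = os.stat(".")
print(st.st_mtime_ns)
```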