this post was submitted on 31 Jan 2024
55 points (96.6% liked)

What do package managers do? Install packages, obviously! But that is not everything. In my opinion, package managers do enough to be characterized as general automation frameworks. For example:

  • manage configurations and configuration files
  • manage custom compilation options and flags
  • provide isolation or containerization
  • make sure a specific file is present in a specific place given specific conditions
  • change installation files or configuration based on architecture or other conditions

Not all package managers do all of the above, but you get the idea.

Nix even manages your entire setup with a single configuration file.
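For example, a single NixOS configuration.nix declares packages, services, and their settings side by side. A minimal sketch (the hostname and package choices here are just placeholders):

```nix
{ config, pkgs, ... }:
{
  # Placeholder hostname for illustration.
  networking.hostName = "example";

  # System packages are declared in the same file as everything else.
  environment.systemPackages = with pkgs; [ git htop ];

  # Services and their configuration live here too.
  services.openssh.enable = true;
}
```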

It occurred to me that package management could, in theory, be handled by an automation framework.

What do I mean by automation framework? Ansible, Chef, Puppet, or Sparrow.

Now imagine using a package manager as a general automation framework. For most of them, it would suck (Nix is a notable exception). So maybe common package managers are just bad automation frameworks?

What if we used an automation framework as a package manager? Right now, it might also suck, but only because it lacks the package definitions. Maybe it would not be a bad experiment to have a distribution managed by a modern automation framework like Sparrow.
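To make that concrete, here is roughly what "automation framework as package manager" could look like in Ansible (one of the frameworks above; I'm not fluent enough in Sparrow to sketch it there, and the package name, paths, and conditions below are just placeholders):

```yaml
# Sketch of a playbook doing package-manager-style work declaratively.
- hosts: all
  become: true
  tasks:
    - name: Ensure nginx is installed
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Put a specific config file in a specific place under specific conditions
      ansible.builtin.template:
        src: nginx.conf.j2
        dest: /etc/nginx/nginx.conf
      when: ansible_architecture == "x86_64"
      notify: Reload nginx

  handlers:
    - name: Reload nginx
      ansible.builtin.service:
        name: nginx
        state: reloaded
```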

What are the benefits?

  • usable on other distributions
  • more easily create your own packages on the fly
  • greater customization and configurability
  • define packages and other functions in a known, readable programming language instead of a DSL
  • your package manager can easily automate just about any task using the same syntax and framework
[–] MajorHavoc@programming.dev 4 points 9 months ago (1 children)

I think a big part of what you're getting at is that we (as humanity) deserve a pull-model, human-readable, platform-independent, outcome-descriptive / desired-state / idempotent computer configuration standard (an RFC).

Quite a few package managers and orchestration tools almost get us there. Ironically, since package managers have been around longer, on average, many of them scratch that itch as well as or better than many of the orchestration platforms.

I've done it both ways, many times, and I currently prefer a poor-fit-to-task orchestration tool over an excellent fit-to-task installer package. And I'll take a well-written Makefile over both of the others combined. That may just be my "old man yells at cloud" energy, though.

[–] bitwolf@lemmy.one 2 points 9 months ago (1 children)

Funny you brought up Makefiles.

We've been churning through Java technical debt for the past year, and a huge pain point is that a lot of configuration gets lost inside IntelliJ.

Most of this is env vars and JVM args. These could be wonderfully documented using a .env.example and a well-written Makefile.

As a mid-career technology professional, just before reading this comment and thread I had the thought:

"Make files really are the only correct way to distribute software".

Even with OCI images and, soon, Wasm components, a Makefile can still cover the constant changes in development trends. It can also wire together the bash scripts used for tasks and maintenance. Bash and make really are some of the best Swiss Army knives we have.
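For instance, a sketch of the kind of Makefile I mean (every target, variable, and script name here is hypothetical):

```make
# .env.example (committed) documents every variable; .env (git-ignored) holds real values:
#   APP_PROFILE=dev
#   APP_PORT=8080

-include .env                 # simple KEY=value lines are also valid make assignments

JVM_ARGS ?= -Xmx2g -Dapp.port=$(APP_PORT) -Dapp.profile=$(APP_PROFILE)

.PHONY: run test backup

# (recipe lines below must be indented with tabs)
run:        ## run outside the IDE with the same JVM args it would use
	java $(JVM_ARGS) -jar build/libs/app.jar

test:
	./gradlew test

backup:     ## make also wires up the odd maintenance script
	./scripts/backup-db.sh
```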

[–] corsicanguppy@lemmy.ca 0 points 9 months ago

> We've been churning through Java technical debt for the past year, and a huge pain point is that a lot of configuration gets lost inside IntelliJ.

You've lost Single Source of Truth.

> “Makefiles really are the only correct way to distribute software.”

Nope. They're the best way to build your immutable artifacts. Packaging, though, should not happen inside your Makefile but in a packaging layer that sits outside it: horse before the cart, never after. I say this knowing that older/crippled packaging formats and processes get this wrong. We have three decades of knowledge to leverage and we still get dreck like IP5, but it's a revelation once you understand you need to keep the layers distinct.
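To illustrate the layering, here is a heavily trimmed, hypothetical RPM spec (names and paths are placeholders): the packaging layer calls make, never the other way around.

```spec
Name:           example-tool
Version:        1.0
Release:        1%{?dist}
Summary:        Example of packaging driven from outside the Makefile
License:        MIT
Source0:        %{name}-%{version}.tar.gz

%description
The Makefile builds the artifact; the spec owns packaging.

%prep
%setup -q

%build
make %{?_smp_mflags}

%install
make install DESTDIR=%{buildroot} PREFIX=/usr

%files
/usr/bin/example-tool
```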