this post was submitted on 02 Aug 2023
1540 points (97.0% liked)
Programmer Humor
I think I really only use GUIs if I am learning something new and trying to understand the process/concepts or if I'm doing something I know is too small to automate. Generally once I understand a problem/tool at a deeper level, GUIs start to feel restrictive.
Notable exceptions are mostly focused around observability (Grafana, New Relic, DataDog, etc.) or GitHub. I've used gh-dash before, but the web UI is just more practical for day-to-day use.
For context, I'm in SRE. I feel like 90%+ of my day is spent in Kubernetes, Terraform, or CI/CD pipelines. My coworkers tend to use Lens, but I'm almost exclusively in kubectl or the occasional k9s.
Searching a log file? I want `less`. Searching all log files? I want log aggregation lol. If I knew what I was looking for, I could grep all the log files and pipe the output to another file to aggregate them.
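The manual approach described above can be sketched roughly like this; the sample log files, directory, and error string here are all hypothetical stand-ins:

```shell
# Create a couple of hypothetical log files to search across.
mkdir -p /tmp/demo-logs
printf '2023-08-02T10:00:01Z ERROR connection refused\n' > /tmp/demo-logs/api.log
printf '2023-08-02T10:00:02Z INFO request ok\n2023-08-02T10:00:03Z ERROR connection refused\n' > /tmp/demo-logs/worker.log

# Grep every log file for the known error and pipe the hits into one file.
# -H prefixes each match with its filename, so you can still tell which
# service produced it after the results are merged.
grep -H 'ERROR' /tmp/demo-logs/*.log > /tmp/demo-logs/aggregated.log

# Sort the merged hits by timestamp (the field after the "file:" prefix)
# to get a rough cross-service timeline.
sort -t: -k2 /tmp/demo-logs/aggregated.log
```

This works fine when all the files are on one box; the comments below get into why it breaks down once they aren't.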
The problem is that they're all on different servers. Once you use log aggregation tools like DataDog, Splunk, or Kibana you get it, but before that the benefits are hard to see: things like seeing the timestamp of when an error first appeared and then, from the same place, seeing what else happened around the same time.
If I had dozens or hundreds of servers that would make a huge difference, but for under a dozen I don't think the cost of setting all that up is worth the added benefit. Plus, if the log aggregation goes down (which I've seen happen during some really hairy issues), you're back to grepping files, so it's good to know how.
Totally. I'm talking more from the enterprise perspective. Even then, I'm not sure the cost is worth it at that scale; even with FOSS solutions, the dev hours spent setting it up might not be worth it.
One log file or all of them, I want grep or awk, maybe with find in front, and possibly some jq on top if something is logging big JSON blobs.
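A rough sketch of that find/grep/awk combination, run against hypothetical sample logs (the paths, filenames, and JSON fields are made up for illustration):

```shell
# Create a hypothetical plain-text log and a JSON log to work against.
mkdir -p /tmp/awk-demo
printf '2023-08-02T10:00:01Z ERROR disk full\n2023-08-02T10:00:02Z INFO ok\n' > /tmp/awk-demo/app.log
printf '{"ts":"2023-08-02T10:00:03Z","level":"ERROR","msg":"oom"}\n' > /tmp/awk-demo/app.json

# find feeds the matching *.log files to grep, and awk trims each hit
# down to just the timestamp and the message (fields 1, 3, and 4).
find /tmp/awk-demo -name '*.log' -exec grep -h 'ERROR' {} + \
  | awk '{ print $1, $3, $4 }' > /tmp/awk-demo/errors.txt

# For services logging big JSON blobs, jq can pull fields out instead
# (guarded here, since jq isn't installed everywhere).
command -v jq >/dev/null \
  && jq -r 'select(.level=="ERROR") | .ts + " " + .msg' /tmp/awk-demo/app.json \
  || true
```

The `find ... -exec {} +` form batches filenames into as few grep invocations as possible, which matters once you're pointing it at a big log directory.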
That's a lot slower at scale than something like Loki.
I feel you. The problem with a lot of Elastic-style document search engines is that they never let you search by very explicit terms because of how the index is built. I believe the pros outweigh the cons, but I often wish I could "drop into" grep, less, and the rest from within the log aggregation tool.
Exactly.
GitHub's UI is total garbage compared to basic git commands, though.
You can't manage pull requests, GitHub Actions, repo collaborators, permissions, or any of the dozens of other things GitHub does just from basic git commands.