this post was submitted on 03 Feb 2025
13 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

last week's thread

top 50 comments
[–] ibt3321@lemmy.blahaj.zone 4 points 1 hour ago

If only the Supreme Court had as much of a spine as the Romanian Constitutional Court

Law prohibiting driving after consuming substances declared unconstitutional. Of course, the anti-drug agency bypassed the courts and parliament to pass it anyway.

See also: Constitutional Court cancels election result after Russian interference

[–] dgerard@awful.systems 7 points 3 hours ago (1 children)
[–] khalid_salad@awful.systems 7 points 3 hours ago

AI alignment is literally a bunch of amateur philosophers telling each other scary stories about The Terminator around a campfire

I love you, David.

[–] khalid_salad@awful.systems 16 points 5 hours ago* (last edited 5 hours ago) (1 children)

I hate LLMs so much. Now, every time I read student writing, I have to wonder if it's "normal overwrought" or "LLM bullshit." You can make educated guesses, but the reasoning behind this is really no better than what the LLM does with tokens (on top of any internalized biases I have), so of course I don't say anything (unless there is a guaranteed giveaway, like "as a language model").

No one describes their algorithm as "efficiently doing [intermediate step]" unless they're describing it to a general, non-technical audience

what a coincidence

and yet it keeps appearing in my students' writing. It's exhausting.

Edit: I really can't overemphasize how exhausting it is. Students will send you a direct message in MS Teams that was obviously written with an LLM. We used to get

my algorithm checks if an array is already sorted by going through it one by one and seeing if every element is smaller than the next element

which is non-technical and could use a pass, but is succinct, clear, and correct. Now, we get^1^

In order to determine if an array is sorted, we must first iterate through the array. In order to iterate through the array, we create a looping variable i initialized to 0. At each step of the loop, we check if i is less than n - 1. If so, we then check if the element at index i is less than or equal to the element at index i + 1. If not, we output False. Otherwise, we increment i and repeat. If the loop finishes successfully, we output True.

and I'm fucking tired. Like, use your own fucking voice, please! I want to hear your voice in your writing. PLEASE.


1: Made up the example out of whole cloth because I haven't determined if there are any LLMs I can use ethically. It gets the point across, but I suspect it's only half the length of what ChatGPT would output.
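For reference, both versions above describe the same one-pass, adjacent-comparison check. A minimal Python sketch of it (the name `is_sorted` is my own illustrative choice, not from either version):

```python
def is_sorted(arr):
    # Walk the adjacent pairs once; the array is sorted
    # exactly when every element is <= the one after it.
    for i in range(len(arr) - 1):
        if arr[i] > arr[i + 1]:
            return False
    return True
```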

[–] mountainriver@awful.systems 7 points 1 hour ago (1 children)

My sympathies.

Read somewhere that the practice of defending one's thesis was established because buying a thesis was such a common practice. Scaling that up to every single text is of course utterly impractical.

I had a recent conversation with someone who was convinced that machines learn when they regurgitate text, because "that is what humans do". My counterargument was that if regurgitation is learning, then every student who crammed, regurgitated, and forgot must have learnt much more than anyone thought. I didn't get any reply, so I must assume that by reading my reply and creating a version of it in their head they immediately understood the error of their ways.

[–] Soyweiser@awful.systems 2 points 58 minutes ago

I had a recent conversation with someone who was convinced that machines learn when they regurgitate text, because “that is what humans do”.

But we know the tech behind these models, right? They don't change their weights when they produce output, right? You could have a discussion about whether updating the weights counts as learning, but it doesn't even do that, right? (Feeding the questions back into the dataset used to train them is a different mechanic.)
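To illustrate the point: generating output is a forward pass only; the weights only move if someone explicitly runs a training step. A minimal PyTorch sketch, with a tiny Linear layer standing in for an actual LLM (this is an assumption-level illustration, not how any particular vendor runs inference):

```python
import torch

model = torch.nn.Linear(4, 4)                  # stand-in for an LLM
before = [p.clone() for p in model.parameters()]

with torch.no_grad():                          # inference: no gradients, no optimizer step
    _ = model(torch.randn(1, 4))               # "producing output"

after = list(model.parameters())
print(all(torch.equal(a, b) for a, b in zip(before, after)))  # True: weights unchanged
```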

[–] NextElephant9@awful.systems 4 points 6 hours ago (3 children)

I don't know if this is the right place to ask, but a friend from the field is wondering if there are any examples of good AI companies out there? With AI not meaning LLM companies. Thanks!

[–] froztbyte@awful.systems 3 points 1 hour ago* (last edited 1 hour ago)

sounds a bit like an XY problem imo, and a good answer with examples would depend on the Y part of the question, i.e. whatever it is that (if my guess is right) your friend is actually looking to know/find

“AI” is branding, a marketing thing that a cadaverous swarm of ghouls got behind in the upswing of the slop wave (you can trace this by checking the popularity of the term in the months after deepdream), a banner with which to claim to be doing something new, a “new handle” to try to anchor anew in the imaginations of many people who were (by normal and natural humanity) not yet aware of all the theft and exploitation. this was not by accident

there are a fair few good machine learning systems and companies out there (and by dint of hype and market forces, some end up sticking the “AI” label on their products, because that’s just how this deeply fucked capitalist market incentivises things). as other posters have said, medical technology has seen some good uses, there are things like recommender[0] and mass-analysis system improvements, and I’ve seen the same in process environments[1]. there’s even a lot of “quiet and useful” forms of this that have been getting added to many daily-use systems and products all around us: reasonably good text extractors as a baseline feature in pdf and image viewers, subject matchers to find pets and friends in photos, that sort of thing. but those don’t get the headlines and the silly valuation insanity that much of the industry is in the midst of

[0] - not always blanket good, there’s lots of critique possible here

[1] - things like production lines that can use correlative prediction for checking on likely faults

[–] mii@awful.systems 3 points 3 hours ago

The only thing that comes to mind is medical applications, drug research, etc. But that might just be a skewed perspective on my end, because I know literally nothing about that industry or how AI technology is deployed there. I've just read that research has been assisted by those tools, and that seems, at least on the surface, like a good thing.

[–] FredFig@awful.systems 4 points 5 hours ago

There are companies doing "cool-sounding" things with AI, like Waymo. "Good" would require more definition.

[–] sc_griffith@awful.systems 12 points 23 hours ago (3 children)
[–] Soyweiser@awful.systems 9 points 19 hours ago* (last edited 19 hours ago)

"De colonized" is also on there, that will give some interesting problems when automated filters for this hit Dutch texts (De means the).

E: there are so many other words on there, like victim, and unjust, and equity; this will cause so many dumb problems. And of course, if you go by the original definition of politically correct ('you must express the party line on certain ideas or be punished'), they have created their own PC culture. (I know pointing out hypocrisy does nothing, but it amuses me for now.)

[–] jonhendry@iosdev.space 11 points 21 hours ago (1 children)

@sc_griffith

Apparently "Trauma" is banned. That's going to be a problem.

This is what happens when you give power to bigoted morons.

[–] jonhendry@iosdev.space 12 points 21 hours ago (1 children)

@sc_griffith

Note: They're all problems. Just “Trauma" is kind of extra-important because of its use as a medical term.

Trauma surgery, barotrauma, traumatic brain injury, penetrating trauma, blunt trauma, abdominal trauma, polytrauma, etc.

[–] Soyweiser@awful.systems 10 points 19 hours ago (2 children)

Victim and unjust are also there, which lawyers prob love never being able to use.

[–] bitofhope@awful.systems 7 points 4 hours ago (1 children)

I don't think "victim" is really a word that's even used especially much in "woke" (for lack of a better word) writing anyway. Hell, even for things like sexual violence, "survivor" is generally the preferred nomenclature, specifically because many people feel that "victim" reduces the person's agency.

It's the rightoid chuds who keep accusing the "wokes" of performative victimhood and a victim mentality, so I suppose that's why they somehow project and assume that "victim" is a particularly common word in left-wing vocabulary.

[–] Soyweiser@awful.systems 3 points 1 hour ago

Good point, hadn't even thought of that. Shows how bad they are at understanding the people they are against. Reminds me of how, a while back, they went after the military for actually reading the 'woke' literature. Only the military was doing it explicitly so they would understand their enemies and could stop them.

[–] jonhendry@iosdev.space 6 points 5 hours ago (2 children)

@Soyweiser

I’m not sure lawyers file for many NIH grants, but “victim” probably comes up in medical/science research. Pathology would be one possibility.

[–] Soyweiser@awful.systems 2 points 1 hour ago

I was already assuming the PC word list would spread to other subjects.

[–] jonhendry@iosdev.space 6 points 5 hours ago

@Soyweiser

A quick PubMed search finds such NIH-supported research as:

"In 2005 the genome of the 1918 influenza virus was completely determined by sequencing fragments of viral RNA preserved in autopsy tissues of 1918 victims”

Insights on influenza pathogenesis from the grave. 2011, Virus Research

"death of the child victim”

Characteristics, Classification, and Prevention of Child Maltreatment Fatalities. 2017 Military Medicine

Etc

[–] jonhendry@iosdev.space 11 points 22 hours ago

@sc_griffith

"Gender Neural”

That typo is probably going to screw a lot of Neuroscience grants just because it'll match on some dumb regex.

Also, apparently Hispanic and Latino people don't exist?

[–] self@awful.systems 10 points 23 hours ago (2 children)

courtesy of 404media: fuck almighty it’s all my nightmares all at once including the one where an amalgamation of the most aggressively mediocre VPs I’ve ever worked for replaces everything with AI and nobody stops them because tech is fucked and the horrors have already been normalized

[–] JFranek@awful.systems 9 points 13 hours ago

“Both what I’ve seen, and what the administration sees, is you all are one of the most respected technology groups in the federal government,” Shedd told TTS workers. “You guys have been doing this far longer than I've been even aware that your group exists.”

(emphasis mine)

Well, maybe start acting like it, champ.

[–] Soyweiser@awful.systems 9 points 19 hours ago (2 children)

Minor note, but Musk wears that jacket everywhere, even at a suit-and-tie dinner with Trump. I don't get how Trump stands him (assuming I'm right that he wants to be seen as a certain classy, high-society type). Looking at the picture, he might be having second thoughts all the time.

[–] jonhendry@iosdev.space 4 points 3 hours ago

@Soyweiser

Is that a Members Only jacket?

[–] gerikson@awful.systems 9 points 9 hours ago (1 children)

I bet that jacket is some super-nerdy thing that William Gibson once mentioned in a book

[–] cstross@wandering.shop 9 points 9 hours ago

@gerikson He probably bought it BECAUSE Gibson wrote about it in a novel and he thinks it makes him look cool and special.

[–] skillissuer@discuss.tchncs.de 14 points 1 day ago (3 children)

at long last, we have found a genai use case

result: decisive chinese cultural victory

tumblr screencap of 4chan screencap with text: deepseek is the only algorithm that does my fetish with gusto no questions asked. as a result i have installed a portrait of the honorable chairman xi in my goon cave

[–] sc_griffith@awful.systems 15 points 23 hours ago (1 children)

+6 culture generation for each person gooning under a portrait of xi jinping. china's borders will expand quickly

[–] froztbyte@awful.systems 10 points 22 hours ago

ah yes, content from the well-known community mod Cursid Meier

[–] Soyweiser@awful.systems 8 points 1 day ago

He who controls the goons controls the universe. No wait that is only in EVE online.
