Shitgenstein1

joined 1 year ago
[–] Shitgenstein1@awful.systems 1 points 3 weeks ago

If anything, it's cringe rather than sneer-inducing. But fun anyway.

[–] Shitgenstein1@awful.systems 4 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

Most of the time, if I'm listening to anything, it's a playlist of Boards of Canada on shuffle - usually while I work.

Otherwise, my music taste has pretty much stalled these days as far as new music goes. Unwound got back together and I saw a couple of their shows, one in LA and another in Austin. The Blood Brothers also reunited and I have a ticket to see them in December. Melt-Banana just released a new album I haven't listened to yet.

My friend's band, Yuppie Killer, released their discography on streaming sites. They were a hardcore punk band of expats in South Korea, active from 2012 to 2017.

I play in a band, Constellation, and we're sending our first album off to be mastered this week. Kind of a throwback to '90s post-hardcore like Fugazi (yeah, we got cowbell) and Drive Like Jehu. Really excited about it.

[–] Shitgenstein1@awful.systems 2 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

The details of how the software was programmed should be legally irrelevant.

Why? Programmers should be legally liable for what they program.

[–] Shitgenstein1@awful.systems 4 points 1 month ago

"Surely Google will make a smart decision" is a hill surrounded by corpses.

[–] Shitgenstein1@awful.systems 4 points 1 month ago

Matt Yglesias Considered As The Nietzschean Superman

How is anyone able to read this article beyond this "Charlie Brown Had Hoes"-ass title?

[–] Shitgenstein1@awful.systems 18 points 1 month ago (2 children)

watching Austin Energy capping my A/C and deciding to become a rabid supporter of Yud's call to airstrike data centers

 

Really, it was the headlines of Google's AI Overview pulling Reddit shitposts that inspired the return. If Reddit is going to sell its data to Google, then, you know, maybe flood the zone with sludge?

[–] Shitgenstein1@awful.systems 27 points 2 months ago (11 children)

Christ, there's so much backstory here - scrolling through long descriptions of Gerard's views and just thinking "based, based, based, based."

[–] Shitgenstein1@awful.systems 7 points 2 months ago

it's an empty paper bag left in an empty parking garage

[–] Shitgenstein1@awful.systems 6 points 2 months ago* (last edited 2 months ago)

There’s currently a loud minority of EAs saying that EA should ostracize people if they associate with people who disagree with them.

people who disagree with them.

Oh, it's racists. The vague description is because it's racists. It's a woke cult now because some people don't want to associate with racists.

[–] Shitgenstein1@awful.systems 2 points 3 months ago

tap a well dry as ye may, I guess

[–] Shitgenstein1@awful.systems 15 points 3 months ago

Before we accidentally make an AI capable of posing existential risk to human being safety

It's cool to know that this isn't a real concern, and therefore to have a clear vantage on how all the downstream anxiety is really a piranha pool of grifts for venture bucks and ad clicks.

[–] Shitgenstein1@awful.systems 23 points 3 months ago (17 children)

A year and two and a half months since his Time magazine doomer article.

No shutdowns of large AI training - in fact, it has only expanded. No ceiling on compute power. No multinational agreements to regulate GPU clusters or to first-strike rogue datacenters.

Just another note in a panic that accomplished nothing.

 

Someone I was following on TikTok, whose takes on tech industry bullshit and specifically AI hype I respected, made a video arguing that Roko's basilisk is a serious concern. My apologies to those who have been in this same situation back when I was less sympathetic.


Eliezer Yudkowsky @ESYudkowsky
If you're not worried about the utter extinction of humanity, consider this scarier prospect: An AI reads the entire legal code -- which no human can know or obey -- and threatens to enforce it, via police reports and lawsuits, against anyone who doesn't comply with its orders.
Jan 3, 2024 · 7:29 PM UTC
