Architeuthis

joined 1 year ago
[–] Architeuthis@awful.systems 7 points 8 months ago (2 children)

I wonder how much of that family fortune has found its way into EA coffers by now.

[–] Architeuthis@awful.systems 11 points 8 months ago

Basically their only hope is that an AI under their control takes over the world.

They are pretty dominant in the LLM space and are already having their people fast tracked into positions of influence, while sinking tons of cash into normalizing their views and enforcing their terminology.

Even though they aren't trying to pander to religious Americans explicitly, their millennialism-with-the-serial-numbers-filed-off worldview will probably feel familiar and cozy to them.

[–] Architeuthis@awful.systems 9 points 8 months ago (4 children)

Wasn't he supposed to be a romantic asexual at some point?

[–] Architeuthis@awful.systems 33 points 8 months ago (11 children)

This was such a chore to read; it's basically quirk-washing TREACLES. It's like a major publication deciding to take an uncritical look at Scientology, focusing on the positive vibes and the camaraderie while smack in the middle of Operation Snow White, which in fact I bet happened a lot at the time.

The doomer scene may or may not be a delusional bubble—we’ll find out in a few years

Fuck off.

The doomers are aware that some of their beliefs sound weird, but mere weirdness, to a rationalist, is neither here nor there. MacAskill, the Oxford philosopher, encourages his followers to be “moral weirdos,” people who may be spurned by their contemporaries but vindicated by future historians. Many of the A.I. doomers I met described themselves, neutrally or positively, as “weirdos,” “nerds,” or “weird nerds.” Some of them, true to form, have tried to reduce their own weirdness to an equation. “You have a set amount of ‘weirdness points,’ ” a canonical post advises. “Spend them wisely.”

The weirdness is eugenics and the repugnant conclusion, and abusing Bayes' rule to sidestep context and take epistemological shortcuts to cuckoo conclusions while fortifying a bubble of accepted truths that are strangely amenable to letting rich people do whatever the hell they want.

Writing a 7,000–8,000-word insider exposé on TREACLES without mentioning eugenics even once should be all but impossible, yet here we are.

[–] Architeuthis@awful.systems 14 points 8 months ago

Yeah, a lot of these TESCREAL exposés seem to lean on the perceived quirkiness while completely failing to convey how deeply unserious their purported scientific and philosophical footing is, like virgin tzatziki with impossible gyros unserious.

[–] Architeuthis@awful.systems 5 points 8 months ago (20 children)

Something like a weekly general topic thread would work great for this I think.

[–] Architeuthis@awful.systems 10 points 8 months ago* (last edited 8 months ago)

Greece has no groups focused on improving fish welfare; however, Charity Entrepreneurship is possibly starting a charity focused on advocating for fish welfare improvements in Greece.

EAs advocating for fish welfare is about the only thing yet to be seen in this country, awesome.

If you want to influence Greece just use some of the fabled EA obsquatumatillions to buy out the left part of our two party system, they're currently in such shambles they'd barely notice, and it's not like they could do much worse with shrimp rights as a flagship issue.

[–] Architeuthis@awful.systems 7 points 9 months ago (2 children)

Looking through the reddit thread, the whole 'Peter Miller has great recall' thing feels off, like it's less an excuse for shoddy preparation and more a genuine grievance that he kept his superior memory ~~genes~~ skills purposefully hidden, so they couldn't send someone who had rolled equal or better brain stats to the debate.

This is in response to PM himself showing up in the thread to say rootclaim actually had his presentation 24 days in advance because the debate was delayed once:

This is true. I think the point is more that, even having seen all your own and your opponents information, a debater with greater recall / working memory can potentially "win" even if their argument is weaker.

Like, of course they lost, mere facts are nothing when the opponent has the IQ advantage, this is how the AI demons get us.

[–] Architeuthis@awful.systems 12 points 9 months ago (1 children)

Judge 1 says they, just, uh, decided some past disease outbreaks were lab leaks and never looked to see if that’s actually scientifically debated or just weird Rat Accepted Truths.

Yes, but see, an anthrax environmental containment breach almost five decades ago at a sprawling Soviet weapons facility with biosecurity protocols that consisted of a wink and a handshake totally strengthens our claim that in 2019 some Chinese intern accidentally shot up a GM bat virus and wandered off. Bayesian inference is unintuitive like that; you plebs wouldn't understand.

[–] Architeuthis@awful.systems 26 points 9 months ago (3 children)

birdsite stuff:

A rationalist organization offered a James Randi-style $100k prize to anyone who could defeat them in a structured longform debate and prove COVID had a natural origin, so a rando Slate Star Codex commenter took them up on it and absolutely destroyed them. You won't believe what happened next (they wrote a pissy blogpost claiming the handpicked judges had "errors in ... probabilistic inference" for not agreeing with their conclusion and grew even more confident in their incorrect opinion)

[–] Architeuthis@awful.systems 2 points 9 months ago (4 children)

The last Behind the Bastards episode is this article expanded. Robert Evans is always very listenable and the more detailed CES reporting is interesting, but if you're a member here you probably won't be adding anything new to your TREACLES lore.

I wish journalists referencing the basilisk would go a bit more in depth; it's so much dumber than it seems at a brief glance. Like, a lot of people immediately assume the alleged scary part is that we might already be living in the simulation, and thus be eligible for permanent residence in basilisk Hell should we commit the cardinal sin of shit-talking AI, but no; the reason you can go to AI hell is transhumanist cope.

As in, if your last hope for immortality is brain uploads, you're kind of cornered into believing your sense of self gets shared between the physical and the digital instance; otherwise, what's the point? EY appears to be in this boat: he's claimed something like there being no real difference between instances of You at different moments in time sharing a self and you sharing a self with a perfect digital copy, so yeah, it's obviously possible, unavoidable even.

As to how the basilisk will get your digital copy in the first place, eh, it'll just extrapolate it perfectly from whatever impression is left of you in the timeline by the time it comes into being, because as we all know, the S in ASI stands for Fucking Magical, Does Whatever It Wants. Remember, according to Yud, ASI can conjure up the entirety of modern physics just by seeing three frames of an apple falling.

[–] Architeuthis@awful.systems 1 point 10 months ago

Galton Ehrlich Buck

The concentrated smarm in this bullshit JAQ off piece gave me psychic damage.

Fun to see him using the "IQ is mostly genetic [because heritability]" line, which is exactly what the schizophrenia literature he takes issue with claims is a woefully inadequate descriptor if we're going to usefully evaluate what is actually happening.

The way they always try to motte-and-bailey eugenics gives me the shits. No, eugenics isn't screening embryos for terrible incurable conditions; it's the whole deal of gatekeeping society according to arbitrary genealogical norms, and the fact that they keep trying to rehabilitate the term instead of rebranding to something less awful is certainly food for thought.
