TinyTimmyTokyo

joined 1 year ago
[–] TinyTimmyTokyo@awful.systems 1 points 11 months ago* (last edited 11 months ago)

She seems to do this kind of thing a lot.

According to a comment, she apparently claimed on Facebook that, due to her post, "around 75% of people changed their minds based on the evidence!"

After someone questioned how she knew it was 75%:

Update: I changed the wording of the post to now state: Around 75% of people upvoted the post, which is a really good sign*

And the * at the bottom says: Did some napkin math guesstimates based on the vote count and karma. Wide error bars on the actual ratio. And of course this is not proof that everybody changed their mind. There's a lot of reasons to upvote the post or down vote it. However, I do think it's a good indicator.
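For reference, the only "napkin math" that gets you from a vote count and a karma score to a percentage is something like the sketch below. This is a guess at the kind of calculation she might mean, not anything she actually showed, and the numbers are invented:

```python
# Guessed reconstruction: if a post shows `votes` total votes and
# `karma` = upvotes - downvotes, then upvotes = (votes + karma) / 2.
def upvote_ratio(votes: int, karma: int) -> float:
    upvotes = (votes + karma) / 2
    return upvotes / votes

# Made-up example: 100 total votes with a score of 50 -> 75 upvotes -> 0.75
print(upvote_ratio(100, 50))  # 0.75
```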

She then goes on to talk about how she made the Facebook post private because she didn't think it should be reposted in places where it's not appropriate to lie and make things up.

Clown. Car.

[–] TinyTimmyTokyo@awful.systems 2 points 11 months ago

What a bunch of monochromatic, hyper-privileged, rich-kid grifters. It's like a nonstop frat party for rich nerds. The photographs and captions make it obvious:

The gang going for a hiking adventure with AI safety leaders. Alice/Chloe were surrounded by a mix of uplifting, ambitious entrepreneurs and a steady influx of top people in the AI safety space.

The gang doing pool yoga. Later, we did pool karaoke. Iguanas everywhere.

Alice and Kat meeting in “The Nest” in our jungle Airbnb.

Alice using her surfboard as a desk, co-working with Chloe’s boyfriend.

The gang celebrating… something. I don’t know what. We celebrated everything.

Alice and Chloe working in a hot tub. Hot tub meetings are a thing at Nonlinear. We try to have meetings in the most exciting places. Kat’s favorite: a cave waterfall.

Alice’s “desk” even comes with a beach doggo friend!

Working by the villa pool. Watch for monkeys!

Sunset dinner with friends… every day!

These are not serious people. Effective altruism in a nutshell.

[–] TinyTimmyTokyo@awful.systems 1 points 11 months ago

People who use the term "race realism" unironically are telling on themselves.

[–] TinyTimmyTokyo@awful.systems 1 points 11 months ago

Reading his timeline since the revelation is weird and creepy. It's full of SV investors robotically pledging their money (and fealty) to his future efforts. If anyone still needs evidence that SV is a hive mind of distorted and dangerous groupthink, this is it.

[–] TinyTimmyTokyo@awful.systems 0 points 11 months ago (2 children)

"Fucking probabilities, how do they work?"

[–] TinyTimmyTokyo@awful.systems 1 points 1 year ago (9 children)

The first comment and Yud's response.

[–] TinyTimmyTokyo@awful.systems 1 points 1 year ago* (last edited 1 year ago)

Roko's authoritative-toned "aktshually..." response to Annie's claims has me fuming. I don't know why. I mean I've known for years that this guy is a total boil on the ass of humanity. And yet he still manages to shock with the worst possible take on a topic -- even when the topic is sexual abuse of a child. If, like Roko, I were to play armchair psychiatrist, I'd diagnose him as a sociopath with psychopathic tendencies. But I'm not. So I won't.

[–] TinyTimmyTokyo@awful.systems 1 points 1 year ago (1 children)

My attention span is not what it used to be, and I couldn't force myself to get to the end of this. A summary or TLDR (on the part of the original author) would have been helpful.

What is it with rationalists and their inability to write with concision? Is there a gene for bloviation that also predisposes them to the cult? Or are they all just mimicking Yud's irritating style?

Is it wrong to hope they manage to realize one of these libertarian paradise fantasies? I'd really love to see how quickly it devolves into a Mad Max Thunderdome situation.

[–] TinyTimmyTokyo@awful.systems 0 points 1 year ago (1 children)

What's it like to be so good at PR?

[–] TinyTimmyTokyo@awful.systems 1 points 1 year ago (1 children)

Stephen Jay Gould's The Mismeasure of Man is always a good place to start.

[–] TinyTimmyTokyo@awful.systems 1 points 1 year ago* (last edited 1 year ago) (4 children)

This is good:

Take the sequence {1,2,3,4,x}. What should x be? Only someone who is clueless about induction would answer 5 as if it were the only answer (see Goodman’s problem in a philosophy textbook or ask your closest Fat Tony) [Note: We can also apply here Wittgenstein’s rule-following problem, which states that any of an infinite number of functions is compatible with any finite sequence. Source: Paul Boghossian]. Not only clueless, but obedient enough to want to think in a certain way.
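The point is easy to make concrete: infinitely many rules reproduce 1, 2, 3, 4 and then disagree about x. A toy sketch (my example, not Taleb's):

```python
# Two rules that both produce 1, 2, 3, 4 but disagree on the fifth term.
def naive(n):
    return n  # the "obedient" answer: x = 5

def alternative(n):
    # also hits 1, 2, 3, 4 exactly, but gives 29 at n = 5
    return n + (n - 1) * (n - 2) * (n - 3) * (n - 4)

print([naive(n) for n in range(1, 6)])        # [1, 2, 3, 4, 5]
print([alternative(n) for n in range(1, 6)])  # [1, 2, 3, 4, 29]
```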

Also this:

If, as psychologists show, MDs and academics tend to have a higher “IQ” that is slightly informative (higher, but on a noisy average), it is largely because to get into schools you need to score on a test similar to “IQ”. The mere presence of such a filter increases the visible mean and lowers the visible variance. Probability and statistics confuse fools.
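The filter claim is just truncation: select on a threshold and the selected group's mean goes up while its spread shrinks. A quick simulation with made-up numbers (mine, not Taleb's) shows the effect:

```python
# Filtering a noisy population on a test score raises the visible mean
# and shrinks the visible variance of the selected group.
import random
import statistics

random.seed(0)
population = [random.gauss(100, 15) for _ in range(100_000)]
admitted = [x for x in population if x >= 120]  # arbitrary cutoff

print(statistics.mean(population), statistics.pstdev(population))  # ~100, ~15
print(statistics.mean(admitted), statistics.pstdev(admitted))      # higher mean, smaller spread
```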

And:

If someone came up w/ a numerical “Well Being Quotient” WBQ or “Sleep Quotient”, SQ, trying to mimic temperature or a physical quantity, you’d find it absurd. But put enough academics w/ physics envy and race hatred on it and it will become an official measure.
