this post was submitted on 24 Jun 2024
66 points (100.0% liked)
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
you are viewing a single comment's thread
It's not so much that the doomers are sure AGI will lead to human extinction (or worse). The point is that even if the chances of it are extremely slim, the consequences could be worse than we're capable of imagining. The question is: do we really want to take that chance?
It's kind of like the Trinity nuclear test. Scientists were almost 100% confident that it wouldn't start a chain reaction that set the entire atmosphere on fire, but when we're talking about the future of all humanity, I don't blame people for arguing that "almost 100% certainty" is not good enough.
Why, when we look at the stars, do we not see any sign of life anywhere else? Has life not emerged yet, or has it wiped itself out? With what? Nukes? AI? Synthetic viruses made with AI? Who knows.
Personally, I think stopping AI research is not an option; it's just not going to happen. The asteroid is already hurtling towards Earth, and most people don't seem to feel any urgency about it. Do we not need to worry yet just because the time of impact is 30 years from now?
EDIT: Alright, well, this community was a mistake.
dude, we need to survive climate change first. and i mean as a species. first things first.
Some idiot in another forum opined that LLMs haven't solved climate change "yet". Sure, bud.
But they have worked out how to make it go faster! Now we just need to run it in reverse!