this post was submitted on 31 Jan 2024
Asklemmy


I've been watching Isaac Arthur episodes. In one, he proposes that O'Neill cylinders would be potential havens for microcultures. I tend to think of colony structures more as something created by a central authority.

He also brought up the question of motivations to colonize other star systems. This is where my centralist perspective pushes me toward the idea of an AGI-run government where redundancy is a critical aspect of everything. How do you get around the AI alignment problem? Redundancy: many systems running in parallel. How do you ensure the survival of sentient life? The same kind of redundancy.

The idea of colonies as havens for microcultures punches a big hole in my futurist fantasies. I hope there are a few people out here in Lemmy space who like to think about and discuss this, or who would like to start now.

j4k3@lemmy.world | 1 point | 9 months ago

But people's emotions and ideologies are so extreme because of their constant stress and struggles. When fractal attention allows an entity to address each individual's needs directly and reward their path when better choices are made, the only exceptions left are the mentally ill, and identifying those individuals enables direct treatment in a scientific sense, not some emotional human-to-human context. This would not be a situation of "the opposition is mentally off"; it would be "diagnostic analysis across multiple events shows poor fundamental logic skills and likely issue 'X'; refer notes to the individual's primary healthcare provider to confirm."

It is things like enabling encounters between compatible people in public to ground a person who is in need of companionship, or introducing sound ideas when a person is fixating on something unhealthy.

The real issue is cognitive dissonance in humans who are unable to resolve their inner conflict. This is something the current LLMs excel at identifying and compensating for, and changes made at this stage of human thinking are the most effective. Profiling an individual's Myers-Briggs personality spectrum and then analyzing how well their needs are met according to Maslow's hierarchy is the vast majority of what professionals do when sought out for mental health. These methods are already integrated into LLMs and will be far more capable with AGI. Introducing humans to these methods of self-analysis, or reminding them to use them, is the most effective solution, but those who lack the required cognitive depth and fundamental logic skills can still be addressed by AGI directly in a kind, empathetic, and safe manner.

The conflict and dystopia come from pseudo-sentience: humans are totally incapable of governing at large scale while meeting individual needs. We always neglect outliers, and the number of outliers is always larger than we believe. That would not be the case with AGI.