this post was submitted on 30 Jun 2024
13 points (71.0% liked)

Showerthoughts


Think about it: the more you learned about your brain, the more complex it would become. Also, I'm not sure we have enough capacity to reflect on ourselves by processing everything. Think about the massive complexity of every connection. Could someone actually process all of that, or are we limited by our own brains?

top 14 comments
[–] Tattorack@lemmy.world 7 points 4 months ago (1 children)

No. My genius is so vast it boggles my own mind. Sometimes it scares me.

[–] possiblylinux127@lemmy.zip 1 points 4 months ago

But from a quantum perspective you can't remember your own brain.

[–] breadsmasher@lemmy.world 6 points 4 months ago (2 children)

Why do you think learning about your brain causes it to become more complex?

[–] toxoplasma0gondii@feddit.de 3 points 4 months ago

I think the logic here is supposed to be something like:

more knowledge = more complex brain

as you need to "store" the knowledge somewhere.

I'm not saying that's how it works, but I can kind of see where OP is coming from.

[–] ivanafterall@lemmy.world 2 points 4 months ago

Yeah, I'm pretty sure learning more about myself would make me simpler by definition.

[–] DarkDarkHouse@lemmy.sdf.org 5 points 4 months ago* (last edited 4 months ago)

Depends what you mean. If I’m understanding you, then no, you’d be dealing with some kind of metadata recursion problem. On the other hand, on my hard drive I have a file detailing the schematic of the drive.
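
A minimal sketch of that hard-drive analogy (the model name and sizes are made up): a description of a thing can be far smaller than the thing itself.

```python
# A hypothetical 2 TB drive and the tiny "schematic" that describes it.
drive_capacity_bytes = 2 * 10**12

schematic = {
    "model": "ExampleDrive 2000",             # made-up model name
    "capacity_bytes": drive_capacity_bytes,
    "sector_size": 512,
    "sectors": drive_capacity_bytes // 512,
}

# The description is a few hundred bytes; the described drive is 2 TB.
print(len(str(schematic)), "bytes of schematic")
print(drive_capacity_bytes, "bytes of drive")
```

It only breaks down when the file has to contain the drive's full contents, which is the recursion problem above.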

[–] benderbeerman@lemmy.world 2 points 4 months ago* (last edited 4 months ago)

He who truly knows, knows that he knows not

Or

The more you know, the more you realize you don't know.

But also, understanding that about yourself doesn't take more knowledge or intelligence than you currently have. As another user (DarkDarkHouse) posted, you can understand how the brain works and still not have all the knowledge in the world (a hard drive with a map/directory of the hard drive on it).

[–] BackOnMyBS@lemmy.autism.place 2 points 4 months ago (1 children)

If learning one unit meant +1 brain complexity, then we would never catch up; it would be the infinite hotel. Example: learning one unit of brain complexity adds one unit of brain complexity.

If learning meant < +1 brain complexity, then the next limiters would be brain space and time. Example: learning one unit of brain complexity only slightly complicates an already existing complex.

If learning meant > +1 brain complexity, then the more you learn, the farther you get from understanding the whole thing. Example: learning one unit of brain complexity requires adding another unit of brain complexity plus its relationships with other complexes.
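
A toy Python sketch of those three cases (numbers purely illustrative): treat "complexity left to learn" as a counter where each unit learned adds c new units. The total only converges when c < 1.

```python
# Toy model: each unit of complexity learned adds c units of new complexity.
def units_to_finish(c, start=100.0, cap=10_000):
    remaining, steps = start, 0
    while remaining > 0 and steps < cap:
        remaining += c - 1  # learn one unit, gain c new units
        steps += 1
    return steps if remaining <= 0 else None  # None = never finishes

print(units_to_finish(0.5))  # c < 1: finishes after 200 steps
print(units_to_finish(1.0))  # c = 1: never catches up (the infinite hotel)
print(units_to_finish(1.5))  # c > 1: falls further behind forever
```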

[–] fubbernuckin@lemmy.world 4 points 4 months ago* (last edited 4 months ago)

Probably < 1, considering you can pick up on patterns, and learning a pattern generates one structure while allowing you to understand many. The learned pattern itself is likely stored within another pattern. You likely won't be able to know everything within the brain at once, but you might be able to find anything you want to know.

It'd be like memorizing every book in a library versus going through a library catalog to get what you need.
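
A tiny sketch of the catalog idea (titles and shelves are made up): store an index that says where to look instead of the contents themselves.

```python
# Memorizing every book vs. keeping a catalog that points to them.
library = {"Book A": "...full text...", "Book B": "...full text..."}

# The catalog stores only locations, not contents.
catalog = {"Book A": "shelf 3", "Book B": "shelf 7"}

def fetch(title):
    shelf = catalog[title]        # cheap lookup
    return shelf, library[title]  # pull the contents only when needed

print(fetch("Book A")[0])  # 'shelf 3'
```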

[–] muntedcrocodile@lemm.ee 2 points 4 months ago

1mm^3 of brain is a couple of terabytes in just connections; the largest AI models at present are nowhere even near that.
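
For scale, a rough back-of-envelope taking that figure at face value (the ~1.2 million mm^3 whole-brain volume is an approximation):

```python
# Back-of-envelope using the comment's figure of ~2 TB per mm^3.
tb_per_mm3 = 2                # "a couple terabytes" (comment's figure)
brain_volume_mm3 = 1_200_000  # whole human brain is roughly 1.2 liters

total_tb = tb_per_mm3 * brain_volume_mm3
print(f"{total_tb:,} TB")     # 2,400,000 TB, i.e. ~2.4 exabytes
```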

[–] meekah@lemmy.world 1 points 4 months ago* (last edited 4 months ago)

I don't think brains get more complex because of additional information.

Also, I'm not sure what you mean by having "enough capacity to reflect on ourselves by processing everything". Do you mean analyzing another brain, or analyzing your own? Do you mean processing every time a neuron fires, and which neurons get what kind of stimulus from it? Or do you mean looking at all the molecules/atoms/fundamental particles and considering their state and the reactions that follow?

I think we should eventually be capable of understanding the concepts that make our brains work. But I don't think we have the capacity to monitor everything another brain is doing and understand/interpret it in real time, let alone our own. A computer might be able to at some point, when we understand more about neurology.

[–] remotelove@lemmy.ca 1 points 4 months ago (1 children)

I have been to some interesting places in my own brain with the help of psychedelics. While I have experienced different levels of self-awareness, actually "filling up my brain" that way is likely not possible.

There is a part of the brain that psychedelics specifically affect that functions as a kind of traffic regulator. It typically only directs signals from one part of the brain to another. Psychedelics open these pathways up and allow information to flow in all kinds of directions. (Synesthesia, sensory confusion, is an example: feeling colors and seeing sounds.)

What the experience does for me is that I seem to gain more awareness into how my brain works. It's like I can stand back and watch how my brain processes things. My subconscious is pulled into full view and I can almost tinker with it, in a way. Some theories suggest that the colors and geometric patterns people tend to see are our actual brain operations being leaked into the visual cortex, and/or that the fractal patterns are the result of actual "data loops" caused by psychedelics allowing information to pass around freely. (Not my theories, btw.)

Now, you may read this and think: That dude is just tripping! (And you would be very much correct.) The thing is, every experience I have and anything that I feel is already in my brain. The data is already there, but how that data is processed is vastly different. Even if I am perceiving parts of my brain that I couldn't before, it's still just the same neurons that were always there.

So, what I am basically saying is that my self-awareness temporarily becomes self-aware. It's a shitty description, but it's the closest I could get to matching the situation you were asking about.

I still develop memories, almost as normal. Some memories stick and some fade. All that really happens is that a few neuron weights get shaken up and it all becomes a few new pathways with a similar number of neurons as before. (Neurogenesis and dendritic growth as a result of psychedelics are a different subject, and I wouldn't think they would be part of the recursion-type situation you are asking about.)

Memories become integrated with existing ones, basically. While vastly different in many ways, our current AI tech has a set number of "neurons" it can work with in any given model. You can train those same bunches of neurons with millions of different patterns, but the actual size of the neural network doesn't change, no matter how much data you cram into it. With slight changes to when neurons fire, you are using specific pathways of neurons to save data, not necessarily using the neurons themselves.
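
A minimal illustration of that fixed-size point, in plain NumPy with toy numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))          # one layer: 12 weights, fixed forever
x = rng.normal(size=3)
target = rng.normal(size=4)

for _ in range(100):                 # "cramming data in" = repeated training
    y = W @ x
    grad = np.outer(y - target, x)   # gradient of 0.5 * ||W @ x - target||^2
    W -= 0.1 * grad                  # the weight values change...

print(W.size)                        # ...but it is still exactly 12 weights
```

Training moves the values around; it never adds storage.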

[–] possiblylinux127@lemmy.zip 1 points 4 months ago (1 children)

Drugs can be very dangerous and lead to false perspectives. Anyway, I was referring purely to the amount of information in your brain. To understand it, you would need to double your brain.

[–] remotelove@lemmy.ca 1 points 4 months ago* (last edited 4 months ago)

False perspectives, sure. That is a possibility. False or not, that reasoning forms memories which must be stored somewhere. Those memories are what you might consider "new data" that comes from "new connections".

Understanding something is based on what you already know or just learned, which is memory. Logic and reasoning are partially instinctual and mostly memory. Decision making is likely not strictly memory itself, but it is built on memories.

I think where you might be getting something mixed up is understanding vs. memorizing and how the brain stores information. If I am understanding you correctly, you are thinking of data like a computer handles data: a zero or a one is a bit, eight bits make a byte, PC memory can hold X bytes until it gets full, and then game over.

Our brains simply don't work like a PC and we naturally store patterns, not specific raw data.

So, if a neuron has 3 inputs and 3 outputs, it has 6 connections to other neurons. With a computer, you need to lay out a few arrays and map each connection to each other. If we had a mess of neurons on a table in front of us to stick together, we would just need to remember to connect outputs to inputs and follow any other simple conditions we are given without strictly needing to memorize a 1:1 connection map.
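
A sketch of that difference (the wiring is hypothetical): listing every connection explicitly versus remembering the one rule that regenerates them.

```python
# Explicit map: memorize every connection one by one.
explicit = [(0, 1), (1, 2), (2, 3), (3, 4)]   # grows with the network

# Pattern: remember a single rule ("connect each output to the next input")
# and regenerate the map only when it's needed.
def connections(n_neurons):
    return [(i, i + 1) for i in range(n_neurons - 1)]

assert connections(5) == explicit  # same wiring, stored as one small rule
```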

Pattern matching is core functionality for our brains. So much so that we actively seek out patterns whenever we can. (Reference: apophenia and pareidolia)

For things like simple number memorization and even speech, our brains are able to do those things based on a series of different patterns that we stick together automatically. By doing so, we can use and reuse groups of the same neurons for seemingly different tasks. It's essentially built-in data compression, for lack of a better term.

If we were to ignore real constraints like the time it would take to map out all of the connections in our brain, we would naturally start to store patterns of neuron connections, patterns of how those neuron clumps interact and other key features of operation.

My reasoning is that we would start to get extraordinary "compression rates" for this data. If we memorized a complex pattern of 100 neurons, we could likely reuse that "data" a million times over for other major components of the brain.

By the very nature of how we store information, data loss will happen and it's unavoidable. We would need a new form of brain that was capable of storing blocks of data 1:1.

Also, your question is a paradox. If we were to say a brain would need to be double the size to store itself, then you would need a brain four times the size to store both of those brains. A brain storing itself is still a single brain, so this turns into a recursion nightmare fairly quickly.
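
A toy version of that regress, taking the "double the size" premise literally:

```python
# Each attempt to also store the system-that-stores doubles the requirement.
size = 1.0
for level in range(1, 6):
    size *= 2
    print(f"level {level}: {size:g}x the original brain")
# 2x, 4x, 8x, 16x, 32x ... the loop never closes.
```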

If we doubled the size of our brains, we could probably do some amazing things and our memories might become phenomenal. However, the way we learn and store information is basically the same and probably still wouldn't allow us to exactly store a million prime numbers in order.

The summary of all of this is that you aren't accounting for memory patterns and natural data loss, and there are very simple reasons that a brain doesn't "fill up".

Edit: Psychedelics are not inherently dangerous. Neither is THC, nor many other compounds. Mixing drugs and/or improper dosages is what is actually dangerous. Some legal drugs are probably riskier than illegal ones, actually. I would consider alcohol to be a substance that carries more risk, mainly because it is legal almost everywhere.