this post was submitted on 15 Apr 2024
418 points (93.2% liked)

Solarpunk

5413 readers
20 users here now

The space to discuss Solarpunk itself and Solarpunk related stuff that doesn't fit elsewhere.


founded 2 years ago

I found that idea interesting. Will we consider it the norm in the future to have a "firewall" layer between news and ourselves?

I once wrote a short story where the protagonist was receiving news of the death of a friend, but it was intercepted by his AI assistant, which said: "When you have time, there is emotional news that does not require urgent action, but that you will need to digest." I feel it could become the norm.

EDIT: For context, Karpathy is a very famous deep learning researcher who just came back from a two-week break from the internet. I don't think he is talking about politics there, but it applies quite a bit.

EDIT2: I find it interesting that many reactions here are (IMO) missing the point. This is not about shielding yourself from information you may be uncomfortable with, but about tweets specifically designed to elicit reactions, which are becoming a plague on Twitter due to its new incentives. It is about distinguishing between presenting news in a neutral way and presenting it as "an incredibly atrocious crime done to CHILDREN, and you are a monster for not caring!" The second one does feel a lot like an exploit of emotional backdoors, in my opinion.

[–] Alexstarfire@lemmy.world 52 points 6 months ago (7 children)

Yea, no thanks. I don't want things filtered based on what someone else thinks I should see.

[–] NounsAndWords@lemmy.world 6 points 6 months ago (2 children)

What if it's based on what you think you should see?

[–] Worx@lemmynsfw.com 11 points 6 months ago (3 children)

Either it's you deciding as you see it (i.e. there is no filter), or it's past you who's deciding, in which case it's a different person. I've grown mentally and emotionally as I've got older, and I certainly don't want me-from-10-years-ago to be in control of what me-right-now is even allowed to see.

[–] neuracnu@lemmy.blahaj.zone 3 points 6 months ago

Just like diet, some people prefer balancing food types and practicing moderation, and others overindulge on what makes them feel good in the moment.

Having food options tightly controlled would restrict personal liberty, but doing nothing and letting people choose will lead to bad outcomes.

The solution is to educate people on what kinds of choices are healthy and what are not, financially subsidize the healthy options so they are within reach to all, and only use law to restrict things that are explicitly harmful.

Mapping that back to news and media, I’d like to see public education promoting the value of a balanced media and news diet. Put more money into non-politically-aligned news organizations. Look closely at news orgs that knowingly peddle falsehoods and either bring libel charges against them or create new laws that address the public harm done by maliciously spreading misinformation.

But I’m no lawyer, so I don’t know how to do that last part without creating some form of tyranny.

[–] halfway_neko@lemmy.dbzer0.com 6 points 6 months ago (2 children)

isn't that what the upvote/downvote buttons are for? although to be fair, i'd much rather the people of lemmy decide which things are good and interesting than some "algorithm"

[–] fine_sandy_bottom@discuss.tchncs.de 4 points 6 months ago (2 children)

There's a real risk to this belief.

There are elements of lemmy who use votes to manipulate which ideas appear popular, with the intention of manipulating discourse rather than having open discussions.

[–] cro_magnon_gilf@sopuli.xyz 42 points 6 months ago (1 children)

That's why I stick with platforms where hardline communist teenagers can curate what I'm exposed to.

[–] keepthepace 7 points 6 months ago

That's the only way.

[–] GrymEdm@lemmy.world 29 points 6 months ago* (last edited 6 months ago) (2 children)

Without wanting to be too aggressive, with only that quote to go on it sounds like that person wants to live in a safe zone where they're never challenged, angered, made afraid, or have to reconsider their world view. That's the very definition of an echo chamber. I don't think you're meant to live life experiencing only "approved" moments, even if you're the one in charge of approving them. Frankly I don't know how that would be possible without an insane amount of external control. You'd have to have someone/something else as a "wall" of sorts controlling your every experience or else how would things get reliably filtered?

I'd much prefer to teach people how to be resilient so they don't have to be afraid of being exposed to the "wrong" ideas. I'd recommend things like learning what emotions mean and how to deal with them, coping/processing bad moments, introspection, how to get help, and how to check new ideas against your own ethics. E.g. if you read something and it makes you angry, what idea/experience is the anger telling you to protect yourself from and how does it match your morality? How do you express that anger in a reasonable and productive way? If it's serious who do you call? And so on.

[–] OKRainbowKid@feddit.de 6 points 6 months ago (2 children)

I see where you're coming from, but if you look up Karpathy, you'll probably come to a different conclusion.

[–] keepthepace 5 points 6 months ago (2 children)

I think you are getting it wrong. I added a small edit for context. It is more about emotional distraction. I kinda feel like him: I want to remain informed, but please let me prepare a bit before telling me about civilians cut to pieces in a conflict, sandwiched between a funny cat video and machine learning news.

For the same reason we filter porn or gore images out of our feeds, highly emotional news should be filterable.

[–] GrymEdm@lemmy.world 3 points 6 months ago* (last edited 6 months ago) (1 children)

I don't think there's anything wrong with taking a break from social media or news. There are days I don't visit sites like Lemmy or when I do I only click non-news links because I'm not in the mood or already having a bad day. That's different than filtering (as per Karpathy's example) Tweets so that when you do engage it's consistently a very curated, inoffensive, "safe" experience. Again, I only have the one post to go off of, but he specifically talks about wishing to avoid Tweets that "elicit emotions" or "nudge views" and compares those provocative messages to malware. As far as your point regarding blatantly sensationalist news, when I recognize it's that kind of story I just stop reading/watching and that's that.

I WANT to have my emotions elicited because I seek to be educated and don't want to be complacent about things that should make me react. "Don't know, don't care" is how people go unrepresented or abused - e.g. almost no one reads about what Boko Haram is doing in Nigeria (thus it's already "filtered out" by media), and so very little has been done in the 22 years they've been affecting millions of lives. I WANT to have my "views nudged" because I'm regularly checking my worldview to make sure it stays centered around my core ethics, and being challenged has prompted me to change bad stances before. Being exposed to objectionable content before and reassessing is also how I've learned to spot BS attempts to manipulate. It doesn't matter how many times MAGA Tweets tell me that God is upset at drag queens and only Donald Trump can save the world because now I recognize ragebait when I see it. Having dealt with it before, no amount of exposure is going to make me believe their trash and knowing what is being said is useful for exposing and opposing harmful governmental policies/bad candidates (sometimes even helping deprogram others).

[–] dejected_warp_core@lemmy.world 19 points 6 months ago (4 children)

The real question then becomes: what would you trust to filter comments and information for you?

In the past, it was newspaper editors, TV news teams, journalists, and so on. Assuming we can't have a return to form on that front, would it be down to some AI?

[–] keepthepace 8 points 6 months ago (2 children)

Why do people, especially here in the fediverse, immediately assume that the only way to do it is to give power of censorship to a third party?

Just have an optional, automatic, user-parameterized auto-tagger, and set the parameters yourself for what you want to see.

Have a list of things that should receive trigger warnings. Group things by anger-inducing factors.

I'd love to have a way to filter things by actionable items: things I can get angry about but have little power to change don't need more than a monthly update.
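A minimal sketch of what such a user-parameterized auto-tagger could look like, assuming a naive keyword-based tagger; the tag names, keyword lists, and routing rules are all illustrative, not an existing tool:

```python
from dataclasses import dataclass, field

# Hypothetical tag vocabulary; a real tagger would use a classifier, not keywords.
TAG_KEYWORDS = {
    "graphic-violence": {"massacre", "dismembered", "gore"},
    "outrage-bait": {"you won't believe", "monster", "atrocious"},
}

@dataclass
class FeedFilter:
    # The user, not a third party, decides which tags get hidden
    # and which get deferred to an occasional digest.
    hidden_tags: set = field(default_factory=set)
    digest_tags: set = field(default_factory=set)

    def tag(self, post: str) -> set:
        """Auto-tag a post by keyword matching."""
        text = post.lower()
        return {t for t, kws in TAG_KEYWORDS.items()
                if any(kw in text for kw in kws)}

    def route(self, post: str) -> str:
        """Decide where a post goes based on the user's own parameters."""
        tags = self.tag(post)
        if tags & self.hidden_tags:
            return "hidden"
        if tags & self.digest_tags:
            return "monthly-digest"
        return "feed"

f = FeedFilter(digest_tags={"outrage-bait"})
print(f.route("New paper on machine learning"))         # -> feed
print(f.route("This MONSTER did something atrocious"))  # -> monthly-digest
```

The point of the design is that the filter is client-side and fully parameterized by its user: nothing is censored at the source, only routed differently on arrival.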

[–] SatansMaggotyCumFart@lemmy.world 8 points 6 months ago (1 children)

My mom, she always wants the best for me.

[–] dejected_warp_core@lemmy.world 5 points 6 months ago

Easily better than all the other options.

[–] MonkderDritte@feddit.de 19 points 6 months ago* (last edited 6 months ago) (3 children)

Our mind is built on that "malware". I think it's more accurate to compare brain + knowledge to our immune system: the more samples you have, the better you are armed against mal-information.

[–] EntirelyUnlovable@lemmy.world 3 points 6 months ago

I was thinking the same, you need to be exposed to some bullshit every now and then to give contrast and context to what you believe to be true

[–] FooBarrington@lemmy.world 3 points 6 months ago (2 children)

But that leaves out the psychological effects of long-term exposure to ideas. If you know for a fact that the earth is round, and for the next 50 years all the media you consume keeps telling you that the earth is flat, you will at some point start believing that (or at least become unsure).

Every piece of information you receive has some tiny effect on you.

[–] keepthepace 3 points 6 months ago

This sounds like the theories that were prevalent before germ theory, when surgeons and obstetricians would argue that washing their hands was a disservice to the bodies they worked on.

Immune systems still get sick and can be overwhelmed. There is a mental hygiene that needs to exist.

[–] Lemvi@lemmy.sdf.org 13 points 6 months ago* (last edited 6 months ago) (4 children)

I think the right approach would be to learn to deal with any kind of information, rather than to censor anything we might not like hearing.

[–] PoliticalAgitator@lemmy.world 12 points 6 months ago (1 children)
[–] keepthepace 6 points 6 months ago* (last edited 6 months ago)

I really think that, just as the 20th century saw the rise of basic hygiene practices, we are putting mental hygiene practices in place in the 21st.

[–] Whorehoarder@lemmynsfw.com 12 points 6 months ago (1 children)

Reminds me of Snow Crash by Nealyboi

[–] perestroika 9 points 6 months ago* (last edited 6 months ago)

I think most people already have this firewall installed, and it's working too well - they're absorbing minimal information that contradicts their self-image or world view. :) Scammers just know how to bypass the firewall. :)

[–] nutt_goblin@lemmy.world 9 points 6 months ago (1 children)

Sounds like we're reinventing forum moderation and media literacy from first principles here.

[–] fine_sandy_bottom@discuss.tchncs.de 9 points 6 months ago (1 children)

Not really. An executable controlled by an attacker could likely "own" you. A toot, tweet, or comment cannot; it's just an idea or thought that you can accept or reject.

We already distance ourselves from sources of always bad ideas. For example, we're all here instead of on truth social.

[–] trashgirlfriend@lemmy.world 5 points 6 months ago

Jokes on you, all of my posts are infohazards that make you breathe manually when you read them.

[–] YoFrodo@lemmy.world 8 points 6 months ago

Reading, watching, and listening to anything is like this. You accept communications into your brain and sort them out there. It's why people censor things: to shield others and/or to prevent the spread of certain ideas/concepts/information.

Misinformation, lies, scams, etc. function entirely on exploiting it.

[–] rtxn@lemmy.world 8 points 6 months ago* (last edited 6 months ago) (11 children)

Nah man, curl that shit into my bash and let me deal with it

[–] bloodfart@lemmy.ml 8 points 6 months ago

We already have a firewall layer between outside information and ourselves, it’s called the ego, superego, our morals, ethics and comprehension of our membership in groups, our existing views and values. The sum of our experiences up till now!

Lay off the Stephenson and Gibson. Try some Tolstoy or Steinbeck.

[–] xxd@discuss.tchncs.de 5 points 6 months ago (2 children)

Leaving aside the dystopian echo chamber this could result in, you could argue that it would help a lot with fake news. Fake news is easier to spread and more present than ever. And for every person there is probably that one piece of news that is just believable enough not to question. And then the next just-believable piece of news. And another. I believe no one is immune to being influenced by fake stories, maybe even radicalized if they are targeted just right. A firewall just filtering out everything non-factual would already prevent so much societal damage, I think.

[–] Belastend@lemmy.world 5 points 6 months ago (1 children)

Hüman brain just liek PC, me so smort.

[–] Gradually_Adjusting@lemmy.world 7 points 6 months ago (2 children)

It's definitely an angle worth considering when we talk about how the weakest link in any security system is its human users. We're not just "not immune" to propaganda, we're ideological petri dishes filled with second-hand agar agar.

[–] Fizz@lemmy.nz 5 points 6 months ago (1 children)

We already have a firewall: it's our thoughts. Information can nudge us, but it's fighting an uphill battle against everything we already know and believe.

[–] theneverfox@pawb.social 5 points 6 months ago

I remember watching a video from a psychiatrist with Eastern monk training. He explained why yogis spend decades meditating in remote caves: he said it was to control their exposure to information and stimuli.

Ideas are like seeds; once they take root, they grow. You can weed out unwanted ones, but it takes time and mental energy. It pulls at your attention and keeps you from functioning at your best.

The concept really spoke to me. It's easier to consciously control your environment than it is to consciously control your thoughts and emotions.

[–] x_cell 5 points 6 months ago

In a way, the job of a teacher or journalist is to filter useful and/or relevant information for interested parties.

[–] Anticorp@lemmy.world 4 points 6 months ago (1 children)

You are responsible for what you do with the information you process. You're not supposed to just believe everything you read, or let it affect you. We don't need some government or organization deciding what can be shown online. Read history and see what follows mass censorship.

[–] keepthepace 6 points 6 months ago (3 children)

I am bewildered that so many people construe this as suggesting that a government or a company should decide what to show you. Obviously, any kind of firewall/filter ought to be optional and user-controlled!

[–] Kolanaki@yiffit.net 4 points 6 months ago

I've thought about this since seeing Ghost in the Shell as a kid. If direct neural interfaces become commonplace, the threat of hacking expands beyond simply stealing financial information or blackmail material: attackers may be able to control your entire body!

[–] EntirelyUnlovable@lemmy.world 3 points 6 months ago* (last edited 6 months ago)

I wonder if a more apt comparison is that allowing raw comments to affect you strongly is like running a random program as root. To a certain extent you have to let this kind of harmful content in.

P.s. the short story sounds cool - is it available to read anywhere?

[–] DaseinPickle@leminal.space 3 points 6 months ago (1 children)

I mean, this is just called censorship. We censor things for kids and all kinds of people in our lives all the time. We censor things for ourselves when we don't feel like reading the news or opening a text from a specific person. This is not some novel concept.

[–] keepthepace 4 points 6 months ago (1 children)

Not really. This is user-controlled filtering. Censorship is done to push a specific worldview onto its victims. Filtering is something we do all the time, for spam for instance.

[–] GrymEdm@lemmy.world 3 points 6 months ago* (last edited 6 months ago)

But the post is explicitly about Tweets that challenge emotions and views and how that's harmful. It's one thing to want to see fewer suspicious offers from Nigerian princes and horny MILFS in my area. It's another to tell an AI that you don't want to see events or conversations that might be upsetting or make you think about ethics, politics, etc.

P.S. I'm replying to you a lot today, just want to say I'm not trying to be abusive or follow you around. You keep making points on this page that I want to engage with, and hopefully it's not coming across as persecution.

[–] Sagar@sopuli.xyz 3 points 6 months ago

Yes, lemmy too is that. We need to meet people and then form groups online. I had devised a solution for exchanging public keys in person and verifying each content thereafter with that key.
