SolarSailer

joined 1 year ago
[–] SolarSailer@beehaw.org 5 points 1 year ago

I mean, you can already download models that have been fine-tuned to fix issues with hands... but still, it's great that they continue to improve the model!

[–] SolarSailer@beehaw.org 2 points 1 year ago (2 children)

I would recommend that you check out Golden Sun.

[–] SolarSailer@beehaw.org 1 points 1 year ago

I agree with the hesitancy on WD. They're also starting to automatically flag NAS drives older than 3 years with a warning. Plus, they shipped SMR drives as Red drives a few years back: https://youtu.be/cLGi8sPLkLY

[–] SolarSailer@beehaw.org 1 points 1 year ago

Agreed. A few states (4-5?) have been able to fully eliminate the biggest issues with civil asset forfeiture (including closing the loophole that allows their state agencies to work with the federal govt. to get around those restrictions).

Over half the states have started to do something about it. But as long as that loophole remains and as long as the federal govt. itself can continue to abuse civil asset forfeiture, we'll continue to see problems.

[–] SolarSailer@beehaw.org 1 points 1 year ago* (last edited 1 year ago) (1 children)

I was curious about the definition as well, so I looked up the published opinion. You can find it on the official website (https://www.supremecourt.gov/opinions/slipopinion/22) by looking for "Twitter, Inc. v. Taamneh", or here's a direct link: https://www.supremecourt.gov/opinions/22pdf/21-1496_d18f.pdf

Basically, it looks like most of the case came down to figuring out the definition of "aiding and abetting" and how it applied to Facebook, Twitter, and Google. It's worth reading, or at least skipping to the end where they summarize it.

When they analyzed the algorithm they found that:

As presented here, the algorithms appear agnostic as to the nature of the content, matching any content (including ISIS’ content) with any user who is more likely to view that content. The fact that these algorithms matched some ISIS content with some users thus does not convert defendants’ passive assistance into active abetting.

The only way I could see them being held liable for the algorithm is if a big tech company had tweaked it so that it specifically recommended the terrorist content more than it otherwise would have.

The code doesn't have a concept of right or wrong; it doesn't even understand the content it's recommending. It just sees that users watching this video also typically watch that other video, so it recommends it.
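To illustrate (a purely hypothetical sketch, not any platform's actual code), a bare-bones "users who watched X also watched Y" recommender is just co-occurrence counting. Nothing in it inspects what the content actually is:

```python
from collections import Counter, defaultdict

def build_cooccurrence(histories):
    """Count how often pairs of videos appear in the same user's history.

    The recommender only sees opaque IDs - it has no idea what the
    videos contain.
    """
    co = defaultdict(Counter)
    for videos in histories.values():
        for a in videos:
            for b in videos:
                if a != b:
                    co[a][b] += 1
    return co

def recommend(co, video, n=3):
    # "Users who watched this also watched..." - pure counting,
    # agnostic to whether the content is a cat video or propaganda.
    return [v for v, _ in co[video].most_common(n)]

# Made-up watch histories, just for illustration.
histories = {
    "u1": ["cats", "cooking", "news"],
    "u2": ["cats", "cooking"],
    "u3": ["cats", "news"],
}
co = build_cooccurrence(histories)
print(recommend(co, "cats"))  # -> ['cooking', 'news']
```

Swap any video ID in there for anything else and the math is identical, which is essentially the point the opinion makes about the algorithms being content-agnostic.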

if you took out algorithm and put in employee and it would be that, then slotting in algorithm should not be a defense.

Alright, let me try a hypothetical here. Let's say I hosted a public billboard in a town square and used some open source code to program a robot to automatically pick up fliers from a bin that anyone could submit to. People can tag the top of their flier with a specific color. The robot has an algorithm that reads the color and puts up the flier on the day of the week corresponding to that color.
If someone slipped some terrorist propaganda into the bin, who is at fault for the robot's actions?

Should the developer who published the open source code be held liable for the robot's actions?

Should the person who hosts the billboard be liable for the robot's actions?
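The robot's entire decision procedure in this hypothetical could be as simple as a lookup table (a sketch with made-up color-to-day assignments). It posts whatever matches, with no notion of what the flier says:

```python
# Hypothetical color-to-day schedule for the billboard robot.
COLOR_SCHEDULE = {
    "red": "Monday",
    "blue": "Wednesday",
    "green": "Friday",
}

def fliers_to_post(bin_contents, today):
    """Pick the fliers whose color tag maps to today.

    The robot reads only the color tag - it has no concept of the
    flier's message, benign or otherwise.
    """
    return [f for f in bin_contents
            if COLOR_SCHEDULE.get(f["color"]) == today]

# Made-up bin contents, including something slipped in.
bin_contents = [
    {"color": "red", "text": "Bake sale!"},
    {"color": "blue", "text": "Lost dog"},
    {"color": "red", "text": "whatever someone slipped in"},
]
print(fliers_to_post(bin_contents, "Monday"))
```

On Monday it posts both red fliers, the bake sale and the slipped-in one, because "red" is the only thing it ever evaluated.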

Edit: fixed a grammatical error and added suggestion to read summary.

[–] SolarSailer@beehaw.org 4 points 1 year ago (2 children)

Alright, I'm still new to all this federated stuff, but I heard there was a way that users could post across platforms including places like Mastodon.

So if I wanted to "like" or comment on that awesome video with this account, how would I do that?

[–] SolarSailer@beehaw.org 1 points 1 year ago (4 children)

Ah, I see what you're getting at.

Maybe that was critical to the Supreme Court case, but it wasn’t presented in the news that way (that I saw)

Yeah that's the problem with a lot of news organizations. They like to spin stories to support whatever agenda/narrative they want to push rather than what the case was actually about.

I would suggest this video by Steve Lehto: https://youtu.be/2EzX_RdpJlY
He's a lawyer who mostly comments on legal issues that end up in the news and his insight is invaluable. He talks about these 2 cases specifically in this video.

#2 was very specifically about whether you would be considered to be aiding and abetting terrorists in a terror attack if the algorithm pushed their content to others.

It sounds like there are a ton of other cases that have been submitted to the Supreme Court, so I'm sure one of them may address your concerns.

And frankly, I’m tired of Big Tech getting a pass on problems they have / create “because it’s too hard” to be responsible where no one else gets the “too hard” defense.

I get your frustration; I'm assuming most everyone here is here because we're fed up with what they've done with social media.
But in this case, a loss for big tech would have had even worse repercussions for smaller platforms like Lemmy.

[–] SolarSailer@beehaw.org 2 points 1 year ago (6 children)

That's fine, but let's dig into it a bit more.

Where do you draw the line between what's considered "terrorist content" and what merely promotes things that terrorists also promote?

And how do you implement a fix to the algorithm so that absolutely nothing that crosses that line gets through?

Just look at how well email filters work against junk mail and scams.

Now let's apply this to Lemmy and federated instances. If you're personally hosting an instance, of course you're going to do your best to keep it free of content like that. But let's say you're running some open source code with an algorithm that highlights posts aligned with a user's previously liked content.
If someone posts something that crosses the line, gets past your filters, and somehow gets highlighted to other users before you can remove it, the argument is that the person running that instance should be held directly responsible and criminally charged with aiding and abetting terrorism.
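Even a toy content filter (hypothetical blocklist words, purely illustrative) shows why "absolutely nothing gets through" is unachievable: exact matching is defeated by trivial obfuscation, which is the same cat-and-mouse game email spam filters have been losing for decades:

```python
# A toy blocklist filter - hypothetical keywords, just to illustrate.
BLOCKED = {"badword", "scam"}

def passes_filter(post: str) -> bool:
    """Return True if no blocklisted word appears in the post."""
    words = post.lower().split()
    return not any(w in BLOCKED for w in words)

print(passes_filter("totally normal post"))  # True
print(passes_filter("this is a scam"))       # False - caught
print(passes_filter("this is a s c a m"))    # True - trivially evaded
```

Making the matcher fuzzier to catch the evasion then starts flagging innocent posts instead, so the instance admin is stuck choosing between false negatives and false positives, never zero of both.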

 

An excellent decision. If it had gone the other way, we likely would have seen social media websites shut down entirely and comments disabled on YouTube. It also would have directly affected anyone in the U.S. who wanted to run an instance of Lemmy (or any federated service that users can post content to).

The rulings were in regards to Section 230, a law passed in 1996 aimed at protecting services that allow users to post their own content.

The Supreme Court tackled 2 different cases concerning this:

  1. Whether social media platforms can be held liable for what their users have said.
  2. Whether algorithms that recommend tailored content to individual users mean companies are knowingly aiding and abetting terrorists (when pro-terrorist content is recommended to other users).

[–] SolarSailer@beehaw.org 3 points 1 year ago

Not only that. It seems like the article completely ignores how batteries can be recycled and assumes that every new battery undergoes the same manufacturing process.

[–] SolarSailer@beehaw.org 3 points 1 year ago

This may be true in Europe, and in cities for those who both live and work in the city. But for the vast majority in the U.S., it's practically required to have a car if you want to work.

[–] SolarSailer@beehaw.org 2 points 1 year ago (1 children)

I understand what he's suggesting and I do agree that we need to fix up our town planning.

And that's why my point wasn't that he's wrong about his suggestions, just that, again, it's "much easier said than done."

For the foreseeable future, owning a car is the only reasonable way of getting around many parts of the U.S.

How long do you think it would take to fix up even half of the cities in the U.S.?

How can we fast track it and what are reasonable expectations since there will be pushback from people?

In a way we would need some sort of Haussmannization to occur and that will not go well in the U.S.

[–] SolarSailer@beehaw.org 6 points 1 year ago (3 children)

This is much easier said than done. Around large parts of the United States you can't reliably commute by public transit. For me personally, without a car, a one-way 40-mile trip to the nearest major city would take 5 hours. That's 2 different trains and 2 different buses.

Add that to the fact that the station closest to me only has a few trains a day and my options are very limited.

Even if we ignore the current train schedule and assume trains come every 5 minutes, it would still be a 2-hour trip that costs me $20 one way. I could then bike the rest of the way and avoid the last 2 buses.

There are rail passes I could get, but those would cost $477/month. It's cheaper to lease a Tesla at that point.

Owning a car is pretty much the only reasonable way of getting around for many parts of the U.S.

 

This specific case revolved around a 94-year-old woman in Minnesota who owed $15k in taxes. Her home was seized and sold for $40k, and the government kept everything from the auction.

It's hard to believe this was ever legal; this is a major win for common sense.

 

Have you ever wondered why officers like to ask someone they've pulled over how much cash they have?

One of the worst, most easily abused laws (as written) in the U.S. is civil asset forfeiture. A simple way to describe it is legalized robbery by law enforcement. It's one of the ways the law essentially treats the victim (the property owner) as guilty before a trial has even begun. Law enforcement doesn't even need to charge you with a crime to take your money.

In order to get their money back, the property owner has to sue the federal agency or police department. That means spending their own time and money fighting in court to prove their innocence (rather than law enforcement having to prove their guilt).

In most cases the property owner will spend more money fighting in court than they would get back by winning. In other cases, law enforcement will offer back only a small percentage of the money.

I get that these can be useful laws when actually taking down criminal organizations, but we really need to fix the laws so that law enforcement only gets the money/assets when the offending party is actually guilty of a crime.

As of 2021, only Maine, Nebraska, New Mexico, and North Carolina have completely overhauled their laws to require that prosecutors prove the owner's guilt. 36 states (and the District of Columbia) have taken steps to scale back these forfeiture laws; however, the vast majority of them leave a major loophole in which law enforcement can partner with federal departments (such as Justice and Treasury) and split the proceeds of a forfeiture. https://www.usatoday.com/story/news/nation/2021/08/19/states-work-scale-back-civil-forfeiture-laws-amid-federal-loophole/8181774002/

A little over a week ago, a jury rejected a trucker's claim to money that had been seized (even though no criminal charges had been brought against him). Law enforcement took $40k that he had saved up while he was on his way to buy a truck.

https://landline.media/texas-jury-rejects-truckers-claim-to-seized-money/

Additional Sources: https://ij.org/report/policing-for-profit-first-edition/part-i-policing-for-profit/

John Oliver also did a video on this 8 years ago: https://youtu.be/3kEpZWGgJks
