PeterBronez

joined 2 years ago
[–] PeterBronez@hachyderm.io 1 point 3 months ago

@kent_eh exactly.

The alternative is “if you want your content to be private, share it privately.”

If you transmit your content to anyone who sends you a GET request, you lose control of that content. The recipient has the bits.

It would be nice to extend the core technology to better reflect your intent. Perhaps embedding license metadata in the images, the way LICENSE.txt travels with source code. That’s still quite weak, as we saw with Do Not Track.

[–] PeterBronez@hachyderm.io 2 points 3 months ago (2 children)

@along_the_road what’s the alternative scenario here?

You could push to remove some public information from common crawl. How do you identify what public data is _unintentionally_ public?

Assume we solve that problem. Now the open datasets, and the models developed on them, are weaker. They’re specifically weaker at identifying children as things that exist in the world. Do we want that? What if it reduces the performance of cars’ emergency braking systems? CSAM filters? Family photo organization?

[–] PeterBronez@hachyderm.io 7 points 3 months ago (4 children)

@along_the_road

“These were mostly family photos uploaded to personal and parenting blogs […] as well as stills from YouTube videos”

So… people posted photos of their kids on public websites, common crawl scraped them, LAION-5B cleaned it up for training, and now there are models. This doesn’t seem evil to me… digital commons working as intended.

If anyone is surprised, the fault lies with the UX around “private URL” sharing, not with the devs using Common Crawl.

#commoncrawl #AI #laiondatabase

[–] PeterBronez@hachyderm.io 1 point 1 year ago (1 children)

@abaci @mojo 100% worth it for me.

I switched from Google to #DDG to #Kagi. Even if the results were exactly the same, Kagi is FAST. The whole experience is very snappy.

Beyond that:

- All your DDG bangs work, and you can add custom bangs

- They have some neat AI summarization features

- You can manually boost/penalize/block domains

- Lenses focus your search on particular kinds of sites

[–] PeterBronez@hachyderm.io 3 points 1 year ago (1 children)

@Nougat @trashhalo @technology this is literally why short-term disability insurance exists.

[–] PeterBronez@hachyderm.io 9 points 1 year ago

@esaru @bmaxv @technology concur that this reduces privacy for users of Jitsi’s hosted service. It also has some concrete benefits for Jitsi: they get to outsource account validation and security. Perhaps they were struggling to contain abuse.

[–] PeterBronez@hachyderm.io 1 point 1 year ago

@Unsustainable @bananahammock @technology

A “Matrix bridge” is a computer program that connects to an arbitrary service and presents it as a Matrix service. You can then connect to that Matrix service with any Matrix client.

For example, this code connects LinkedIn messages to Matrix: https://github.com/beeper/linkedin

Beeper runs Matrix bridges for you as a service. If you don’t want to use that service, you can self-host the bridges.
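The pattern is simple enough to sketch in a few lines. This toy is purely illustrative (the class and method names are hypothetical, not the real bridge API): the bridge creates a Matrix “ghost” user for each remote contact and relays their messages into a Matrix room.

```python
class MatrixRoom:
    """Stands in for a Matrix room the bridge posts into."""
    def __init__(self):
        self.timeline = []

    def send(self, sender: str, body: str):
        self.timeline.append((sender, body))

class Bridge:
    """Maps remote-service users onto Matrix 'ghost' users and relays events."""
    def __init__(self, room: MatrixRoom, service_name: str):
        self.room = room
        self.service = service_name

    def ghost_id(self, remote_user: str) -> str:
        # Deterministic ghost user ID for each remote contact
        # (bridge.example.org is a placeholder homeserver).
        return f"@{self.service}_{remote_user}:bridge.example.org"

    def on_remote_message(self, remote_user: str, text: str):
        # Remote service -> Matrix direction.
        self.room.send(self.ghost_id(remote_user), text)

room = MatrixRoom()
bridge = Bridge(room, "linkedin")
bridge.on_remote_message("jane.doe", "Hi from LinkedIn!")
print(room.timeline[0])
# ('@linkedin_jane.doe:bridge.example.org', 'Hi from LinkedIn!')
```

A real bridge also relays the other direction (Matrix → remote) and handles auth, but the ghost-user mapping above is the core idea.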

[–] PeterBronez@hachyderm.io 1 point 1 year ago (1 children)

@Unsustainable @bananahammock @technology Matrix is a protocol for real-time communication. Several companies build products using this protocol, including Element, Beeper, and Rocket.Chat.

This is similar to how ActivityPub is a protocol for federated social media. Many projects are built using ActivityPub, including Mastodon, PixelFed, and Lemmy.

https://en.wikipedia.org/wiki/Matrix_(protocol)

[–] PeterBronez@hachyderm.io 3 points 1 year ago (1 children)

@donuts

For example, image search has been contentious for very similar reasons.

  1. You post a picture online for people to see, and host some ads to make money when people look at it.
  2. Then Google starts showing the picture in image search results.
  3. People view the image on Google and never visit your site or click on your ads. Worst case, Google hotlinks it and you incur increased hosting costs with zero extra ad revenue.

@throws_lemy @SSUPII @technology

[–] PeterBronez@hachyderm.io 2 points 1 year ago

@donuts

I certainly think that a Generative AI model is a more significant harm to the artist, because it impacts future, novel work in addition to already-published work.

However in both cases the key issue is a lack of clear & enforceable licensing on the published image. We retreat to asking “is this fair use?” and watching for new Library of Congress guidance. We should do better.

@throws_lemy @SSUPII @technology

[–] PeterBronez@hachyderm.io 1 point 1 year ago (3 children)

@donuts would you please share your thinking?

I certainly agree that you can see the current wave of Generative AI development as “scraping and stealing people’s art.” But it’s not clear to me why crawling the web and publishing the work as a model is more problematic than publishing crawl results through a search engine.

@throws_lemy @SSUPII @technology

[–] PeterBronez@hachyderm.io 3 points 1 year ago

@SSUPII @DigitalAudio the most useful resource I’ve come across is this Philosophy Tube video: https://youtu.be/AITRzvm0Xtg
