this post was submitted on 31 Jul 2024
34 points (100.0% liked)

Technology

So we'll all have our own familiar soon?

[–] istanbullu@lemmy.ml 27 points 3 months ago (2 children)

I hate Meta and never use their products, but I have to give them credit for their support of open-source ML: first PyTorch, then Llama.

[–] MalReynolds 27 points 3 months ago (1 children)

Ditto on the hate, but there's a technical yet important distinction here: they support open-weight ML. They do not release the training source code or datasets you'd need to actually make your own (granted, you'd need millions in video cards to do it, but still). "Open-source" gets thrown around a lot in AI, presumably as virtue signalling, but precious few walk the walk.

Never underestimate the value of getting hordes of unpaid workers to refine your product. (See also React, among others.)

[–] istanbullu@lemmy.ml 6 points 3 months ago (1 children)

I understand the distinction, but it's still waaay better than what ~~OpenAI~~ClosedAI is doing.

Also, people are really good at reverse engineering. Open-weight models can be fine-tuned or adapted; I trained a Llama 3 LoRA not that long ago.
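For readers unfamiliar with why open weights are enough to adapt a model: LoRA freezes the released weights and trains only a small low-rank update on top. A minimal numpy sketch of the idea (toy shapes chosen for illustration, not the real Llama 3 dimensions):

```python
import numpy as np

# LoRA in miniature: keep the pretrained weight W frozen and learn a
# low-rank correction B @ A instead of updating W itself.
d_out, d_in, r = 64, 64, 4           # r << d_in is the low-rank bottleneck
alpha = 8                            # LoRA scaling hyperparameter

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))       # frozen open-weight release
A = rng.standard_normal((r, d_in)) * 0.01    # trainable down-projection
B = np.zeros((d_out, r))                     # trainable up-projection, zero-init

def lora_forward(x):
    """y = W x + (alpha/r) * B (A x): base model plus low-rank update."""
    return W @ x + (alpha / r) * (B @ (A @ x))

# The point of the trick: far fewer trainable parameters than full fine-tuning.
full_params = W.size                 # 64 * 64 = 4096
lora_params = A.size + B.size        # 4*64 + 64*4 = 512
print(full_params, lora_params)
```

Because B starts at zero, the adapted model initially behaves exactly like the released one; training only ever touches A and B, which is why a consumer GPU is enough even when retraining W from scratch is not.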

[–] MalReynolds 1 points 3 months ago (1 children)

Agreed, and the chance of it backfiring on them is indeed pleasingly high. If the compute moat for initial training gets lower (e.g. trinary/binary models) or distributed training (Hivemind etc) takes off, or both, or something new, all bets are off.

[–] istanbullu@lemmy.ml 1 points 3 months ago

The compute moat for the initial training will never get lower. But as the foundation models get better, the need for from-scratch training will be less frequent.

[–] yogsototh@programming.dev 8 points 3 months ago

I think that unlike Google, Meta still has many pure engineers who need to contribute to open source to stay motivated, and who still have some power.

I feel, though I'm not sure, that Google has shifted more, and faster, toward the side of big soulless corps.

Generally speaking my experience is that even in these big soulless corps there are positive and passionate people. But quite often they do not have enough decision power to have a positive impact.

[–] sxan@midwest.social 24 points 3 months ago (1 children)

The best proof of advancements in the field of AI is Zuckerberg himself. He looks more and more like a real human every time I see a new picture of him.

[–] averyminya@beehaw.org 8 points 3 months ago (1 children)

They learned that hair makes them look human

[–] taanegl@beehaw.org 6 points 3 months ago (1 children)

They spent a lot of GPU time trying to get it just right.

[–] averyminya@beehaw.org 1 points 3 months ago

Cryptocurrency proof-of-work was actually entirely for every strand of hair on the Zuck

[–] MalReynolds 18 points 3 months ago

Please let local, open equivalents be available (see LocalAI for an example that's not far off) before this arrives. The sheer scale of data harvesting this will enable boggles the mind.

[–] t3rmit3@beehaw.org 12 points 3 months ago

I can write a perfect chatbot representation of myself in just 3 lines of Python, no AI needed:

while True:
    input()
    print('Screw Mark Zuckerberg')
[–] scrubbles@poptalk.scrubbles.tech 9 points 3 months ago (2 children)

Right, just like the Metaverse, Facebook is betting on something no one really asked for, when really we just want the old Facebook back.

[–] sabreW4K3@lazysoci.al 10 points 3 months ago (1 children)

I definitely don't want the old or any Facebook back.

[–] scrubbles@poptalk.scrubbles.tech 4 points 3 months ago (2 children)

I would. Before there were ads or the share button, when the feed was actually just friends, it was fun. They'll never go back, but it was fun for a few months.

[–] pbjamm@beehaw.org 4 points 3 months ago

Bring back Google+ !

[–] cheers_queers@lemm.ee 1 points 3 months ago

i miss pieces of flair 😭

[–] Corgana@startrek.website 4 points 3 months ago (2 children)

I'm surprised there's no fedi version of Facebook, but I'm also sure that as the fediverse evolves we'll see the return of personal websites, with ActivityPub-based social features.

[–] halm@leminal.space 10 points 3 months ago

> no fedi version of Facebook

Friendica? Hubzilla? The former in particular felt very much like an FB-alike when I tried it several years ago.

[–] Deceptichum@quokk.au 5 points 3 months ago

Zuckerberg really took all the robot memes to heart didn’t he?

[–] ArmokGoB@lemmy.dbzer0.com 4 points 3 months ago

I can't wait to have a Destiny 2 ghost IRL

[–] leisesprecher@feddit.org 3 points 3 months ago

I'll call mine Guillermo.

[–] Midnitte@beehaw.org 2 points 3 months ago

Isn't the future of work exciting?

[–] rothaine@beehaw.org 1 points 3 months ago