this post was submitted on 03 Mar 2024
43 points (69.4% liked)

Technology

top 8 comments
[–] DarkNightoftheSoul@mander.xyz 64 points 8 months ago

Am I reading this right: they don't have a finished product or a business model? This reads like a press release from the "pump" stage of a "pump'n'dump."

How do I block specific websites?

[–] kromem@lemmy.world 38 points 8 months ago

For anyone interested in algorithmic changes that improve efficiency, Microsoft's recent research around moving from floating point weights to ternary ones (1, 0, -1) was really impressive:

https://arxiv.org/abs/2402.17764

Basically at larger parameter sizes it outperforms FP networks while being a fraction of the memory footprint and bypassing the need for matrix multiplication.

It kind of makes sense that it works, too: past research suggests these networks build a virtualized node topology out of combinations of physical nodes, so with enough nodes to work with there isn't a loss in functionality, and the discrete weights should arrive at optimal thresholds more easily than slight adjustments to FP values.

The next generation of models built on this need to be trained from scratch (this is about pretraining and not quantization after the fact), but it should open the door to new hardware architectures better optimized for networks of ternary weights.
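To make the "bypassing the need for matrix multiplication" point concrete, here's a minimal sketch (my own illustration, not the paper's actual kernel) of why weights restricted to {-1, 0, +1} need no multiplier at all: each product w·x is just +x, -x, or nothing, so a matrix-vector product collapses into additions and subtractions.

```python
import numpy as np

def ternary_matvec(W, x):
    """Multiply a ternary weight matrix W (entries in {-1, 0, +1})
    by a vector x using only additions and subtractions."""
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            if W[i, j] == 1:
                out[i] += x[j]      # weight +1: add the activation
            elif W[i, j] == -1:
                out[i] -= x[j]      # weight -1: subtract it
            # weight 0 contributes nothing (and can be skipped entirely)
    return out

# Sanity check: matches an ordinary matmul on the same data
W = np.array([[1, 0, -1],
              [-1, 1, 0]])
x = np.array([2.0, 3.0, 5.0])
print(ternary_matvec(W, x))  # same result as W @ x
```

Hardware can exploit the same trick far more aggressively than this toy loop, which is why the paper argues for new accelerator designs built around ternary weights.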

[–] Lettuceeatlettuce@lemmy.ml 26 points 8 months ago (1 children)

Same pattern as crypto. Hype the tech, spend millions on developing chips that can only do one specific thing, deploy them in a virtual gold rush, eventually the bubble pops and the last fools are left holding the bags trying to offload stacks of ASICs that are worthless.

Billions wasted trying to capitalize on "AI" that will largely cause more harm than good.

[–] mesamunefire@lemmy.world 4 points 8 months ago (1 children)

Crypto did have ASIC miners, which were much, much faster than GPUs after the initial push. I would expect the AI craze to do the same.

[–] TimeSquirrel@kbin.social 1 points 8 months ago

If the human brain can do what it does while being powered by Doritos and beer, we need to do wayyy better on the hardware efficiency front.

[–] WallEx@feddit.de 21 points 8 months ago

They don't even know what the company will be doing, but they're hyping it up anyway. From what I'm reading it's just a vision, but it gets presented like an actual product "coming to a device near you sooner than you think"... Clickbaity as hell.

[–] Municipal0379@lemmy.world 12 points 8 months ago

I want to make schnozberries a real thing. Are we just listing things we want now?

[–] mlg@lemmy.world 6 points 8 months ago

So like a TPU?