this post was submitted on 19 Jan 2024
253 points (95.3% liked)


Summary: Meta, led by CEO Mark Zuckerberg, is investing billions in Nvidia's H100 GPUs to build a massive compute infrastructure for AI research and projects. By the end of 2024, Meta aims to have 350,000 of these GPUs, with total expenditure potentially reaching $9 billion. This move is part of Meta's push toward artificial general intelligence (AGI), putting it in competition with firms like OpenAI and Google's DeepMind. AI and computing investments are a key part of the company's 2024 budget, with AI as its largest investment area.
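As a rough sanity check on that total (the per-unit prices below are assumptions based on widely reported H100 street prices, not figures from the article):

```python
# Back-of-envelope check of the reported ~$9B H100 spend.
# Unit prices are assumed street prices, not figures from the article.
gpu_count = 350_000

for unit_price in (25_000, 30_000):
    print(f"${unit_price:,}/GPU -> ${gpu_count * unit_price / 1e9:.2f}B")
# $25,000/GPU -> $8.75B
# $30,000/GPU -> $10.50B
```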

qupada@kbin.social 17 points 9 months ago

The estimated training time for GPT-4 is 90 days though.

Assuming you could scale that linearly with the amount of hardware, you'd get it down to about 3.5 days. From four times a year to twice a week.
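For the arithmetic behind that estimate, here is a minimal sketch, assuming GPT-4's reported training run used roughly 25,000 A100s and that an H100 delivers about 2x an A100's training throughput (both are outside estimates, not figures from the article or the comment):

```python
# Back-of-envelope linear scaling of training time.
# All inputs are reported estimates or assumptions, not official figures.
baseline_days = 90        # reported GPT-4 training time
baseline_gpus = 25_000    # GPT-4 reportedly trained on ~25k A100s
meta_gpus = 350_000       # Meta's H100 target, per the article
h100_vs_a100 = 2.0        # assumed per-GPU throughput advantage

speedup = (meta_gpus / baseline_gpus) * h100_vs_a100
print(f"~{speedup:.0f}x effective compute -> ~{baseline_days / speedup:.1f} days per run")
# ~28x effective compute -> ~3.2 days per run
```

That lands in the same ballpark as the 3.5-day figure above; the exact number depends heavily on the assumed H100-to-A100 ratio and on how well training actually scales with cluster size.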

If you're scrambling to get ahead of the competition, being able to iterate that quickly could very much be worth the money.