cross-posted from: https://lemmy.ml/post/20289663

A report from Morgan Stanley suggests the datacenter industry is on track to emit 2.5 billion tons of CO2 by 2030, three times higher than predictions that did not factor in generative AI.

The extra demand from GenAI will reportedly push emissions from 200 million tons this year to 600 million tons by 2030, largely due to the construction of more data centers to keep up with demand for cloud services.
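For a rough sense of what that trajectory implies, here is a minimal back-of-envelope sketch of the growth rate behind those two figures. The tonnage numbers are the ones quoted from the report; treating "this year" as 2024 and using 2030 as the endpoint are my assumptions.

```python
# Back-of-envelope: implied annual growth rate of GenAI-driven emissions.
# Tonnage figures are from the quoted report; the year choices (2024 as
# "this year", 2030 as the endpoint) are assumptions for illustration.

start_mt = 200   # million tons of CO2 this year (assumed 2024)
end_mt = 600     # million tons of CO2 by 2030
years = 2030 - 2024

cagr = (end_mt / start_mt) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # roughly 20% per year
```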

[email protected] 35 points 2 months ago

I remember when researchers were more focused on making AI models smaller and more efficient, and work on generative models centered on making GANs as robust as possible with very little compute and data.

Now that big companies and rich investors have seen the potential for profit in AI, the paradigm has shifted to "throw more compute at the wall until something sticks," so it's not surprising that carbon emissions are rising with it.

Besides that, it's also annoying that most of the time they keep their AIs behind closed doors. Even in the few cases where the weights are released publicly, the models are so big that they aren't usable for the vast majority of people; sometimes even Kaggle can't handle them.
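To put a number on "so big they aren't usable": here is a minimal sketch that estimates how much memory just the weights of a model need at a given precision, compared against the VRAM of a typical free-tier GPU. The parameter counts and the 16 GB figure are illustrative assumptions, not anything from the comment or the article.

```python
# Rough sketch of why large open-weight models are out of reach for most people:
# estimate the memory needed just to hold the weights versus the VRAM of a
# typical free-tier GPU. Parameter counts and the 16 GB figure are illustrative
# assumptions.

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weights_size_gb(n_params: float, dtype: str = "fp16") -> float:
    """Approximate size of the raw weights in gigabytes (ignores activations,
    KV cache, and optimizer state, which only add to the total)."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

free_tier_vram_gb = 16  # assumed single 16 GB GPU

for name, n_params in [("7B", 7e9), ("70B", 70e9)]:
    size = weights_size_gb(n_params, "fp16")
    verdict = "fits" if size <= free_tier_vram_gb else "does not fit"
    print(f"{name} model @ fp16: ~{size:.0f} GB of weights -> {verdict} in {free_tier_vram_gb} GB VRAM")
```

Even a mid-sized model's weights barely squeeze into that budget before accounting for activations, so the larger releases are effectively restricted to people with serious hardware or paid cloud access.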