Oh really? Let's calculate the power usage of billions of devices downloading and serving ads.
How much would we save if we were somehow able to debloat and deshittify the Internet and all devices? Climate impact, overconsumption of unnecessary crap, mental health care...
Well, since when has Silicon Valley cared about the environment or global warming? It's all about $, always.
An interesting topic, but the article has virtually no information on it, and what was there was unsourced and confusing. Maybe I'm just tired and not seeing it, but damn, the "taking 50 Belgians to the moon" comparison really got me confused. I agree in general though: new technologies take energy and we need to decarbonise our energy generation as quickly as possible.
I'd actually be really interested in a proper deep dive into this topic though. What kind of tasks are people using these assistants for, and how does the energy use of an assistant compare with how people would have done those tasks before? I'm sure it's more energy intensive, but it'd be interesting to understand more, at least for me.
I agree that the article is a bit confusing, but we can't keep increasing energy consumption and hope decarbonization will fix it.
From an environmental point of view, energy is never free. Also, as long as we still use fossil fuels, any new use of renewables (e.g. running AI on solar panels) is energy that could have been used to replace fossil fuel usage.
Energy consumption has a dark underbelly of rare earth mineral consumption that often just gets swept under the rug of the shiny new thing. Ooh.
What do you propose, exactly? We have the technology right now to decarbonise our grid, it's even the sensible move economically now. Are you saying we should all stop having kids and building anything new that uses electricity? I'm assuming that's not your position but that's what I took from reading your comment.
Simply be mindful of our energy usage, and not just rely on decarbonization. We need both because decarbonization will not happen overnight.
Historically, worldwide production of renewables has kept growing, and their percentage has been growing, but fossil fuel usage has also kept growing.
Now we get a new technology that uses even more energy. Maybe we should work on energy efficiency and use that tech sparingly, instead of building more data centers so incels can get their voice-chat AI girlfriend and saying "we'll just install more solar panels and windmills".
Sure, but... what do you propose? Saying "be mindful of our energy use" isn't actionable. Are you saying we should cap energy use and have a bidding system for industries that want to use new capacity, or have a carbon price so industries are encouraged to use non-carbon-producing energy? I still don't understand what you're suggesting. Or maybe, if you think entertainment is a waste of energy, we should ban non-educational use of video on the internet, as I'm sure that is an insane amount of energy use worldwide.
How much energy do you think gets used by a computer that costs $700,000 per day to run?
https://earth.org/environmental-impact-chatgpt/
https://www.digitaltrends.com/computing/chatgpt-cost-to-operate/
I have no idea, that's kind of my point. I'm not trying to argue that it's not much, or that it's a lot, or that it's worth it or not, just saying I have no idea and neither that article nor any of the ones you linked gave me the answer.
I think it's an important consideration, so I'd love more information but it seems that it's not available. Maybe it's hard to calculate because things like the energy used and exact amount of compute are trade secrets or something, I don't know. It'd be nice to know though.
From earth.org: “Data centres are typically considered black box compared to other industries reporting their carbon footprint; thus, while researchers have estimated emissions, there is no explicit figure documenting the total power used by ChatGPT. The rapid growth of the AI sector combined with limited transparency means that the total electricity use and carbon emissions attributed to AI are unknown, and major cloud providers are not providing the necessary information.”
How does it compare to crypto?
I don't really care about other commenters saying that the article doesn't have a reliable enough source. I know that commercial LLMs are terrible resource consumers and since I don't support their development I think they should be legally banned for this very reason.
That is a very valid and reasonable opinion, sorry to see it downvoted.
There will be strong disagreement with you, however, over whether LLMs are a big enough resource hog to require outright banning for that reason alone.
If you are looking for Big Tech hitboxes, try for things like writing laws that require all energy consumption in datacenters to be monitored and reported using established cross-disciplinary methods.
Or getting people to stop buying phones every year. Or banning disposable vapes.
I knew it was going to be downvoted. People here mostly support AI. But I don't, and what I meant is that I would love governments to ban it (obviously). Energy efficiency is the simplest reason to give them, so yeah. Sorry everyone, but I'm old school. Put your fancy AI bells and whistles away and embrace efficient, old, proven ways of computing such as using a GUI, a TTY and search engines (which still consume a lot, but not as inefficiently). They at least don't consume 10 MW (or a few seconds of full-load CPU time and 200 GB of space if it's a local LLM) to calculate 2+2*2 or give you a link to a Wikipedia article that explains what a helicopter is (cough cough Bing cough cough). And they hallucinate way less often too.
permacompute or bust
Who is permacompute?
This thing is utopian and everything, but man is it the best thing I've read online in 2024 so far. Permacomputing or bust.
FoxtrotRomeo (edit: for real )
SierraUniformSierra
I am at a loss, you might have to explain that one to me - all I’ve got is either something was sus or Save Union Souls lol
Sus
SNAFU, it’s all sus anymore
Worst take in the entire community right now.
I wonder how they measured this. Could it just be that they get more utilisation? Per capita is probably not adequate either; you would need a measure that's an analogue of per capita. Maybe per result? For instance, I could spend half an hour trying to get just the right set of keywords to bring up the right result, or I could spend 5 minutes in a chat session with an AI honing the correct response.
The wording of the article implies an apples-to-apples comparison, so 1 Google search == 1 question successfully answered by an LLM. Remember, a Google search in lay terms is not the act of clicking the search button; rather, it's the act of going to Google to find a website that has the information you want. The equivalent with ChatGPT would be starting a "conversation" and getting the information you want on a particular topic.
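To make the "per result" idea above concrete, here's a rough back-of-envelope sketch. Every figure in it is a made-up placeholder, not a measurement; the only thing borrowed from the article is the 25x-per-query claim:

```python
# Back-of-envelope: energy per *result* vs energy per *query*.
# ALL figures are hypothetical placeholders, not measurements.

SEARCH_WH_PER_QUERY = 0.3                      # assumed energy per search engine query (Wh)
LLM_WH_PER_PROMPT = SEARCH_WH_PER_QUERY * 25   # applying the article's 25x claim

def energy_per_result(wh_per_query: float, queries_needed: float) -> float:
    """Energy to satisfy one information need, not one button press."""
    return wh_per_query * queries_needed

# If it takes several searches to find an answer but only one LLM prompt,
# the effective gap shrinks; if the chat takes many turns, it widens again.
search_total = energy_per_result(SEARCH_WH_PER_QUERY, queries_needed=5)
llm_total = energy_per_result(LLM_WH_PER_PROMPT, queries_needed=1)

print(f"search: {search_total:.2f} Wh per result")
print(f"LLM:    {llm_total:.2f} Wh per result")
print(f"effective ratio: {llm_total / search_total:.1f}x")
```

With those made-up numbers the effective gap drops from 25x to 5x, which is exactly why the "per result" framing matters.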
How many search engine queries or LLM prompts that involves, or how broad the topic is, is a level of technical detail that one assumes the source for the 25x number has already controlled for. (Feel free to ask the author for the source and share it with us, though!)
Anyone who's remotely used any kind of deep learning will know right away that deep learning uses an order of magnitude or two more power (and demands an order of magnitude or two more compute!) compared to algorithmic and rules-based software, so a number like 25x for a similar effective outcome would not be at all surprising if the approach used is unnecessarily complex.
For example, I could write a neural network to compute 2+2, or I could use an arithmetic calculator. One requires a $500 GPU consuming 300 watts, the other a $2 pocket calculator running on 5 watts, returning the answer before the neural network is even done booting.
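To put the joke above in code: a deliberately silly sketch where a single linear layer is trained by gradient descent to rediscover addition, next to the one-liner that just computes it. Nothing here is a benchmark of real hardware; it only illustrates the overhead of learning something you could simply calculate.

```python
# Toy illustration: "learning" 2 + 2 with gradient descent vs just computing it.
import numpy as np

rng = np.random.default_rng(0)

# Training data: random pairs of numbers and their sums.
X = rng.uniform(-10, 10, size=(1000, 2))
y = X.sum(axis=1)

# A single linear layer, trained to discover that the weights should be ~[1, 1].
w = rng.normal(size=2)
lr = 0.01
for _ in range(500):
    pred = X @ w
    grad = 2 * X.T @ (pred - y) / len(X)   # gradient of mean squared error
    w -= lr * grad

print("learned weights:", w)                       # approximately [1, 1]
print("network says 2 + 2 =", float(np.dot([2, 2], w)))
print("arithmetic says 2 + 2 =", 2 + 2)            # no training, no GPU, no booting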
However many years it takes for these LLM fools to wake up, hopefully they can find a way to laugh at themselves for thinking it was cutting-edge to jam the internet into a fake jellyfish brain and call it GPT. I haven't looked recently, but I still haven't seen anyone talking about neuroglial networks and how they will revolutionize the applications of AI.
There’s a big*** book, but apparently no public takers in the deep neural network space?
Might be correct, but without any source for the number I can't even share this back. Asked them; will update if I get an answer.
Doesn't matter if it's geothermal or some renewable.
It does because that energy could've been used for other purposes. This is the Jevons paradox https://en.wikipedia.org/wiki/Jevons_paradox
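A toy illustration of the Jevons paradox with made-up numbers (not data): if a task becomes twice as efficient but people run three times as many of them, total consumption still rises.

```python
# Jevons paradox, toy numbers only (hypothetical, not data).
energy_per_task_before = 1.0   # arbitrary units
tasks_before = 100

energy_per_task_after = 0.5    # the task becomes twice as efficient
tasks_after = 300              # cheaper per task, so far more tasks get run

print("total before:", energy_per_task_before * tasks_before)  # 100.0
print("total after: ", energy_per_task_after * tasks_after)    # 150.0 -> total went up
```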
At this point I basically need to do 25 Google searches to find what I’m looking for anyway. This is a stupid comparison. When I eat cabbage and beer my digestive tract releases more GHGs than my whole day of using ChatGPT (zero). I just need to figure out how to harvest and burn my own methane so I can do more ChatGPT queries guilt-free.