

I’ve heard it likened to the dot-com boom: yeah, we’ve got a tonne of e-commerce today, but the stars hadn’t aligned back in the early 2000s.
Seems a bit early tbh. But I’ll take it.
In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
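Quick sanity check on that comparison, assuming a typical ~1,000 W microwave (the wattage isn’t given in the article):

```python
# Does 0.24 Wh really equal ~1 second of microwave use?
MICROWAVE_W = 1000          # assumed typical microwave draw, not from the article

prompt_wh = 0.24            # Google's figure for a median Gemini text prompt
joules = prompt_wh * 3600   # 1 Wh = 3,600 J
seconds = joules / MICROWAVE_W  # time = energy / power
print(f"{seconds:.2f} s")   # ~0.86 s, close to the quoted "about one second"
```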
It would be fantastic if our other GHG-producing activities were held to the same level of criticism as AI.
You’re gonna get downvotes defending AI on Lemmy - our Overton window is *tiny*.
A ChatGPT prompt uses 3 Wh. This is enough energy to:
Leave a single incandescent light bulb on for 3 minutes.
Leave a wireless router on for 30 minutes.
Play a gaming console for 1 minute.
Run a vacuum cleaner for 10 seconds.
Run a microwave for 10 seconds.
Run a toaster for 8 seconds.
Brew coffee for 10 seconds.
Use a laptop for 3 minutes. ChatGPT could write this post using less energy than your laptop uses over the time you read it.
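You can back out the appliance wattage each of those comparisons implies (energy = power × time, so power = 3 Wh ÷ time). The durations below are from the list; the resulting wattages are what the comparisons assume, not measured values:

```python
# Implied wattage behind each "3 Wh equals X seconds of appliance Y" claim.
PROMPT_WH = 3.0

comparisons = {            # appliance -> runtime in seconds
    "incandescent bulb": 3 * 60,
    "wireless router": 30 * 60,
    "gaming console": 60,
    "vacuum cleaner": 10,
    "microwave": 10,
    "toaster": 8,
    "coffee maker": 10,
    "laptop": 3 * 60,
}

for name, seconds in comparisons.items():
    implied_watts = PROMPT_WH / (seconds / 3600)  # W = Wh / h
    print(f"{name}: {implied_watts:.0f} W")
```

The implied figures (60 W bulb, ~1,080 W vacuum and microwave, ~1,350 W toaster, 60 W laptop) are all plausible, so the list is internally consistent.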
Presumably it would evaluate claims in the text without the user having to do the search. Sounds cool to me.