Calculating water/energy usage for "AI" per token is problematic: a data center has a massive base load simply by existing, even if nobody uses it. And since we have no actual data for any of the popular platforms, all the numbers floating around are questionable and not very useful. For example, how much power does one of those servers with NVIDIA cards really draw per token if its utilization is only 50%? And are the overhead costs actually counted?
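To illustrate how much these estimates swing, here is a minimal back-of-envelope sketch. Every number in it (server wattage, throughput, overhead share) is a made-up placeholder, not a measurement from any real platform; the point is only that the same server yields very different "Wh per token" figures once you account for utilization and base load.

```python
# Back-of-envelope sketch: how a "Wh per token" estimate changes when base load
# and utilization are counted. All numbers are hypothetical placeholders.

def wh_per_token(gpu_server_watts: float,
                 tokens_per_second: float,
                 utilization: float = 1.0,
                 amortized_overhead_watts: float = 0.0) -> float:
    """Energy per token in watt-hours.

    gpu_server_watts         -- power draw of one inference server (assumed)
    tokens_per_second        -- throughput at full load (assumed)
    utilization              -- fraction of time the server actually serves
                                requests; idle power is still burned, so lower
                                utilization attributes more energy to each token
    amortized_overhead_watts -- share of facility base load (cooling, networking,
                                idle capacity) assigned to this server (assumed)
    """
    effective_watts = gpu_server_watts + amortized_overhead_watts
    effective_tokens_per_second = tokens_per_second * utilization
    joules_per_token = effective_watts / effective_tokens_per_second
    return joules_per_token / 3600.0  # joules -> watt-hours


# Naive estimate: full utilization, no overhead counted.
naive = wh_per_token(gpu_server_watts=1000, tokens_per_second=500)

# Same (hypothetical) server at 50% utilization with a base-load share added.
adjusted = wh_per_token(gpu_server_watts=1000, tokens_per_second=500,
                        utilization=0.5, amortized_overhead_watts=600)

print(f"naive:    {naive:.6f} Wh/token")
print(f"adjusted: {adjusted:.6f} Wh/token ({adjusted / naive:.1f}x higher)")
```

With these invented inputs the adjusted figure comes out roughly 3x the naive one, which is exactly why per-token numbers without disclosed utilization and overhead data tell us very little.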
This essay by @Baldur Bjarnason is brilliant: it explains why individual experiments on the usefulness of "AI" (or similar tools) don't teach us anything useful and might actually do harm. Go read it. Too many insights to pull a single quote, TBH:
"The cult of goal-setting thrives in this illusion. It converts uncertainty into an illusion of progress. It demands specificity in exchange for comfort. And it replaces self-trust with the performance of future-planning." (Original title: Smart People Don't Chase Goals; They Create Limits) https://www.joanwestenberg.com/smart-people-dont-chase-goals-they-create-limits/
"The real threat posed by generative AI is not that it will eliminate work on a mass scale, rendering human labour obsolete. It is that, left unchecked, it will continue to transform work in ways that deepen precarity, intensify surveillance, and widen existing inequalities." "The current trajectory of generative AI reflects the priorities of firms seeking to lower costs, discipline workers, and consolidate profits — not any drive to enhance human flourishing. If we allow this trajectory to go unchallenged, we should not be surprised when the gains from technological innovation accrue to the few, while the burdens fall upon the many."