

I think most people would argue that 1-3% of datacenter use is still a significant contributor to global pollution, and a problem.
Often it is respected, but the resulting problem is that platforms conflate regular crawling with their questionable AI scraping crawlers to blackmail websites into participating in feeding AI.
For example, Googlebot, if allowed, won’t just list you for search but will also scrape your contents for Google’s AI. Edit: see https://arstechnica.com/tech-policy/2025/07/cloudflare-wants-google-to-change-its-ai-search-crawling-google-likely-wont/ as a source. I imagine LinkedInBot, given it’s Microsoft, will feed some other AI of theirs as well on top of generating previews.
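To illustrate the conflation: site owners can only opt out of AI crawlers that announce a separate user-agent token. A robots.txt sketch along these lines blocks the AI tokens that do exist (Google-Extended, GPTBot, CCBot are real tokens) while still allowing search indexing, but note it cannot refuse the AI uses bundled into Googlebot itself:

```
# Allow ordinary search indexing
User-agent: Googlebot
Allow: /

# Opt out of AI training via tokens that are published separately;
# this only works for crawlers that honor their own token
User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

The gap this sketch exposes is exactly the complaint: the opt-out only covers whatever the operator chose to split out into a separate token.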
Until regulation steps in to require AI bots to ask separately for crawling permission, or to actually obtain a proper license to reuse the contents, this situation isn’t going to improve.
Many of us dislike all the things you listed for their impact, including AI.