• 0 Posts
  • 13 Comments
Joined 2 years ago
Cake day: August 27th, 2023

  • I can only comment on the behavior I see. This is an online forum; I don’t have a choice but to assume.

    Regardless, most are very vocal about AI being theft, a line of thinking that directly benefits the copyright lobby and big AI. Big AI doesn’t mind paying for the data if it gives them a monopoly.

    The moment a chatbot does something mildly worrisome, like help draft a suicide letter, the conversation is filled with people calling for censorship, protection and regulation. Again, something that would directly benefit big AI.

    I’m also assuming they are against both the copyright industry and AI in general; it’s just that most people seem to say things that help the copyright lobby and big AI without knowing it.



  • What you’re doing is falling for propaganda from a long ass time ago by the owner class…

    Or using the actual current definition of the word. It’s like going on a rant about hunters when you get called a nimrod.

    I’m also going to push back on pretending the current anti-AI movement is against capitalism when it’s pro-copyright. Their support is what big AI companies are using to create their monopoly.

    This century’s Luddites aren’t tearing down machinery but helping build a walled garden.



  • I put them both in the category of things that need regulation in a way that rope does not

    My whole point since the beginning is that this is dumb, hence my comment when you essentially said shooting projectiles and saying bad things were the same. Call me when someone shoots up a school with AI. Guns and AI are clearly not in the same category.

    And yes, I think people should be able to talk to their chatbot about their issues and problems. It’s not a good idea to treat it as a therapist, but it’s a free country. The only solution would be massive censorship and banning local open-source AI, which is already heavily censored (hence the need for jailbreaks to get it to say anything sexual, violent, or on the subject of suicide).

    Think for a second about what you are asking and what it implies.



  • The difference is that guns were built to hurt and kill things. That is literally the only thing they are good for.

    AI has thousands of different uses (cue the idiots telling me it’s useless). Comparing it to guns is basically rhetoric.

    Do you want to ban rope because you can hang yourself with it? If someone uses a hammer to kill, are you going to throw red paint at hammer defenders? Maybe we should ban Discord or even Lemmy; I imagine quite a few people get encouraged to kill themselves on communication platforms. A real solution would be to ban the word “suicide” from the internet. This all sounds silly, but it’s the same energy as your statement.





  • I did some quick math with Meta’s Llama model and the training cost was about a flight to Europe’s worth of energy, which isn’t a lot when you factor in how many people use it compared to how many fit on the flight (rough numbers sketched below).

    Whatever you’re imagining as the impact, it’s probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
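
    A rough back-of-envelope version of that comparison, in Python. Every figure here is an illustrative assumption (ballpark numbers for the Llama 2 family and a single wide-body transatlantic flight), not a measured value:

    ```python
    # Back-of-envelope: LLM training energy vs. one long-haul flight.
    # All numbers below are rough, assumed ballpark figures, not measurements.

    GPU_HOURS = 3.3e6         # assumed: ~3.3M A100 GPU-hours reported for the Llama 2 family
    GPU_POWER_KW = 0.4        # assumed: ~400 W average draw per GPU

    training_kwh = GPU_HOURS * GPU_POWER_KW        # ≈ 1.3 GWh

    FUEL_KG = 70_000          # assumed: fuel burned by one transatlantic wide-body flight
    JET_FUEL_KWH_PER_KG = 12  # assumed: energy density of jet fuel

    flight_kwh = FUEL_KG * JET_FUEL_KWH_PER_KG     # ≈ 0.8 GWh

    USERS = 500e6             # assumed: order-of-magnitude number of people using the model
    PASSENGERS = 300          # assumed: passengers on one flight

    print(f"Training: {training_kwh / 1e6:.1f} GWh, one flight: {flight_kwh / 1e6:.1f} GWh")
    print(f"Training ≈ {training_kwh / flight_kwh:.1f} flights' worth of energy")
    print(f"Per user: {training_kwh / USERS * 1000:.1f} Wh of training vs "
          f"{flight_kwh / PASSENGERS:,.0f} kWh per passenger")
    ```

    On these assumed numbers, the one-off training run lands in the same ballpark as one or two long-haul flights, and the per-user share is a few watt-hours versus thousands of kilowatt-hours per passenger. Swap in your own figures; it’s the order of magnitude that matters.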