

this, like an open casket funeral, remains to be seen.
my original comment before editing read something like “they specifically asked chatgpt not to produce bomb manuals when they trained it” but i didn’t want people to think I was anthropomorphizing the llm.
hey that’s pretty fuckin good lol
well, yes, but the point is they specifically trained chatgpt not to produce bomb manuals when asked. or thought they did; evidently that’s not what they actually did. like, you can probably find people convincing other people to kill themselves on 4chan, but we don’t want chatgpt offering assistance writing a suicide note, right?
when the trump admin is identical to the us federal govt, there will be no doubt about the matter.
Sometimes.
i know it’s offensive to see people censor themselves in that way because of tiktok, but try to remember there’s a human being on the other side of your words.
you want me to explain it differently, and I will. that’s a very reasonable request.
i think we should regulate things that can be shown to be dangerous to individuals or society as a whole. I will take your rope example as not dangerous in that way and leave it unexamined, assuming you agree. compare to guns. guns are dangerous, and you seem to agree with this too. rope is different from a gun, but both can be used to kill people. why don’t we regulate rope? in a nutshell, because it takes a hell of a lot of effort to hurt or kill someone with rope. compare to a gun. the amount of effort required to kill a person, many people, with a modern firearm is a physical triviality comparable to brushing your teeth or changing your clothes. guns can be harmful without even trying, but you have to go out of your way to hurt someone with rope.
compare with the current unregulated implementation of chatbots, as in the case of this child’s suicide. a technology which can calmly sit with you and convince you that your suicide is a beautiful expression of individuality or whatever sycophantic bullshit that desperate child read.
here, let’s remind ourselves of some of the details presented in the article. This will no doubt be a refresher for you.
mourning parents Matt and Maria Raine alleged that the chatbot offered to draft their 16-year-old son Adam a suicide note after teaching the teen how to subvert safety features and generate technical instructions to help Adam follow through on what ChatGPT claimed would be a “beautiful suicide.”
Adam’s family was shocked by his death last April, unaware the chatbot was romanticizing suicide while allegedly isolating the teen and discouraging interventions.
On Tuesday, OpenAI published a blog, insisting that “if someone expresses suicidal intent, ChatGPT is trained to direct people to seek professional help” and promising that “we’re working closely with 90+ physicians across 30+ countries—psychiatrists, pediatricians, and general practitioners—and we’re convening an advisory group of experts in mental health, youth development, and human-computer interaction to ensure our approach reflects the latest research and best practices.”
so, according to this lawsuit, a child was taught to circumvent chatgpt’s safety measures by chatgpt itself, encouraged to commit suicide, and this all happened despite the fact that the model was specifically trained not to do this. it happened despite the large amount of effort that was put into avoiding exactly this outcome.
that this is even a possibility means we do not have the control over this technology it might otherwise appear that we do. Uncontrollable technology is dangerous. Dangerous technology should be regulated. thanks for coming to my ted talk.
for more information about AI safety, check out robert miles.
you essentially said shooting projectiles and saying bad things were the same.
no i didnt say that because it’s a stupid fucking thing to say. i dont need your hand up my ass flapping my mouth while im speaking, thanks.
How about I call you when a person kills themself and writes their fucking suicide note with chatgpt’s enthusiastic help, fucknozzle? Is your brain so rotted that you forgot the context window of this conversation already?
i didn’t say they were the same, you put those words in my mouth. I put them both in the category of things that need regulation in a way that rope does not. are you seriously of the opinion that it is fine and good that people are using their ai chatbots for mental healthcare? are you going to pretend to me that it’s actually good and normal for a human psychology to have every whim or fantasy unceasingly flattered?
i feel like if the rope were routinely talking people into insanity, or people were reliably using their unrestricted access to rope to go around killing others, yeah, i might want to impose some regulations on it?
hey. it occurs to me too late that you might have taken exception to the tone of my previous remark. i had intended it to be genuinely helpful, but im afraid it came out as world-weary and, well, a little unpleasant. and i didn’t actually offer you any recovery resources. I just basically said “you got a problem lol gl” which is an enormous dickhead move in retrospect.
here. check out https://smartrecovery.org/
it’s a science-based alternative to the twelve step programs you might be hesitant to join due to their religious influence. they have lots of in-person meetings and online ones too. i personally get more value out of in-person, but everyone has their own preference, and it’s nice to be able to make a meeting wherever I am.
the general philosophy can be summed up as “you have a choice.” it teaches that unpleasant emotions like anxiety are generally transient and offers strategies for coping with urges and building a balanced lifestyle. it’s a toolkit and an adaptable method.
it does focus, generally speaking, on substance addiction, but recognizes that addiction takes many forms: sex addiction, gambling, shopping, intimacy, trichotillomania, and yes, phone addiction. the most recent addition to our meeting group is a young trans woman who struggles with self-harm urges. addiction is not something to be ashamed of, and it’s more common than you think.
please check it out. you don’t have to live in service to that anxiety.
it’s about signaling to others
close the thread
Anyone remember blackout for gaza? It was performative then for a much more important cause with much more public support.
hes rite u no