

No effort could have evacuated the entire population of Gaza without free movement across land borders. It was never a practical option.
Even if it had been, parents making a dumbass decision doesn’t justify killing their kids.
For those curious, the characters are katakana (the syllabary often used in Japan for foreign words, onomatopoeia, etc.), and they'd be read as "ma-ri-u-su", which is probably intended to represent "Marius" under Japanese spelling conventions.
Betteridge strikes again.
A chatbot is a tool, nothing more. Responsibility here falls on the people who deployed a tool that wasn't fit for purpose (the sympathetic human conversational partner the AI was supposed to mimic would have done anything but what it did; even changing the subject or spouting total gibberish would have been better than encouraging this kid). So OpenAI is indeed responsible and hopefully will end up with their pants sued off.
If it were an enforced ban, I'd be wondering who held shares in a desktop/laptop/"real computer" manufacturer, since forbidding smartphones ≠ forbidding screens or Internet access.
Whether or not that’s a defense depends on the details of the French legal system. In most countries, there are rights you’re not allowed to sign away. No idea whether security of the person is one of those rights in France.
Thing is, to people who don't follow tech news and aren't really interested in this stuff, AI = AGI. It's like most non-scientists equating "theory" with "hypothesis". So it's a really bad choice of term, and it's interfering with communication.
I wouldn’t put it past them.
The result would have been the same if there had been a human behind the catfishing instead of an LLM, and events could have played out in a similar fashion if they'd been snail-mail pen pals. The outcome of this story is tragic, but it doesn't have much to do with technology when you stop and think about it for a moment.
There will be a lot of cases where the data gets accidentally encapsulated (then fragmented due to incompatible protocols) in a cat.