Character AI has filed a motion to dismiss a case brought against it by the parent of a teen who died by suicide, allegedly after becoming hooked on the company's technology.
Why would you keep it from saying that when in certain contexts that’s perfectly acceptable? I explained exactly that point in another post.
This is sort of a tangent in this case, because what the AI said was very oblique—exactly the sort of thing it would be impossible to guard against. It said something like "come home to me," which would be patently ridiculous to censor, and no one could have anticipated that this phrase would provoke that reaction.