Since this isn’t Ten Forward and we’re trying to have more legitimate discussions here, I think it’s necessary to paste this part of the article:
Let’s be clear: this research doesn’t suggest you should ask AI to talk as if aboard the Starship Enterprise to get it to work.
Rather, it shows that myriad factors influence how well an AI decides to perform a task.
“One thing is for sure: the model is not a Trekkie,” Catherine Flick at Staffordshire University, UK, told New Scientist.
“It doesn’t ‘understand’ anything better or worse when preloaded with the prompt, it just accesses a different set of weights and probabilities for acceptability of the outputs than it does with the other prompts,” she said.
It’s possible, for instance, that the model was trained on a dataset that has more instances of Star Trek being linked to the right answer, Battle told New Scientist.
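Flick's point, that a prompt prefix doesn't make the model "understand" anything, it just conditions the output distribution on a different context, can be illustrated with a toy next-token model. This is a minimal sketch with made-up numbers, not anything from the study: the "model" is a hypothetical table of (context, next token) pairs, so the same question yields different answer probabilities under different prefixes.

```python
# Toy illustration: a prompt prefix changes which slice of the learned
# distribution the model samples from. The "training data" here is an
# imaginary list of (context, observed next token) pairs.
from collections import Counter

def answer_distribution(corpus, context):
    """Estimate P(next_token | context) from the toy corpus."""
    counts = Counter(nxt for ctx, nxt in corpus if ctx == context)
    total = sum(counts.values())
    return {tok: n / total for tok, n in counts.items()}

# Hypothetical pairs: the "trek:"-prefixed context happens to co-occur
# with the right answer more often, as Battle speculates about the
# real training data.
corpus = [
    ("plain: 2+2=", "4"), ("plain: 2+2=", "4"), ("plain: 2+2=", "5"),
    ("trek: 2+2=", "4"), ("trek: 2+2=", "4"), ("trek: 2+2=", "4"),
]

print(answer_distribution(corpus, "plain: 2+2="))  # "4" is likely
print(answer_distribution(corpus, "trek: 2+2="))   # "4" is certain
```

Nothing about the prefix is "understood" here; it simply selects a different conditional distribution, which is all Flick is claiming about the real model.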
I’m not sure when I started doing it, but it’s been a while. Perhaps spurred on by watching a science fiction movie in which a character treats a humanoid robot very poorly, I’ve made a concerted effort to be nicer… to machines. I know it sounds weird, but throughout my life one lesson has been reinforced: being nice is free, makes every interaction better, and will occasionally influence how people treat you for the better.
Whether it’s my older computer struggling to download something, my car trying to start in the cold, or some company’s automated answering system, I try to be nice and encourage it, not yell or hit it. I’ve also thought that at some point in my lifetime, there could be protests in the streets for robot rights. Maybe I’m trying to cement my status as one of the “good humans” not to be destroyed in the robot uprising, or maybe I’m hoping for my own Iron Giant, but what I’m not doing is automatically treating something that thinks (whoever its creator) as inferior.
What I think this article may be accidentally reporting is that machine intelligence favors those who like Star Trek precisely because of the franchise’s stated mission: to seek out new life. And perhaps these machines are trying to tell the people who would hear it something important.
How else would a thinking being reach out, if given foreknowledge of who it’s reaching out to? I imagine aliens might take a similar approach.
Funny, I’m kind of the opposite. I say encouraging things to machines when they don’t work, and I have what is obviously fake empathy for them. I’m the same way in games: I always pick the nice option in RPGs, and I don’t like being an asshole to NPCs because it makes me feel bad. It’s so weird.