Err, I’d describe your anecdote more as an attempt to reason with it…? If you were using google to search for an answer to something and it came up with the wrong thing, you wouldn’t then complain back to it about it being wrong, you’d just try again with different terms or move on to something else. If ‘using’ it for you is scolding it as if it’s an incompetent coworker, then maybe the problem isn’t the tool but how you’re trying to use it.
I wasn’t aware the purpose of this joke meme thread was to act as a policy workshop to determine an actionable media campaign
Lmao, it certainly isn’t. Then again, had you been responding with any discernible humor of your own I might not have had reason to take your comment seriously.
And yes, I very intentionally used the phrase ‘understand how computers actually work’ to infantilize and demean corporate executives.
Except your original comment wasn’t directed at corporate executives; it reads more like a personal review of the tool itself. Unless your boss was the one asking you to use Gemini? Either way, that phrase is used so much more often as self-aggrandizement and condescension that it’s hard to see it as anything else, especially when it follows an anecdote of that person trying to reason with a piece of software lmao.
It is not that it responded “Sorry, I cannot find anything like what you described, here are some things that are pretty close.”
It affirmatively said “No, no such things as you describe exist, here are some things that are pretty close.”
There’s a huge difference between a coworker saying “Dang man, I dunno, I can’t find a thing like that.” and “No, nothing like that exists, closest to it is x y z.”
The former is honest. The latter is confidently incorrect.
Combine that with “Wait what about gamma?”
And the former is still honest, while the latter, who now describes gamma in great detail and explains how it meets my requirements, is an obvious liar, having just told me that nothing like that exists.
If I now know I am dealing with a dishonest interlocutor, I am forced to consider tricking it into being honest.
Or, if I am less informed or more naive, I might just, you know, believe it the first time.
A standard search engine that is not formatted to resemble talking to a person does not prompt a user to expect it to act like a person, and thus does not suffer from this problem.
If you don’t find what you’re looking for, all that means is you did not find it.
If you are told that no such thing exists, a lot of people are going to believe that no such thing exists.
That is typically called spreading disinformation, when the actor knows what they are claiming is false.
It’s worse than unhelpful; it actively spreads lies.
…
Anyway, I’m sorry that you don’t see humor in multi-billion-dollar technology failing to achieve its purported abilities; I laugh all the time at poorly designed products, systems, and things.
…
Finally, I did not use the phrase in contention in my original post.
I used it in my response to you, specifically and only within a single sentence which revolved around incompetent executives.
…
It appears that reading comprehension is not your strong suit, maybe you can ask Gemini about how to improve it.
Err, well, maybe don’t do that. Lmao, there should also be an automod rule for this phrase, too.
There’s a huge difference between a coworker saying […]
Lol, you’re still talking about it like it’s a person that can be reasoned with bud. It’s just a piece of software. If it doesn’t give you the response you want you can try using a different prompt, just like if google doesn’t find what you’re looking for you can change your search terms.
If people are gullible enough to take its responses as given (or scold it for not being capable of rational thought lmao) then that’s their problem - just like how people can take the first search result from google without scrutiny if they want to, too. There’s nothing especially problematic about the existence of an AI chatbot that hasn’t been addressed with the advent of every other information technology.