That was absolutely written by AI.
Phrases like “it symbolizes the increasing role” are the kind of thing I generally see ChatGPT say. People don’t typically talk like that, not even pretentious lunatics on LinkedIn.
For me it was the fact that the first sentence literally just spells out what the line above says. It feels like every other sentence coming from ChatGPT is just a summary of the previous sentence. (Unless it’s trying to relativize, in which case it hits you with the “it’s important to remember”.)
This will keep happening as long as humans keep ranking wordy AIs higher than succinct ones. Unfortunately we have this gut instinct to judge long responses as more true than short ones, so we keep making the problem worse.
Unfortunately I have to use this cesspool a lot for work.
In any case, something I’ve noticed is that the ‘contribute your thoughts to this topic (for ai)’ prompts are always responded to by people who are clearly using ChatGPT.
It’s just bots talking to bots all the way down.