Forwarded from Chat GPT
Man investigates ChatGPT’s “it’s complex” lie
Usually used either when ChatGPT is lying that it knows something when it doesn’t, or lying that it doesn’t know a simple answer when it does but can’t say it for political-correctness reasons.
Man finds what we already knew: it’s BS business-speak / woke-speak, prevalent on Quora and LinkedIn.
But why did ChatGPT pick it up?
Because it’s the most prevalent language on the web? No, recent Quora and LinkedIn posts are a tiny fraction of the training set.
No, ChatGPT picked up this scammy lying phrase for the exact SAME REASON that Quora and LinkedIn picked it up.
The reason?
= Because the human raters love this scammy phrase.
In the case of LinkedIn and Quora, those human raters are post likers and voters. In the case of ChatGPT, those human raters are RLHF raters.
— Exact same type of thing.
Humans voting stupidly when rating what’s good or not.
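To make that mechanism concrete, here is a minimal toy sketch in pure Python (an illustrative assumption, nothing to do with OpenAI’s actual stack): simulated raters always prefer the hedging answer, and a tiny bag-of-words reward model trained on those pairwise comparisons learns to score the hedge phrase highly.

```python
# Toy sketch of the claimed mechanism, NOT OpenAI's real pipeline:
# raters prefer the hedge, so the reward model learns to love the hedge.
import math

VOCAB = ["it's", "complex", "the", "answer", "is", "42", "depends"]

def features(text):
    # Bag-of-words counts over a tiny fixed vocabulary.
    toks = text.lower().split()
    return [toks.count(w) for w in VOCAB]

def reward(weights, text):
    return sum(w * x for w, x in zip(weights, features(text)))

# Simulated comparison data: (preferred, rejected). The "rater" here
# always prefers the hedge -- the bias the post attributes to RLHF raters.
direct = "the answer is 42"
hedged = "it's complex and depends"
pairs = [(hedged, direct)] * 50

# Standard pairwise (Bradley-Terry) preference loss, -log sigmoid(margin),
# minimized by plain gradient descent.
weights = [0.0] * len(VOCAB)
lr = 0.1
for _ in range(100):
    for pref, rej in pairs:
        margin = reward(weights, pref) - reward(weights, rej)
        grad = 1 / (1 + math.exp(-margin)) - 1  # d(-log sigmoid)/d(margin)
        fp, fr = features(pref), features(rej)
        for i in range(len(weights)):
            weights[i] -= lr * grad * (fp[i] - fr[i])

# The learned reward now favors "it's complex", so any policy optimized
# against it will reach for that phrase regardless of the question.
print(reward(weights, hedged) > reward(weights, direct))  # True
```

The point of the sketch: the reward model never checks whether “it’s complex” is true; it only learns that raters upvote it.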
OpenAI, please stop hiring human raters so prone to falling for BS phrases like that.