
Roberta

@jardinera.bsky.social

Just read the 2024 article “ChatGPT is Bullshit” by Hicks, Humphries, and Slater (Ethics Inf Technol), where they argue that, in science communication, it’s more accurate to call the inaccuracies in LLMs ‘bullshit’ rather than ‘hallucinations’. Interesting discussion, referencing other studies.

1 reply 0 reposts 4 likes


Andy Craig @andycraig.bsky.social

I agree "hallucination" is too generous. It implies there's some fault to be fixed, some deviation from normal. When a person hallucinates, it's a failure of an ability humans otherwise have. The "On Bullshit" definition does fit better, it means no awareness of true or false, even to get it wrong.

1 reply 0 reposts 3 likes