Cats, Sims & Books @catssimsbooks.bsky.social
I have only one disagreement. The researchers call LLM output soft bullshit because there is no intent behind it. But I think you have to look beyond that, at the intent of those who created the LLMs. To them, getting human-like output was more important than getting accurate output.