Tica the Sloth @ticasloth.bsky.social

They don't learn exactly like a human, but they do learn. They understand things. People keep moving the goalposts of what "understanding" means so they won't have to grapple with this. LLMs are not conscious, but they are intelligent.

4 replies 0 reposts 0 likes


Odie @theodeity.bsky.social

people keep moving the goalposts in the other direction to make it seem more advanced than it is. Yeah. If I give a program a set of words for colors, and ask it to give me a color, it will return a word from that set, and that is TECHNICALLY understanding. But does it know the concept of color? NO

2 replies 0 reposts 6 likes
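(For illustration, a minimal sketch of the toy program Odie describes, not code from the thread; the color words and function name are invented for the example.)

```python
import random

# The "understanding" Odie concedes: the program knows color words only
# as members of a set, with no concept of color behind any of them.
COLOR_WORDS = ["red", "green", "blue", "yellow"]

def give_me_a_color() -> str:
    # Returns a word from the set. That satisfies the request
    # "give me a color" without any notion of what color is.
    return random.choice(COLOR_WORDS)

print(give_me_a_color())  # e.g. "blue"
```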


Kingfisher & Wombat's avatar Kingfisher & Wombat @tkingfisher.com
[ View ]

But they DON’T understand. The LLM has no idea what red is, just how often that combination of letters is statistically likely to appear. It does not comprehend redness on any level except as a math equation.

2 replies 1 repost 71 likes


Jonathan @jginsburg.bsky.social

They really don't, they are a statistical frequency analysis engine: they take the input, compare it against the outputs tied to similar inputs, then amalgamate those outputs into a single output based on predetermined math. It doesn't understand the input or the output, just the statistics.

0 replies 0 reposts 0 likes
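(Again for illustration only: a rough sketch of the frequency-counting picture Jonathan paints, as a word-bigram model. The tiny corpus is invented, and real LLMs learn far richer representations than raw co-occurrence counts; this just makes the claim concrete.)

```python
from collections import Counter, defaultdict

# Toy corpus, invented for the example.
corpus = ("the rose is red the sky is blue "
          "the grass is green the fire truck is red").split()

# Count how often each word follows each preceding word.
follows: defaultdict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(prev: str) -> str:
    # Emit the statistically likeliest continuation. Nothing here
    # comprehends redness; it only tracks which combination of
    # letters most often appears next.
    return follows[prev].most_common(1)[0][0]

print(predict("is"))  # "red" -- the most frequent word after "is" above
```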