
Tica the Sloth

@ticasloth.bsky.social

It seems to me that human brains also use language by using "statistical word association." What the AI lacks is lived and sensory experience. It's hard to tell fact from fiction using language alone.

13 replies 1 repost 8 likes
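The "statistical word association" idea in the post above can be sketched as a toy bigram model (purely illustrative: the corpus is invented, and real LLMs use neural networks over far longer contexts, not raw bigram counts):

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "statistical word association":
# the model only sees which words tend to follow which.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count bigrams: for each word, tally what follows it.
following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def predict(word):
    # "Predict" the next word as the most frequent follower.
    return following[word].most_common(1)[0][0]

print(predict("sat"))  # "on" — the only word that ever followed "sat" here
```

Nothing in the counts "knows" what a cat or a mat is, which is roughly the gap several replies below point at.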


jadehawk @jadehawk.bsky.social

no. human brains connect signifiers to signifieds. LLMs, meanwhile, aren't programmed to even have either of those concepts

0 replies 0 reposts 2 likes


Dan @dandwiggins.bsky.social

That’s how we learn. It’s not how we produce output. It’s how LLMs do both.

0 replies 0 reposts 3 likes


Taka Hanazawa @takahanazawa.bsky.social

There's a long-standing argument about AI systems that they lack intentionality, and that intentionality is the only difference between us and them, but I don't think that is correct

0 replies 0 reposts 0 likes


LCatala @lcatala.bsky.social

LLMs work by pattern matching, predicting what comes next, but doing that to language *destroys meaning*. If you can always predict the next word, that means the next word isn't actually adding any new information. Meaning is on a tightrope between order and chaos, but LLMs look only for order.

0 replies 0 reposts 2 likes
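The information-theoretic point above can be made concrete with Shannon surprisal (a toy sketch, not a claim about how any real model scores words): a word's information content is -log2 of the probability the model assigned it, so a perfectly predicted word carries zero bits.

```python
import math

def surprisal(p):
    # Shannon surprisal: -log2(p) bits of information in an
    # event the model assigned probability p.
    return -math.log2(p)

print(surprisal(0.125))     # a 1-in-8-chance word carries 3 bits
assert surprisal(1.0) == 0  # a fully predictable word carries none
```

This is the standard information-theory framing of the "predictable words add nothing" claim; whether prediction therefore "destroys meaning" is the part the thread is arguing about.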


Audi HD @kh0rish.bsky.social

Wtf does that even mean. We have intent, and pick words to communicate that intent. It has nothing to do with statistics. There are detectable regularities because we create and use structure, but that is NOT how we function or how we create.

2 replies 0 reposts 59 likes


@rhymeswithmoose.bsky.social

And how much time have you spent studying subjects like cognition and language acquisition?

1 reply 0 reposts 0 likes


Netmouse @netmouse.bsky.social

We have some effects in brain development that use statistical word (/sound/sensation/sight/ and especially smell) association, but they are strongly affected by hormones and feelings, which help determine what is important to remember, and when/how to recall something.

0 replies 0 reposts 1 like


Kim Wincen @wincenworks.bsky.social

This operates on the assumption the default state of a human is a highly analytical adult, which is simply not true. The default state of a human is a baby who may never learn language if they are not actively taught piece by piece. LLMs learn by mass analysis, not incremental analysis.

0 replies 0 reposts 1 like


Kathryn Tewson @kathryntewson.bsky.social

It lacks reasoning capacity or a mind. It does not understand things. It just arranges text.

1 reply 0 reposts 29 likes


Señora Luna (she/ella) @suyang.bsky.social

That is one small component of how human brains learn/use language. Do you have any background in language acquisition? This is an extremely complex topic, and if it boiled down to statistical word association, far more people would be multilingual.

0 replies 0 reposts 2 likes


Daniel Goldman @dgoldman.bsky.social

Yep. That's what conditioning is about: shaping the basic patterns. Given a large base model, you probably don't even need to give it any kind of motivation function. But what's really needed is self-directed fine-tuning and internal dialogue (i.e. thinking)

0 replies 0 reposts 2 likes


Andy Craig @andycraig.bsky.social

Human brains do use language that way, it's why language in and of itself is much more statistically predictable than people realize when using it. But we use language to convey underlying thoughts and ideas. Without that, it's just a hollow wrapper.

0 replies 0 reposts 22 likes


Elizabeth Sandifer @elsandifer.bsky.social

Yes and no. The thing is that they *only* do statistical word association. What they lack isn’t just sensory experience. They have no framework for thinking about things as objects. And in terms of their design, that’s not a “yet”—it’s a problem we don’t even know how to start solving.

2 replies 1 repost 9 likes