
Dr. Damien P. Williams: Magus, Werewolf, Cyborg, Bi

@wolvendamien.bsky.social

You're not. As long as they're based on GPT-type architectures, they will have problems of this type, because this is how the Generative Pretrained Transformer system "Works"
www.americanscientist.org/article/bias...
youtu.be/9DpM_TXq2ws

2 replies 6 reposts 18 likes


Dan Turner @ddt.bsky.social

Thanks. That... seems important, as people like Altman try to drive it as the answer to everything? I'll dig into the paper. I was mostly thinking this morning about how all the work around it should hinge on use case thinking, but this complicates that.

1 reply 0 reposts 2 likes


Jamais Cascio @cascio.bsky.social

I've found that the most readily-understandable analogy for genAI text is next word prediction on your phone. It's not the exact same thing, but it operates on similar principles. Everyone knows how marginally-reliable the word prediction is, and can immediately grasp the similarities.
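The phone-keyboard analogy above can be made concrete with a toy sketch. This is a simple bigram frequency model, the kind of statistics early keyboard suggestion features used; it is only an illustration of the "predict the most likely next word" framing, not how a GPT-type model actually works (those use learned neural networks over subword tokens). The corpus and function names here are invented for the example.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows which in the training text."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows

def suggest(follows, word):
    """Return the most frequent continuation, like a keyboard suggestion."""
    candidates = follows.get(word.lower())
    if not candidates:
        return None
    return candidates.most_common(1)[0][0]

# Tiny made-up corpus for illustration.
corpus = "the cat sat on the mat and the cat saw the cat"
model = train_bigrams(corpus)
print(suggest(model, "the"))  # "cat" (follows "the" most often in the corpus)
print(suggest(model, "on"))   # "the"
```

The point of the analogy survives the simplification: the model has no notion of truth or intent, only of which continuations were frequent in its training data, which is why its suggestions are plausible-looking but only marginally reliable.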

1 reply 1 repost 9 likes