Reposted by β
beagle 🐱
I got pulled over the other day for the first time in years and this cop was dressed like she was about to defuse a bomb in a war zone and she acted like we were in a high stakes hostage situation, like what the fuck, man, we are in a Starbucks parking lot in Hillsboro Oregon.
6 replies
19 reposts
157 likes
national assemblies when? ça ira! or... maybe not?
give it another century of rot?
what're our children's futures worth anyways?
a few more decades of 1200+ people executed by police each year
is it really rape if it happens to a prisoner?
can you really justify the lives of disabled people?
0 replies
1 reposts
4 likes
people will tell you to vote for the lesser of two evils and then not admit that they're voting for an evil
"he's a good man"
bullshit, admit you're voting for genocide and police terror
pokemon go to the fucking polls, but admit your complicity out loud to yourself first you coward
1 replies
7 reposts
15 likes
before i shuffle off this mortal coil,
all i want, really,
is to have been there fourteen thousand years ago when the glacial dams broke and the columbia river gorge was torn into the landscape by 80mph floods
1 replies
4 reposts
21 likes
the beastie boys known to let the bag...
MMMMM
DROP 💫
0 replies
3 reposts
22 likes
I just realized some of you haven't heard Night On Disco Mountain, the disco cover of Night On Bald Mountain
#musicsky
1 replies
7 reposts
22 likes
big fan of genders
you want to be a way?
in front of people?
fascinating
1 replies
7 reposts
22 likes
"if it's out of the bag, then it's out of the bag"
treat yourself to this music video if you haven't seen it
#musicsky
2 replies
1 reposts
4 likes
i wish more people would comprehend the totality of existence.
2 replies
5 reposts
23 likes
"i wish to leave this lab of brains swishing in jars
and write poems that shatter glass with undeniable bodies"
by the time the music comes in you forget it was missing
#poetry #rap #music
0 replies
1 reposts
0 likes
waking up in the middle of the night in a cold sweat remembering that in 2020 i started building a text editor in python
the void calls us sometimes, i do not know why
0 replies
1 reposts
1 likes
stop ranking things
if you see two things, or have two foods, or think two thoughts,
experience them as an unordered whole
4 replies
17 reposts
61 likes
so many people going gentle into that good night
no one talks about this
we need to have a real conversation about the dying of the light rn, i swear to god
0 replies
5 reposts
23 likes
tbh it seems like most people aren't accounting for the inexpressible when being communicated with
0 replies
1 reposts
2 likes
how to fix, how to bring order(ing) to additive anarchy?
we need Positional Encodings
ooooh, that's a good idea! tell the network WHERE things are! genius!
imagine like, if 'dog' meant something different depending on whether it came before or after 'am'
got it? good
fuck it's hot out today
0 replies
1 reposts
0 likes
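a tiny python sketch of the idea, assuming the classic sinusoidal encoding from the original transformer paper and a made-up 4-dim embedding for 'dog' (real models use thousands of dims, and some learn their encodings instead):

```python
import math

# classic sinusoidal positional encoding (sin on even dims, cos on odd)
def positional_encoding(pos, dim=4):
    return [math.sin(pos / 10000 ** (i / dim)) if i % 2 == 0
            else math.cos(pos / 10000 ** ((i - 1) / dim))
            for i in range(dim)]

dog = [0.2, -0.5, 0.1, 0.9]  # made-up embedding for 'dog'

# add the position vector to the token vector:
# the same token at positions 0 and 3 now looks different to the network
dog_at_0 = [e + p for e, p in zip(dog, positional_encoding(0))]
dog_at_3 = [e + p for e, p in zip(dog, positional_encoding(3))]
print(dog_at_0 != dog_at_3)  # True
```

now the unordered bag carries WHERE along with WHAT, so the additive anarchy has an ordering baked in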
i've described a very simple mechanism, which relies on commutative operations like addition. that means that the order in which things happen is irrelevant
1+3 = 3+1
self_attention("i am a dog") = self_attention("dog a i am")
wuh oh! if your goal is to know what comes next, order matters a lot!
1 replies
1 reposts
0 likes
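a python sketch of that wuh oh, with hypothetical 2-dim token vectors: the bag-of-words sum is commutative, so reshuffling the sentence changes nothing

```python
# made-up tiny embeddings; real ones have thousands of dimensions
vecs = {"i": [1.0, 0.0], "am": [0.0, 1.0], "a": [0.5, 0.5], "dog": [2.0, -1.0]}

def bag_sum(tokens):
    # add up the token vectors, one at a time, in the order given
    out = [0.0, 0.0]
    for t in tokens:
        out = [o + v for o, v in zip(out, vecs[t])]
    return out

# order of addition doesn't matter, so both sentences collapse to one bag
print(bag_sum("i am a dog".split()) == bag_sum("dog a i am".split()))  # True
```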
fun fact
transformers (the mathematical architecture of large language models) don't care about the order of the words they read in
that is, the "self attention" that powers transformers is a function that takes in an unordered bag of words
hoops have to be jumped through to fix this
1/thread
2 replies
3 reposts
6 likes
looks so good 🤤
0 replies
0 reposts
1 likes
great questions, i had to walk my dog and smoke some weed and think about it
i think there's a few things involved in knowing what "finished" is + how it can converge given the randomness
- it's a guided process
- it was born in the randomness (molded by it)
- the guidance is simple
1 replies
1 reposts
1 likes
angela collier is great, you should subscribe to her channel and watch all of her videos
0 replies
1 reposts
0 likes
if you're straight and want to be an 'ally'
start advocating for the physical defense of homeless people from city employees
from those armed with guns or with garbage trucks
they're the ones performing the physical acts which end their lives
and we are paying for it
0 replies
1 reposts
3 likes
some werner herzog shit going down on my patio
1 replies
1 reposts
3 likes
is it ok if i talk about how machine learning models work/learn? is that allowed?
i think you have questions about how the math or the technology works and no one around you knows the answers
ask me anything and i'll ELI5 it
0 replies
1 reposts
4 likes
since it does this bit by bit, the randomness doesn't interfere, so much as shape the landscape through which it will move towards being like the prompt
/end dump
0 replies
0 reposts
2 likes
an old-fashioned convolutional network like UNet (what SD 2.0 uses), is to move the image's embedding point towards the prompt's embedding point, by whatever means your particular neural network is capable of
1 replies
0 reposts
1 likes
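a cartoon of that in python, not a real sampler: made-up 3-dim embedding points, and a loop that nudges the image's point toward the prompt's point a little at a time

```python
import random

random.seed(0)
prompt_emb = [1.0, -2.0, 0.5]                        # made-up target point
img_emb = [random.uniform(-3, 3) for _ in range(3)]  # born in the randomness

# guided process: each step moves the image point 10% closer to the prompt
for step in range(50):
    img_emb = [x + 0.1 * (t - x) for x, t in zip(img_emb, prompt_emb)]

dist = sum((t - x) ** 2 for t, x in zip(prompt_emb, img_emb)) ** 0.5
print(dist < 0.1)  # True: wherever the noise started, it ends up near the prompt
```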
qualities
then you have 2 points in a big space
these two other models (model? idk tbh) combine language and images, and were trained in tandem, from all the images and their alt text or whatever annotation is available
then the job of your diffusion model, whether it's a vision transformer or
1 replies
0 reposts
1 likes
i'm not super deep into image models, but my understanding is that the image, at whatever step, is put through a different model, which embeds it into a big vector that's supposed to represent its essential qualities
then your prompt is put through a different model, which embeds its essential-
1 replies
0 reposts
1 likes
i feel like one of the first facts that school should teach you is that information is not contained within you if it never physically made its way to you at some point in the past
then they should explain the physical ways information reaches you
(quantum mechanics)
0 replies
0 reposts
1 likes
to be clear, i've simplified things
real networks have tons of sublayers that perform normal neural network functions - transforming, projecting, etc
but the foundation for those layers to learn interesting things is based on this unordered self attention mechanism made from commutative operations
1 replies
1 reposts
0 likes
this is perhaps simpler than it sounds
for each token (each position in the input sequence), we take the entire input sequence, weight each token vector by its score in the table, and add the numbers up
like your words are going to visit the other words they're fond of
but! this is the problem!
1 replies
1 reposts
0 likes
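here's that visit-and-sum for just the token 'dog', in python, with hypothetical vectors and a hand-written score row (pretend the scores came out of the dot-product table):

```python
# made-up 2-dim token vectors
vecs = {"i": [1.0, 0.0], "am": [0.0, 1.0], "a": [0.5, 0.5], "dog": [2.0, -1.0]}
order = ["i", "am", "a", "dog"]

# made-up scores: how fond 'dog' is of each token in the sentence
scores_for_dog = [0.1, 0.56, 0.04, 0.3]

# weight every token vector by its score, then add them all up
out = [0.0, 0.0]
for score, tok in zip(scores_for_dog, order):
    out = [o + score * v for o, v in zip(out, vecs[tok])]
print(out)  # the new vector for 'dog': a blend of everyone it visited
```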
diffusion models are neat, what kinds of questions are bouncing around in your head?
1 replies
0 reposts
0 likes
let's back up:
we're heading toward understanding self attention
sentences become tokens and tokens become lists of numbers
to predict the next token, it relates all the previous tokens with each other
we use the dot product as a loose way to determine how much tokens relate to each other
1 replies
1 reposts
0 likes
the shape matters because self attention outputs the same shape of data as it takes in
with N tokens, E dimensions per token, you have NxE floating point numbers come into the self attention layer, and NxE come out the other end
to get there, we combine the inputs based on the scores in the table
1 replies
0 reposts
0 likes
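in python, with a made-up N=4 by E=2 input (real E is in the thousands) and a made-up NxN score table:

```python
# NxE input: 4 tokens, 2 dims per token
X = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [2.0, -1.0]]

# NxN score table (pretend it came from the pairwise dot products)
S = [[1.0, 0.2, 0.5, 0.1],
     [0.2, 1.0, 0.3, 0.6],
     [0.5, 0.3, 1.0, 0.4],
     [0.1, 0.6, 0.4, 1.0]]

# each output row is the score-weighted sum of all the input rows
out = [[sum(s * X[j][e] for j, s in enumerate(row)) for e in range(2)]
       for row in S]
print(len(out), len(out[0]))  # 4 2: NxE in, NxE out
```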
now we're set up to understand the unordered bag of words problem
for each pair of tokens in "i am a dog", we compute the dot product
this gives a table of scores. entries near 0 indicate the network learned that those 2 tokens don't relate to each other much
N tokens becomes an NxN table
1 replies
0 reposts
0 likes
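the table for "i am a dog" in python, with hypothetical 2-dim vectors (a trained network's vectors would put the big scores on genuinely related pairs):

```python
# token paired with a made-up 2-dim vector
X = [("i", [1.0, 0.0]), ("am", [0.0, 1.0]),
     ("a", [0.5, 0.5]), ("dog", [2.0, -1.0])]

# NxN table: one dot product per pair of tokens
table = {(a, b): sum(x * y for x, y in zip(va, vb))
         for a, va in X for b, vb in X}

print(table[("a", "dog")])  # 0.5: these two relate a little
print(table[("i", "am")])   # 0.0: near 0, barely related
```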
the good news is, the angle_between function is built on the dot product, and the dot product is
1. a good proxy for angle
2. super fast in all dimensions
3. easy to understand
dot(a, b) = a1*b1 + a2*b2 + a3*b3 + ... + a8000*b8000
bigger dot products = more related
dot(dog, am) = 0.56 here
1 replies
1 reposts
0 likes
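the formula in python, shrunk to 4 dims, with different made-up numbers than the 0.56 above:

```python
# hypothetical 4-dim vectors standing in for 'dog' and 'am'
dog = [0.4, 0.3, 0.5, 0.2]
am  = [0.5, 0.2, 0.3, 0.4]

# dot(a, b) = a1*b1 + a2*b2 + ...: one multiply per dimension, then sum
d = sum(a * b for a, b in zip(dog, am))
print(round(d, 2))  # 0.49

# a vector pointing the same way as 'dog' scores even higher
dog_ish = [0.8, 0.6, 1.0, 0.4]
print(sum(a * b for a, b in zip(dog, dog_ish)) > d)  # True: more aligned = bigger
```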