From a non-tech person’s point of view, the AI discourse went very quickly from “there’s a 78% chance this will end civilization” to “we’re installing this in all dishwashers, stepladders, social sciences research, and novellas and you will be so so grateful”
Feels like there's a fairly massive disconnect between corporations that think adding "AI" to every product will make them more money and consumers that would really prefer a dishwasher that didn't chirp like an astromech droid or cost hundreds to repair.
They were only terrified of Roko’s Basilisk when they thought /their/ infinite future digital clones were on the line; things have changed now that they know they can push a button that tortures other actual people and makes a $50bn valuation fall out of the VC gacha game in return.
My theory remains that the entire "look at how responsible we're being calling for regulation to prevent AI from becoming sentient and taking over" was an intentional attempt to distract regulators from the immediate real problems of reproducing bias, mass copyright violation, energy use...
From the perspective of someone who has a basic understanding of programming but also isn't actually in tech this to me seems like a rand() function with extra steps
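For what it's worth, the quip isn't far off as a caricature: the decoding step of a language model really is a weighted random draw over possible next tokens. A minimal sketch, with an entirely made-up three-word vocabulary and made-up probabilities:

```python
import random

# Toy illustration of the "rand() with extra steps" quip:
# a model's decoding step is, at bottom, a weighted random draw.
# The vocabulary and probabilities here are invented for the example.
next_token_probs = {"the": 0.5, "a": 0.3, "dishwasher": 0.2}

def sample_next_token(probs, rng=random):
    """Pick one token, weighted by its (hypothetical) model probability."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

token = sample_next_token(next_token_probs)
```

The "extra steps" are everything that produces those probabilities in the first place, but the final pick is still a dice roll.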
AI was never going to end civilization, but the marketing of it as such distracted policy-makers long enough to establish a new consensus where, inter alia, copyright was largely abrogated for automatic ("mechanical") works, offloading one of the largest problems from manufacturers onto AI users.
I know tech stuff is always shoved out on the market before it's ready but like, we can see this failing in real time. Court cases based on AI surveillance are being thrown out, and even McDonald's is already abandoning their AI drive-thru stuff, because even aside from the ethics the tech isn't there
That's more or less what the "this will end civilization" people predict. "This will make everything incredibly convenient and prosperous until the moment before it wipes us out" is their model.
I wonder which is a stronger driving force:
a) it's an easy gimmick to convince dumb people with money to fund you
b) AI people are so desperate for their bubble to not pop that they're forcing it in every market possible to sustain their bullshit a little longer
People found that adding "AI" to a pitch increased chances of successful funding.
Like my dog realizing he gets a treat every time he goes outside and not just when he needs to go outside.
Hit that pattern to the beat