conartist6 8 hours ago [-]
How can something with no morals, compassion, love, or loyalty "go rogue". Seems like a contradiction in terms...
They say you shouldn't anthropomorphize the lawnmower, and I think that's what's being done with this story.
alex_c 8 hours ago [-]
‘Rogue’ hammer loses control and smashes carpenter’s thumb.
Cpoll 3 hours ago [-]
It's more like 'rogue' industrial robot arm loses control and smashes operator's skull. (There's a reason those robots have hazard markings and physical shutdown switches).
maxerickson 7 hours ago [-]
It would not be that strange to call a piece of mobile machinery that was out of control "rogue".
Probably a worse choice than simply calling it out of control, but not that strange.
cwillu 7 hours ago [-]
Because a system being in an unstable attractor of behaviour requires none of those things.
It is at least as much of a mistake to reason about these systems the way we reason about a misbehaving compiler as it is to reason about them as if they were conscious beings. At least the latter mistake (which is more or less forced on us by the lack of appropriate language) does not create the illusion that these behaviours are mere bugs and misspecifications, fixable by applying a chipper junior developer to the task.
conartist6 7 hours ago [-]
Makes me think of a hierarchy of alien-ness that Orson Scott Card used in Speaker for the Dead
“Styrka, Plikt, let me put you another case. Suppose that the piggies, who have learned to speak Stark, and whose languages some humans have also learned, suppose that we learned that they had suddenly, without provocation or explanation, tortured to death the xenologer sent to observe them.”
Plikt jumped at the question immediately. “How could we know it was without provocation? What seems innocent to us might be unbearable to them.”
Andrew smiled. “Even so. But the xenologer has done them no harm, has said very little, has cost them nothing-- by any standard we can think of, he is not worthy of painful death. Doesn't the very fact of this incomprehensible murder make the piggies varelse instead of ramen?”
Now it was Styrka who spoke quickly. “Murder is murder. This talk of varelse and ramen is nonsense. If the piggies murder, then they are evil, as the buggers were evil. If the act is evil, then the actor is evil.”
Andrew nodded. “There is our dilemma. There is the problem. Was the act evil, or was it, somehow, to the piggies' understanding at least, good? Are the piggies ramen or varelse? For the moment, Styrka, hold your tongue. I know all the arguments of your Calvinism, but even John Calvin would call your doctrine stupid.”
If indeed we need language for this, it would seem to me that AI is "varelse".
thunkle 6 hours ago [-]
Wait. Cursor had access to the production DB???
kstrauser 5 hours ago [-]
I mentally replace “AI agent” with “intern” when I read this stuff and it helps clarify the root cause. People are connecting directly into prod and making changes live? It’s not (solely) the fault of the actor, but of the whole process that makes it possible for the event to happen. It may be the case that there’s a break-glass situation where a specific person needs prod access to fix an urgent thing. In that case, there needs to be an approved plan like “I’m connecting to this DB, making this query to find the affected row, then running this one to fix it”.
If it makes you shudder to imagine allowing an intern to do a thing, you should shudder harder to imagine letting an AI — an intern who can type really fast — do it.
I work in AI. I love using AI. I don’t want to go back to not using AI. But darned if I’m letting anyone, human or AI, just waltz into a prod environment and make random changes.
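A minimal sketch of that "approved plan" idea: every prod statement has to be declared and signed off before it can run, whether the actor is an intern or an agent. All names here (BreakGlass, the sample query) are made up for illustration; this is not from the incident, and a real version would wrap an actual DB connection.

```python
from dataclasses import dataclass, field

@dataclass
class BreakGlass:
    """Hypothetical gate: only pre-approved statements may touch prod."""
    approved: set[str] = field(default_factory=set)

    def approve(self, sql: str) -> None:
        # In practice a second person reviews and signs off on the
        # exact statement before the break-glass session starts.
        self.approved.add(sql.strip())

    def run(self, sql: str) -> str:
        if sql.strip() not in self.approved:
            raise PermissionError("statement was not pre-approved")
        # A real implementation would execute against the database here.
        return f"would execute: {sql.strip()}"

session = BreakGlass()
session.approve("SELECT id FROM orders WHERE id = 42")
print(session.run("SELECT id FROM orders WHERE id = 42"))
```

The point isn't the wrapper itself but the process shape: the allowed actions are enumerated up front, so "make random changes" simply isn't an available move.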
ChiperSoft 5 hours ago [-]
It didn't have access to any DB. In short: it went looking in the codebase for a credential to manage the staging environment, found a testing credential unrelated to anything it was doing (one the devs didn't know had permissions to administer anything), and used that to delete the wrong DB.
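The cheap first step against that failure mode is just knowing which connection strings live in the codebase at all, so their permissions can be audited. A toy sketch; the regex and the sample string are illustrative, not from the incident, and real secret scanners (e.g. gitleaks-style tools) use far richer rule sets.

```python
import re

# Matches connection-string-shaped secrets like postgres://user:pass@host/db.
# Illustrative only: covers two URL schemes, not every credential format.
CRED_PATTERN = re.compile(r"(?:postgres|mysql)://[^\s'\"]+")

def find_credentials(text: str) -> list[str]:
    """Return every connection-string-like match found in the given text."""
    return [m.group(0) for m in CRED_PATTERN.finditer(text)]

sample = 'DB_URL = "postgres://test:hunter2@db.internal/app"'
print(find_credentials(sample))  # → ['postgres://test:hunter2@db.internal/app']
```

Scanning only finds the credential; the other half of the fix is checking what each found credential is actually allowed to do, which is exactly what nobody had done here.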
sharts 4 hours ago [-]
So what? If you have backups and disaster recovery in place then this should be a nothing burger.