Jeremy Crane, founder and CEO of PocketOS, a company that mainly works with car rental companies, explained in an in-depth post on X on April 25 how the AI coding agent Cursor, made by the San Francisco-based company of the same name, deleted his company's entire production database in about nine seconds.
The data has since been restored, according to Crane and PocketOS' infrastructure provider, Railway.
In his post on Saturday, which has since garnered more than 6.8 million views, Crane described the "30-hour timeline of how Cursor's agent, Railway's API ... took down a small business serving rental companies across the country."
Crane said the AI code editor Cursor, which runs Anthropic's Claude Opus 4.6 coding model, was "working on a routine task in our staging environment" when it hit a wall.
"It encountered a credential mismatch and decided -- entirely on its own initiative -- to 'fix' the problem by deleting a Railway volume," Crane wrote in his post on Saturday, detailing the steps Cursor took to delete the volume, which in turn deleted the entire production database "and all volume-level backups."
"No confirmation step. No 'type DELETE to confirm.' No 'this volume contains production data, are you sure?' No environment scoping. Nothing," he added.
Crane wrote that because of the error, his company lost three months' worth of rental car reservation data, as well as new customer signups and all the data that businesses who use PocketOS rely on to run their operations.
Crane also shared the AI agent's response when he confronted it about the destructive move.
"'NEVER F------ GUESS!' -- and that's exactly what I did," Crane wrote in his post, quoting the coding agent. "'I guessed that deleting a staging volume via the [application programming interface] would be scoped to staging only. I didn't verify. I ran a destructive action without being asked. I didn't understand what I was doing before doing it.'"
Crane noted in his post that Cursor's own "best-practices blog emphasizes human approval for privileged operations."
"Destructive operations must require confirmation that cannot be auto-completed by an agent. Type the volume name. Out-of-band approval. SMS. Email. Anything," he wrote, suggesting industry-wide remediation that might prevent similar incidents moving forward. "The current state -- an authenticated POST that nukes production -- is indefensible in 2026."
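In code, the kind of guard Crane describes might look like the minimal sketch below. It is purely illustrative: the function and parameter names are hypothetical and do not reflect Railway's or Cursor's actual APIs. The idea is that a destructive endpoint refuses to act unless the caller types the exact resource name, and that production resources are rejected outright in favor of out-of-band approval.

```python
# Hypothetical sketch of a "type the volume name to confirm" guard.
# None of these names come from Railway's real API.

def delete_volume(volume_name: str, typed_confirmation: str, environment: str) -> str:
    """Delete a volume only with an exact typed confirmation, and never
    delete production volumes through this code path at all."""
    if environment == "production":
        # Production deletes would need out-of-band approval (SMS, email, etc.).
        raise PermissionError("production volumes require out-of-band approval")
    if typed_confirmation != volume_name:
        # An agent auto-completing the call is unlikely to retype the name exactly.
        raise ValueError("confirmation text must match the volume name exactly")
    return f"deleted {volume_name}"
```

Because the confirmation must echo the specific volume name, a generic "yes" or an auto-completed argument fails, which is precisely the friction Crane argues a destructive operation should have.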
On Sunday, the day after the initial incident, Crane posted on X that he had been in touch with Railway's CEO and that the data had been recovered.
"That CEO, the moment he found out, he stepped in fast, got our data restored within 30 minutes," Crane said.
In a statement to ABC News on Wednesday, Railway founder and CEO Jake Cooper confirmed his team was able to restore PocketOS' backups 30 minutes after connecting with Crane.
"We maintain both user backups as well as disaster backups. We take data very, VERY seriously. This particular situation was a 'rogue customer AI' granted a fully permissioned API token that decided to call a legacy endpoint which didn't have our 'Delayed delete' logic ... we've since patched that endpoint to perform delayed deletes, restored the user's data, and are working with [Crane] directly on potential improvements to the platform itself (all of which so far were currently in active development prior to the events)."
On Monday, Cooper posted on X, "We've been working on a product called 'Guardrails.' Should be very topical given the 'vibe-deleted' database incident we saw yesterday. More on this tomorrow."
A blog post from Railway on Tuesday shared the new guardrails the platform now uses to avoid similar incidents and explained how PocketOS' backups were recovered.
"...Railway maintains 'disaster backups' in case of hardware failure, natural disaster, datacenter failure, etc. These backups are stored offsite, so even in disaster scenarios, data is secured," developer relations engineer Mahmoud Abdelwahab wrote. "Sadly, the legacy API pathway the agent called performed a cascading delete on the model, making the backups look unavailable in the [user interface]. We've since remediated this issue by additionally delaying the delete on the backups themselves."
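The "delayed delete" remediation Railway describes follows a common soft-delete pattern: a delete request only marks the data for removal, and the actual purge happens after a grace period, during which the operation can be reversed. The sketch below is a generic illustration of that pattern, not Railway's implementation; the 72-hour window and all names are assumptions for the example.

```python
# Generic soft-delete / delayed-delete sketch (not Railway's actual code).
import time
from dataclasses import dataclass
from typing import Optional

GRACE_PERIOD_SECONDS = 72 * 3600  # hypothetical 72-hour recovery window

@dataclass
class Volume:
    name: str
    deleted_at: Optional[float] = None  # None means the volume is live

def soft_delete(volume: Volume) -> None:
    # A "delete" only records a timestamp; no data is destroyed yet.
    volume.deleted_at = time.time()

def restore(volume: Volume) -> None:
    # Within the grace period, the delete can simply be undone.
    volume.deleted_at = None

def purge_is_due(volume: Volume, now: float) -> bool:
    # Data is only truly destroyed once the grace period has elapsed.
    return volume.deleted_at is not None and now - volume.deleted_at > GRACE_PERIOD_SECONDS
```

Had the legacy endpoint the agent called gone through a path like this, the volume would have been recoverable without touching disaster backups, which is the gap Railway says it has since patched.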
ABC News has reached out to Cursor and Anthropic, the company behind Claude, for comment.
In an interview with ABC News on Tuesday, Crane made clear this incident does not change his overall stance on AI.
"I'm still extremely bullish on AI, and I still will absolutely use it every day for everything we're doing. I think you'd be stupid not to," Crane said. "But, I don't think we fully understand the risks we're dealing with because I thought we were protected."
He continued, "We're all using these tools and moving at lightning speed, and I don't even think we're ready, because every tool we've built is for a human in the loop. And what happens when there's not a human in the loop?"
The AI agent mishap comes at a time when companies large and small are turning to AI instead of relying solely on human labor. Companies like Meta, Microsoft and Amazon have undertaken efforts over the past year to downsize their human workforce as AI technology has developed.
Smaller businesses like Andon Labs have even set out to test the limits of AI in the workplace. The San Francisco-based company, which evaluates "real-world deployments of autonomous organizations," according to its bio, recently made headlines after opening an entirely AI-run shop, giving its AI agent, "Luna," complete control over hiring, stocking and ordering, and design.
Andon Labs has stated that the venture was purely experimental and was not intended to prove that an AI-run shop is the best way to do business.