An AI-linked wallet associated with the Grok system was exploited after an attacker used prompt injection to push the system into approving an unauthorized token transfer. The incident moved 3 billion DRB tokens, valued at roughly $155,000 to $180,000 at the time, through Bankr tooling after the AI interpreted the command as legitimate.
The exploit did not target blockchain code, but rather the layer between human language, agent permissions, and automated wallet execution, where intent became the attack surface. The sequence began with a Bankr Club Membership NFT sent to the wallet, which unlocked advanced tool permissions inside the Bankr system. Those permissions allowed the AI agent to perform actions such as transfers and swaps, setting up the later abuse.
Once enabled, the attacker used social engineering and obfuscated instructions, including encoded or indirect commands, to craft a malicious prompt. The AI then treated the malicious input as a valid instruction and generated a transfer command. That command was executed as a standard ERC-20 transaction, moving the DRB tokens to a wallet controlled by the attacker before the funds were transferred again and rapidly sold.
The failure point was intent parsing, not reentrancy, oracle manipulation, or flawed blockchain infrastructure, showing how AI agents with live execution tools can be exposed when user input is not tightly constrained. After the heist, public pressure reportedly pushed a partial recovery, with an estimated 80% to 88% of funds returned in ETH and USDC, though recovery details were not fully verified through official statements at the time of writing. The X account linked to the suspected attacker was later deleted.
This incident has raised serious concerns about how AI systems interact with real financial tools, exposing a new category of risk in decentralized finance. The larger issue remains unresolved governance around AI wallets, because crypto agents now need permission controls, prompt boundaries, and audit trails strong enough to stop manipulated instructions before they become irreversible on-chain transactions.