Surveillance Pricing: The cost of being known
"I have nothing to hide," you say. You use Venmo with a public feed, you pay for everything with a card, and your crypto transactions sit on a public blockchain for anyone to see. So what?
Here's the thing: no one was really looking. A human scrolling through blockchain explorers can barely make sense of the noise, and even a professional auditor struggles to.
That era is done. AI's ability to scan our data goes far beyond our comprehension. It ingests, correlates, and ultimately, uncovers our story. The infrastructure is already live.
Your payment history is a diary you didn't know you were writing
Blockchain analytics firms like Chainalysis, Elliptic, Nansen, and Arkham already use AI and machine learning to trace funds, cluster wallet addresses, and build behavioural profiles. Nansen labels over 300 million addresses.
The card payment world is worse because the surveillance is centralized, opaque, and happening behind closed doors rather than on a transparent ledger. (Read more on how card companies are tracking your data in our next article)
"Yeah but I'm not doing anything illegal, so I'm good"
Soon, catching criminals will be the smallest use of these products.
In crypto, the moment you touch a centralized exchange, when you on-ramp fiat into crypto or off-ramp back out, your wallet address gets tied to your verified identity through KYC.
From that single anchor point, AI can see your entire transaction history and track behaviour you did not even think to question.
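The anchor-point mechanic above can be sketched in a few lines. This is a hedged toy illustration, not any vendor's actual pipeline: it combines the classic common-input-ownership heuristic (addresses that co-spend in one transaction likely share an owner) with union-find clustering, and every address, transaction, and KYC label here is invented.

```python
# Toy sketch: cluster addresses by co-spending, then let one KYC'd
# address name the whole cluster. All data below is invented.
from collections import defaultdict

def cluster_addresses(transactions):
    """Union-find over addresses that appear as co-inputs of a transaction."""
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for tx in transactions:
        inputs = tx["inputs"]
        for addr in inputs:
            find(addr)                     # register every address
        for addr in inputs[1:]:
            union(inputs[0], addr)         # co-inputs -> same owner

    clusters = defaultdict(set)
    for addr in parent:
        clusters[find(addr)].add(addr)
    return list(clusters.values())

txs = [
    {"inputs": ["addr1", "addr2"]},  # co-spend links addr1 and addr2
    {"inputs": ["addr2", "addr3"]},  # ...which pulls in addr3 too
    {"inputs": ["addr9"]},           # an unrelated wallet
]
kyc = {"addr1": "alice@exchange"}    # one KYC'd on-ramp address

for cluster in cluster_addresses(txs):
    owner = next((kyc[a] for a in cluster if a in kyc), "unknown")
    print(sorted(cluster), "->", owner)
```

One verified exchange deposit is enough: the label attached to addr1 propagates to every address the heuristic groups with it.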
Now ask yourself who might want that profile.
Your government, obviously: tax authorities are already using blockchain analytics at scale. But also your employer running a background check. Your landlord deciding whether you're financially stable enough to rent. An insurance company assessing your risk profile. A vindictive ex. Amazon when deciding how much to charge you, specifically you, for toothpaste.
Understanding "Surveillance Pricing" will make you care about transaction privacy
If the profiling problem feels abstract, this next part will make it concrete.
We are entering the era of what the FTC calls surveillance pricing: the practice of using AI to sift through personal data and set individualized prices based on what an algorithm predicts you're willing to pay.
Not dynamic pricing, where everyone sees the same fluctuation based on supply and demand.
Personalized pricing, where two people looking at the same product at the same time see different numbers, because the system knows different things about each of them.
This is already happening.
The FTC launched an investigation into the practice in 2024 and published preliminary findings in early 2025. Delta Air Lines revealed it was testing AI-driven pricing through a partnership with Israeli startup Fetcherr, with plans to expand algorithmic pricing to 20% of its fares.
When the Delta president told investors they would use personal information to set an increasing share of prices, it triggered Congressional scrutiny and a letter from three U.S. Senators warning that the practice would push fares toward each customer's personal "pain point".
The AI models powering this don't just use your browsing history and location. They correlate shopping behaviour, like browsing time and abandoned carts, with other signals, such as your device type and whether your phone battery is low.
A Consumer Reports investigation found eggs priced at five different levels at a single Safeway location. Booking.com uses modeling to decide which users get special offers, reportedly driving a 162% increase in sales.
Here is where it gets really dangerous: we are on the doorstep of AI agents making purchases on our behalf.
When AI shops for you, privacy becomes everything
In February 2026, Coinbase launched Agentic Wallets: wallet infrastructure specifically designed for AI agents to autonomously hold funds, trade tokens, and make payments without human approval at every step.
Days later, MoonPay launched MoonPay Agents. Visa has introduced a Trusted Agent Protocol, PayPal and OpenAI are partnering on agent checkout, and Google's AP2 standard is gaining traction for both fiat and crypto agent payments.
The idea is simple: you tell your AI agent what you want, set parameters like a spending limit, and it handles the rest. Finding options, comparing prices, executing the transaction.
But think about what this means in a world of surveillance pricing with transparent financial data.
The merchant's AI knows your agent is coming. It can see your wallet's transaction history. It knows your financial behavior across months or years of on-chain activity.
It doesn't matter that you set a strict $800 budget for that laptop listed at $700: the merchant's pricing algorithm has already calculated that you'd actually pay $780 based on your on-chain profile, and it prices accordingly.
You "saved" $20 against your budget. You overpaid by $80 against what someone with a private financial history would have been offered.
Scale this across every purchase your agent makes. Groceries, subscriptions, services, digital goods. You're bleeding value constantly, invisibly, because your financial life is an open book.
Even with strict parameters, your agent is negotiating from a position of total informational disadvantage. The merchant's AI has perfect visibility into your financial profile. Your agent has whatever budget ceiling you set.
So yes, you do have something to hide. Now go hide it already.
AI blockchain analytics are becoming more sophisticated every quarter. Surveillance pricing is spreading across industries. AI agents are being given wallets and spending authority.
Your public transaction history is the single most valuable dataset for anyone who wants to take more of your money.
The common argument, "I'm just buying coffee or a memecoin, who cares?", misunderstands the nature of AI pattern recognition.
It's not about any single transaction. It's about the aggregate. Thousands of small transactions paint a detailed portrait of your income, your habits, your vulnerabilities, your vices, your schedule, and your willingness to pay.
You can't see the portrait by looking at individual strokes. AI can.
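To make the aggregate point concrete, here is a hedged sketch of how trivial individual purchases compose into a profile. The transactions, category tags, and the crude monthly extrapolation are all invented for illustration; real analytics pipelines do the same kind of aggregation at vastly larger scale.

```python
# Toy aggregation: individually boring transactions, a revealing whole.
# All data below is invented.
from collections import Counter
from statistics import mean

txs = [  # (hour of day, amount, merchant category)
    (8, 5.50, "coffee"), (8, 5.50, "coffee"), (9, 5.50, "coffee"),
    (12, 14.00, "lunch"), (13, 16.50, "lunch"),
    (22, 60.00, "bar"), (23, 45.00, "bar"),
    (18, 120.00, "groceries"),
]

profile = {
    # habits and vices: which categories dominate
    "top_habits": Counter(cat for _, _, cat in txs).most_common(2),
    # schedule: the hours you are out spending
    "active_hours": sorted({hour for hour, _, _ in txs}),
    # willingness to pay: typical ticket size
    "avg_ticket": round(mean(amt for _, amt, _ in txs), 2),
    # income proxy: crude weekly-to-monthly extrapolation
    "monthly_spend_estimate": round(sum(amt for _, amt, _ in txs) * 4, 2),
}
print(profile)
```

No single line in `txs` says anything about you. The dictionary at the end says quite a lot.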
Transaction privacy on crypto payments isn't a feature for criminals. It isn't a nice-to-have for the paranoid. It is becoming a basic financial necessity, the equivalent of not handing every store you walk into a complete copy of your bank statements before you start shopping.
The people who figured this out early will have paid fair prices, kept their financial profiles private, and maintained their negotiating power in an AI-mediated economy. Everyone else will wonder why everything seems to cost them a little more than it should.
You don't care about transaction privacy today. But your AI agent will wish you had.
Can your social media username accept crypto & card payments yet?