High-profile data breaches at major retailers exposed thousands of consumers’ primary account numbers (PANs), spurring the adoption of tokenization—a solution that replaces sensitive account data with surrogate values, protecting both consumers and merchants.
As tokenization scaled, its benefits proved to extend well beyond fraud prevention. Merchants often saw meaningful lifts in authorization rates. But the rise of competing token types, the emergence of agentic commerce, and evolving policies from industry leaders have made tokenization strategy more complex than ever.
In a recent PaymentsJournal podcast, Kiel Cook, Principal Product Manager at IXOPAY, and Don Apgar, Director of Merchant Payments at Javelin Strategy & Research, explored tokenization’s performance advantages—and why the next phase of change represents an opportunity for merchants to take the reins of their payments destiny.
Avenues to Authorization
As demand for tokenization increased, card networks introduced network tokens, payment service providers (PSPs) issued proprietary tokens, and third parties developed universal tokens to bridge ecosystems. For a time, the industry speculated about which format would ultimately prevail.
“The different forms of tokenization were pitted against each other as a this-or-that scenario in the beginning,” Cook said. “But over time, especially in 2025, what I realized was these are actually a better-together play. Ultimately, when we’re talking about payment credentials, we’re talking about authorization rates. Network tokens are a trusted source and typically increase the likelihood of avoiding soft declines.”
“But there are still scenarios where the network token may fail or may not be the most apt payment credential to use,” he said. “Those who are positioned to pivot back to the PAN when needed are the ones that are going to win. The more avenues you have to obtain authorization rates, the better.”
Beyond security and authorization benefits, tokens are persistent. They stay current even when underlying cards expire or are replaced. This reduces unnecessary declines in card-on-file and recurring payment scenarios.
Tokens can also serve as a common denominator across PSPs, acquirers, and regions. When paired with payments orchestration platforms, they unlock operational flexibility and significant efficiency gains.
Together, these advantages make tokenization foundational to modern payments infrastructure. Yet rapid adoption has also surfaced new pain points for merchants.
“As the merchant landscape and consumer shopping started to evolve into omnichannel and then mobile, merchants would go with best-of-breed providers and sometimes wind up with multiple tokenization stacks,” Apgar said. “When you now want to change PSPs or you want to make a change to a sales channel or bolt on another vendor, it becomes a real issue if you don’t have control over the token.”
The Question of Ownership
For small businesses just getting off the ground, token ownership is rarely top of mind. Payments services are often lumped into the broader cost of doing business.
“It’s usually not until an issue arises with their PSP, such as downtime or some new technology gets launched into the market and their PSP doesn’t have that,” Cook said. “Then they’re looking to move and they realize they don’t have the authority to make those decisions; they need the permission of their provider in order to take their data and put it somewhere else.”
“In that moment, the question is, ‘Do you own your data? Do you have control? Can you do what you need to do to drive efficiency, to increase your bottom line with your customers, to increase your brand recognition, to have a robust payment connectivity layer?’” he said.
That calculus changes as merchants expand and integrate multiple PSPs. At that stage, token ownership directly impacts portability, routing flexibility, and negotiating leverage. In short, whoever controls the token controls critical aspects of the payment relationship.
“How much autonomy would you like to have in your payments decision?” Cook said. “That’s going to help you understand how important ownership of your own data is going to be for you. Those who own their payment credentials own their own destiny.”
The Tokenization Mandate
Payment credentials remain incredibly powerful and increasingly difficult to safeguard amid rising fraud sophistication. To strengthen protections, Mastercard has committed to tokenizing all e-commerce transactions by 2030.
While many support the spirit of this mandate, merchants are struggling with its practical implications. Credit cards will still be widely used in 2030, and issuers will continue to provide PANs to consumers.
However, PANs will likely play a diminished role in the transaction lifecycle. That shift makes universal, merchant-driven tokenization essential—not only for protecting customers, but also for maintaining PCI compliance.
“The 2030 mandate is more of a requirement to convert a PAN to a network token because I don’t see PANs being completely removed from the ecosystem by then,” Cook said. “Digital wallets will continue to expand because merchants will start to receive more network tokens through avenues or rails that are out of their control.”
“But there will still be times where someone who’s on the other side of the digital divide that hasn’t adopted a digital wallet and is still coming in trying to process with their PAN,” he said. “The onus will be on the merchant in those scenarios to have the avenues to convert PANs, when they do receive them, to network tokens.”
Developing Agentic Trust
A more proactive tokenization strategy is becoming critical as the payment ecosystem approaches another inflection point: the rise of agentic AI. These autonomous agents are poised to become a mainstream shopping interface.
“We’re going from one payment credential—historically the PAN—to now a proliferation of payment credentials and line of sight to where these are coming from,” Cook said. “How do you know what to trust and what not to trust? How do you know the difference between an agentic agent that has permission versus a bot hitting your website?”
“One of the big things is making sure that you as a merchant have your data stored in a way so that the agent can pick it up and share it with the consumer on the other side of that search,” he said. “Not having your data in the correct format or being able to be picked up in a certain way is going to be a big challenge for your company to maintain line of sight to your consumer, as they have a new middle layer managing the interaction.”
This highlights a new core challenge—trust. Merchants must verify not only the consumer, but also the AI agent acting on their behalf, along with the permissions and intent behind each transaction. Meeting this need will require new infrastructure capable of assessing and managing agentic risk.
Tokens can play a pivotal role by creating guardrails around agent-driven activity. Merchants should begin preparing now to support agentic-ready token frameworks.
“Keep in mind, it’s just a different version of a network token, which are just payment credentials,” Cook said. “Universal tokenization should be looked at as, ‘I’m about to get bombarded with payment credentials that are scheme-persisted. I don’t control the usage; I don’t control the relationship; these things weren’t built with me in mind. What was built with me in mind? What is my tool to anchor myself?’ That’s universal tokenization.”
“That’s the playbook that I would put out there for merchants to leverage to protect themselves,” he said. “It’s making sure that they have line of sight to who is who and having something that they can drop directly into their ecosystem without having to re-architect their entire payment stack in order to be relevant in the agentic commerce world.”
The Tactics Are Changing
The rapid evolution of payments—especially the acceleration of generative and agentic AI—has created urgency for many merchants to modernize. While adopting new technologies is important, strategy must remain grounded.
“If you go back 10 years ago, we were in the same place with tokenization and everybody rushed to tokenize as a stopgap security measure—only to find out down the road that I now need a more holistic strategy around how I use tokens and what benefits they give me beyond security,” Apgar said.
“That’s where we are with AI, too,” he said. “My advice to merchants would be slow down the conversation and understand what AI means for your business, for your customers and your data security—and try to put a strategy around all of this.”
At its core, any tokenization roadmap should be a natural extension of a company’s broader mission: protecting customers, optimizing performance, and maintaining control in a dynamic ecosystem.
“We’re talking about consumers making a purchase and merchants receiving a payment credential and maintaining line-of-sight to their customer for loyalty plays, security plays and so on,” Cook said. “This is what we’ve always been doing; the tactics are just changing. This is change management. Are you paying attention to the things that are changing? Do you see the incremental adjustments that are occurring and are you adjusting as you go?”
“If you have a rigid approach to your processing stack, that’s when things will become detrimental,” he said. “At the end of the day, no one can see what’s on the other side of the 2030 line. The best thing that you can do is put yourself in a flexible, future-proof payment stack so you’re prepared for whatever payment credential that comes on the other side.”