Why Tokenization is More Important Than Ever

Since its introduction in 2005, tokenization has become a mainstay of the payments industry. Despite its ubiquity for more than a decade, there is still no official standard mandating how tokenization is deployed or even what defines a token itself. EMVCo and PCI have both attempted to standardize tokenization over the past few years, but neither has succeeded. Even these two organizations can’t agree on what tokens should look like and how they should function.

There are several different payment security methods that perform a variety of functions, and not all of them can accurately be described as tokenization. “EMVCo tokenization” has become a topic of much debate in the payments industry; it can refer both to mobile payment tokenization (à la Apple Pay) and to card-based tokenization. I even saw one article a few months back that classed point-to-point encryption as a tokenization solution. It’s enough to make almost anyone’s head spin, especially those of us who know what tokenization really is and the specific problem it solves.

Normally, a little quibble over words wouldn’t be worth discussing, but confusion over the definition of a security solution is dangerous: it means merchants risk being led astray from the very tokenization solutions they need to secure their businesses. Point-to-point encryption as a token? Absolutely not. Tokenization was specifically designed not to be encrypted data, because encrypted data is, by definition, potentially decryptable.

Making Crucial Distinctions
Rather than encrypting data, tokenization replaces payment card data with a random, globally unique, alphanumeric value after bank authorization. As a result, the data stored in merchant systems has absolutely zero value outside of that environment. Tokenization works differently from encryption because each individual token is created when a transaction takes place, making it organically random, with no mathematical pattern to be unlocked. Tokens were designed never to maintain a one-to-one relationship with a card (although we later built additional secure technologies that allow tokenized merchants to track card usage for analytics).
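
As a rough illustration only, and not a description of any particular product, a per-transaction token service boils down to something like the following Python sketch; every name in it is hypothetical, and the mapping lives solely on the provider’s side.

```python
# Minimal sketch (hypothetical, not a production design): issuing a
# per-transaction token that has no mathematical relationship to the card.
import secrets
import string

_TOKEN_ALPHABET = string.ascii_uppercase + string.digits


class TokenVault:
    """Hypothetical server-side vault mapping tokens back to card data."""

    def __init__(self):
        self._vault = {}      # token -> card data, held only by the provider
        self._issued = set()  # used to guarantee global uniqueness

    def tokenize(self, pan: str, transaction_id: str) -> str:
        # A fresh random value is generated for every transaction, so the
        # same card produces a different, unrelated token each time.
        while True:
            token = "".join(secrets.choice(_TOKEN_ALPHABET) for _ in range(16))
            if token not in self._issued:
                break
        self._issued.add(token)
        self._vault[token] = {"pan": pan, "transaction_id": transaction_id}
        return token


vault = TokenVault()
t1 = vault.tokenize("4111111111111111", "txn-001")
t2 = vault.tokenize("4111111111111111", "txn-002")
assert t1 != t2  # same card, two unrelated tokens: nothing to "decrypt"
```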

Since tokenization was created to add security, a token should reference only a single transaction, not be linked to the card as a constant. This is where recent discussions go wrong: they mistake true tokenization for security features that are actually driven by EMVCo tokenization, such as mobile wallets. Although these services are referred to as tokenization, they aren’t tokenization at all. Instead, they are consumer-based token services that seek to protect the cardholder, not the merchant. That is a noble undertaking, but a slightly misguided one, since a token that always references the same card number does nothing more than create a new card number that is just as vulnerable to attack as the original data! This is not what tokenization was designed to do.

The True Purpose of Tokenization
Tokens protect merchants from the costly effects of a data breach; in fact, that is why tokenization was created. Business needs require some merchants to store transactional information after the initial transaction is processed to allow for returns, incremental authorizations, recurring billing, and so on. Hotels, for example, would typically store card numbers from the time an initial reservation was made until checkout, which meant keeping hundreds, if not thousands, of card numbers on file. Tokenization proves that this sensitive, vulnerable card data doesn’t actually need to be stored, even in card-on-file environments.
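
As a simplified, hypothetical sketch of that card-on-file flow (the gateway class and method names below are placeholders, not a real API), the merchant keeps only the token returned at the initial authorization and hands it back for any later activity.

```python
# Hypothetical card-on-file sketch: the hotel stores the token, never the PAN,
# and references it later for incremental charges or the final bill.
from dataclasses import dataclass
import secrets


class FakeGateway:
    """Stand-in for a tokenizing gateway; real APIs will differ."""

    def __init__(self):
        self._vault = {}  # token -> PAN, held only on the gateway side

    def authorize(self, pan: str, amount: int) -> str:
        token = secrets.token_hex(8).upper()
        self._vault[token] = pan
        return token  # the merchant stores this instead of the card number

    def charge_token(self, token: str, amount: int) -> bool:
        # The gateway, not the merchant, resolves the token back to the card.
        return token in self._vault


@dataclass
class Reservation:
    guest: str
    token: str  # the only payment reference the hotel keeps on file


gateway = FakeGateway()
res = Reservation("guest-42", gateway.authorize("4111111111111111", amount=100))
# Days later, at checkout: no card number ever sat in the hotel's systems.
assert gateway.charge_token(res.token, amount=350)
```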

The use of tokenization means that merchants can keep their daily business practices intact without having to worry about securing a large database full of vulnerable card data. Their breach profile is reduced such that they become an unappealing target for hackers.

My company has now processed more than 7 billion tokenized transactions. We could release a comprehensive list tomorrow that included each one of these tokens, and hackers would be no closer to breaching any of our merchants’ systems. That’s because of the organic nature of tokenization and the limits placed upon it when it was designed.

The same cannot be said of these “tokenization-in-name-only” solutions. Tokens with universal value, such as those in the consumer-based model, can be accepted by any retailer, but at the cost of universal risk. If one of these consumer-tokenization providers released a full list of its tokens tomorrow, you can bet there would be an instant increase in fraud among the merchants that accept them.

A United Security Front
It’s not that these consumer-based tokens can’t work; PayPal, Samsung and Apple have been successfully assigning them for years. They do offer a certain level of protection to cardholders at the point of purchase and have, knock on wood, been relatively effective at preventing mass-scale breaches. My contention with these technologies is simply that they should not be called tokenization. They are actually much closer to encryption or a cryptographic hash than they are to the arcade token on which the concept of tokenization was originally based.

However, consumer-based technologies can work together with a true tokenization solution to achieve greater security. A true tokenization solution can and does tokenize, in the original sense of the word, the consumer tokens received from a mobile wallet or other payment instrument. This spares the merchant from maintaining a database full of sensitive cardholder data, even if that data is, in this case, an encrypted surrogate.
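
A rough sketch of that layering, again with purely hypothetical names: the device token presented by a mobile wallet is treated like any other account number and is itself replaced with a one-time merchant token before anything reaches the merchant’s records.

```python
# Hypothetical sketch: the wallet's static consumer token (DPAN) is itself
# tokenized, so the merchant never stores the wallet's surrogate either.
import secrets


def tokenize(value: str, vault: dict) -> str:
    token = secrets.token_hex(8).upper()  # random, per-transaction
    vault[token] = value                  # mapping lives only at the gateway
    return token


gateway_vault = {}  # token -> DPAN mapping, held at the gateway

# What the wallet hands over at the point of sale is its static consumer token...
wallet_dpan = "4811111111111234"
# ...and what the merchant is given to keep is a one-time merchant token.
merchant_token = tokenize(wallet_dpan, gateway_vault)
print(merchant_token)
```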

The fact is that merchants want to make it as easy and convenient as possible for customers to say “yes” and make that purchase, so they are eager to accept mobile wallets and other card-based token solutions. But remember, not all of these solutions offer true tokenization, which provides the highest level of security and protects both merchants and their customers.

About the Author
J.D. Oder II serves as Shift4’s CTO and SVP – R&D. J.D. is a Certified Network Engineer with more than 15 years of experience. He leads Shift4’s systems operations and development efforts as well as the security and compliance teams. J.D. is the architect of the DOLLARS ON THE NET® payment gateway solution. He is credited with introducing tokenization to the industry in 2005 and was also an early adopter/member of the PCI Security Standards Council.
