Friday, August 26, 2011

Tokenization and Relevance

I recently came across an interview in ISO&AGENT with Bob Russo on PCI's Tokenization Guideline: PCI Council Reveals Secrets of Tokenization

Much of the interview is standard talking points that I have seen in press releases and other stories from various sources, but it's worth a read if you follow tokenization.

The last paragraph of the story caught my attention, though: "'When vendors first introduced tokenization, companies selling other fraud-security techniques were concerned about becoming obsolete,' Russo says. But over time, it has become apparent various layers of security are needed to keep data safe."

Tokenization was released to the public domain by Shift4 back in 2005, so "companies selling fraud-security techniques" really had nothing to be concerned about; all they had to do was incorporate tokenization into their solutions. Russo is absolutely correct in his assertion that various layers of security are needed to keep data safe; tokenization is just one of those layers, so what cause did these companies have for concern?

Are we certain that the fear Russo references actually came from fraud-security companies and not from PCI Council members and QSAs? Some might think that the more components and systems you remove from PCI scope, the less relevant PCI and QSAs become. Personally, I think this fear of losing relevance was part of the motivation to redefine tokenization to include "high-value tokens" and to defer token scoping to QSAs.

The primary goal of tokenization was to improve security by removing sensitive data from the merchant environment; a reduction in PCI scope is a byproduct. Similarly, PCI compliance should be a byproduct of solid security. As long as PCI acts as a liability shield for the card brands, PCI will have relevance. As long as QSAs focus on security and not merely compliance, QSAs will have relevance. Neither should have feared tokenization by its original definition, without the inclusion of "high-value tokens."
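
To make that original definition concrete, here is a minimal sketch in Python of how tokenization takes cardholder data out of the merchant environment. The names and structure are hypothetical illustrations of my own, not Shift4's implementation and not anything specified by the PCI SSC:

    import secrets

    class TokenVault:
        """Hypothetical vault mapping random tokens to primary account numbers (PANs)."""

        def __init__(self):
            # token -> PAN; the vault lives with the tokenization
            # provider, outside the merchant environment
            self._vault = {}

        def tokenize(self, pan):
            # The token is random: it has no mathematical relationship
            # to the PAN and cannot be reversed, only looked up.
            token = secrets.token_urlsafe(16)
            self._vault[token] = pan
            return token

        def detokenize(self, token):
            # Only the provider (and parties it authorizes) can recover the PAN.
            return self._vault[token]

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")

    # The merchant stores and passes around only the token (for refunds,
    # reporting, repeat billing, and so on). A stolen token is worthless
    # without access to the vault, which is what removes cardholder data,
    # and with it PCI scope, from the merchant's systems.
    print("Merchant keeps only:", token)

A "high-value token," by contrast, can itself be used to initiate a payment, which, as I understand it, reintroduces risk and is exactly why folding it into the definition muddied the scoping question.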

Wednesday, August 24, 2011

Tokenization, the Newest Horse - err, Camel - in the Stable

As the old saying goes, "a camel is a horse designed by a committee." This saying perfectly describes the recently published PCI DSS Tokenization Guidelines from the PCI SSC. While the original intent of the document was a noble one, the final version fell way short.

The problem with having multiple blogs is having to post to all of them. To avoid duplicating content, please read my post here: http://blog.shift4.com/2011/08/tokenization-the-newest-horse-err-camel-in-the-stable.html