Friday, August 26, 2011

Tokenization and Relevance

I recently came across an interview in ISO&AGENT with Bob Russo on PCI's Tokenization Guideline: PCI Council Reveals Secrets of Tokenization

Much of the interview is standard talking points that I have seen in press releases and other stories from various sources, but it's worth a read if you follow tokenization.

The last paragraph of this story caught my attention though: "'When vendors first introduced tokenization, companies selling other fraud-security techniques were concerned about becoming obsolete,' Russo says. But over time, it has become apparent various layers of security are needed to keep data safe."

Tokenization was released to the public domain by Shift4 back in 2005, so "companies selling fraud-security techniques" really had nothing to be concerned about; all they had to do was incorporate tokenization into their solutions. Russo is absolutely correct in his assertion that various layers of security are needed to keep data safe, so with or without tokenization, what cause did these companies have for concern?

Are we certain that the fear Russo references actually came from fraud-security companies and not from PCI Council members and QSAs? Some might think that the more components and systems you remove from PCI scope, the less relevant PCI and QSAs become. Personally, I think this fear of losing relevance was part of the motivation to redefine tokenization to include "high-value tokens" and defer token scoping to QSAs.

The primary goal of tokenization was to improve security by removing sensitive data from the merchant environment; a reduction in PCI scope is a byproduct. Similarly, PCI compliance should be a byproduct of solid security. As long as PCI acts as a liability shield for the card brands, PCI will have relevance. As long as QSAs focus on security and not just compliance, QSAs will have relevance. Neither should have feared tokenization by its original definition, without the inclusion of "high-value tokens."

2 comments:

  1. Steve,

    I share your disappointment with the high-value token definition introduced by the Council's guidance document, but I think ascribing it to the Council staff and/or QSAs (full disclosure: I'm a QSA) as a means of job preservation is at least a little bit unfair.

    We all need to keep in mind that it is the card brands that make up the Council's Technical Working Group. It is they (certainly with Council staff) who approve things like the tokenization guidance (as well as FAQ responses and just about anything remotely technical that comes out of the Council).

    Therefore, if you want to beat someone up over the guidance, maybe you should direct some of your displeasure at the brands. They - not the Council - call the shots. And if they decide using a token to initiate a new transaction is in PCI scope (using the "if it walks like a duck and quacks like a duck..." logic, I guess), then that is where we all are at. The card brands invented PCI and they still own it.

    We may not all be happy with all the details of the tokenization guidance, but at least we have answers and can go forward.

  2. Walt,

    Thank you for your feedback. When I saw the quote "concerned about becoming obsolete," my mind started exploring other avenues. Since tokenization is in the public domain, I do not see solution providers being concerned about obsolescence. I do see solution providers that have mislabeled their wares "tokenization" even though they were not following the definition -- at least not the original definition. Either way, I did not mean to implicate all QSAs, or QSAs as an industry.

    I have to admit, most of the QSAs I deal with would prefer the original definition, as tokenization would be MUCH simpler than the new PCI Tokenization definition. There is no debating whether a token is "high value" or not. How the token is derived would be irrelevant, since it could not be mathematically related to the PAN. Under the new definition, if it is related, is an "approved" strong encryption algorithm being used? Where is the key stored? How is it stored? How is it distributed? If a hash is used, is it an "approved" hash algorithm? Is it salted? How is the salt value protected? Etc., etc., etc.
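    To make the distinction concrete, here is a minimal sketch of the two approaches in Python. The function names and the in-memory vault are hypothetical illustrations, not any vendor's or the Council's implementation:

    ```python
    import hashlib
    import hmac
    import secrets

    # Hypothetical in-memory vault; a real tokenization service would keep
    # this mapping in a hardened, access-controlled store on the provider side.
    _vault = {}

    def random_token(pan: str) -> str:
        """Original-definition token: a random value with no mathematical
        relationship to the PAN. The only way back to the PAN is a vault lookup."""
        token = secrets.token_hex(16)
        _vault[token] = pan
        return token

    def derived_token(pan: str, key: bytes) -> str:
        """Derived token: computed from the PAN (here via HMAC-SHA256), so the
        questions about approved algorithms, key storage, and key distribution
        all come into scope."""
        return hmac.new(key, pan.encode(), hashlib.sha256).hexdigest()

    def detokenize(token: str) -> str:
        """Vault lookup for a random token; only the provider can perform this."""
        return _vault[token]
    ```

    With the random token, the only path back to the PAN is the protected vault; with the derived token, every question above about algorithms, keys, and salts must be answered by the assessor.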

    Anyway, you did get me thinking some more about the card brands. I completely overlooked this possibility. In past discussions with various people at Visa, MasterCard, and American Express (I think Discover too, but I'm not certain), they did not seem very interested one way or the other in tokenization and instead deferred validation of the concept to the PCI SSC.

    I wish I knew firsthand who was fighting our effort to keep the original tokenization definition intact. I would love to debate TrueTokenization (the original definition) vs. PCI Tokenization in a public forum with whoever it was. The post-mortem section of the Tokenization, the Newest Horse - err, Camel - in the Stable posting describes what the process was like from our point of view. Shift4's concerns were being addressed, but just as quickly they were being "softened," to the point that I personally feel this guidance document is virtually useless. Before this document, Bob Russo was basically saying "use tokenization at your own risk"; now this document, in my mind, simply changes the definition of tokenization and puts "use at your own risk" in writing.
