Ten years ago, data-at-rest encryption was adequate for many data security needs. File encryption met most compliance requirements, checked the “yes we encrypt” box, and was relatively easy to implement. The growth of self-encrypting drives in large storage arrays provided a hardware-based answer to data-at-rest encryption. Together, these have been the go-to solutions for enterprises wishing to implement data security.
Tokenization, however, is emerging as a preferred method for securing data. Several changes are driving the shift away from file encryption:
- Compliance requirements like GDPR, CCPA, and emerging regulations are more surgical, requiring greater controls to be placed on sensitive and regulated data.
- Threats are more sophisticated, and file encryption alone is quickly becoming inadequate to protect data.
- Cloud-native applications that run PaaS services or leverage a microservices architecture may not have the capability for “file level” encryption.
A new breed of solutions is making data tokenization, masking, and de-identification easier than ever. What is driving this growth, and why is tokenization rapidly becoming a go-to method for data protection?
Data is protected in use
Tokenization enables the use of data even as it remains protected. This generally isn’t possible with encrypted data, particularly with randomized AES modes like GCM, where the same plaintext encrypts to a different ciphertext each time. Tokenization, on the other hand, represents data as consistent tokens, which enables certain analytics and processing without detokenizing the data. Likewise, data can easily be retokenized when sharing with third parties, so even internal tokens aren’t revealed in data-sharing use cases.
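To illustrate why consistent tokens stay analytics-friendly, here is a minimal sketch of deterministic tokenization using a keyed hash. This is illustrative only: real products typically use vaulted or format-preserving tokens, and the key shown here is a made-up placeholder.

```python
import hashlib
import hmac

# Hypothetical key; a real deployment would pull this from a key manager.
TOKEN_KEY = b"example-key-not-for-production"

def tokenize(value: str) -> str:
    """Map a sensitive value to a consistent, irreversible token."""
    digest = hmac.new(TOKEN_KEY, value.encode(), hashlib.sha256).hexdigest()
    return "tok_" + digest[:16]

# Because the same input always yields the same token, joins and
# group-bys still work on tokenized columns without detokenizing.
```

Swapping the key when sharing data externally is one simple way to "retokenize," so internal tokens are never exposed to third parties.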
Deliberate security and access controls
While some file-level encryption capabilities offer advanced access control policies, tokenization solutions normally have this built in, with far more granular control. Rules can define how tokens are created, which data to preserve in the token (for example, keeping the last four digits of a credit card clear for validation), how to mask detokenized data, and so on. These rules enable surgical treatment of sensitive data elements so that different teams and applications can each use the data appropriately.
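A rule like "keep the last four digits clear" can be sketched as follows. This is a toy illustration under stated assumptions; commercial solutions implement this through configurable token formats, not hand-rolled code, and the function name and rule parameter are invented for the example.

```python
import secrets
import string

def tokenize_pan(pan: str, preserve_last: int = 4) -> str:
    """Replace a card number with random digits while keeping the
    trailing digits clear for validation (hypothetical rule)."""
    digits = "".join(c for c in pan if c.isdigit())
    kept = digits[-preserve_last:]
    # Random digits for the rest, so the token still "looks like" a PAN.
    body = "".join(secrets.choice(string.digits)
                   for _ in range(len(digits) - preserve_last))
    return body + kept
```

Because the token preserves length and the last four digits, downstream systems can validate and display it without ever touching the real card number.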
This is valuable for addressing compliance requirements like GDPR, which require user-level controls and policies. Tokens can also natively support tagging, which creates more control over how tokens are audited. For example, tokens can be tagged by region or geography, or by environment. Together, these capabilities help security teams address regulatory requirements.
Security policy as code
As more and more security teams adopt a SecOps approach, security policies are most effective when they are baked into the application development pipeline and orchestration, from proof-of-concept to production. Tokenization is easily scripted and automated, enabling deep integration with application development teams.
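To make the "policy as code" idea concrete, here is a minimal sketch of a per-field tokenization policy expressed as plain data so it can be version-controlled and applied in a pipeline. The field names, rule format, and toy masking function are all assumptions for illustration, not any vendor's API.

```python
# Hypothetical per-field policy; in practice this might live in a
# versioned YAML file alongside the application's deployment config.
POLICY = {
    "credit_card": {"method": "tokenize", "preserve_last": 4},
    "email":       {"method": "tokenize", "preserve_last": 0},
    "order_id":    {"method": "passthrough"},
}

def toy_tokenize(value: str, preserve_last: int = 0) -> str:
    """Stand-in tokenizer: mask everything but the trailing characters."""
    kept = value[-preserve_last:] if preserve_last else ""
    return "*" * (len(value) - len(kept)) + kept

def apply_policy(record: dict) -> dict:
    """Apply the policy to each field before data leaves the pipeline."""
    out = {}
    for field, value in record.items():
        rule = POLICY.get(field, {"method": "passthrough"})
        if rule["method"] == "tokenize":
            out[field] = toy_tokenize(value, rule.get("preserve_last", 0))
        else:
            out[field] = value
    return out
```

Because the policy is just data reviewed in source control, security teams can audit it, test it in CI, and promote it from proof-of-concept to production like any other pipeline artifact.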
All security leaders know that the easier security is for engineering teams to implement, the more readily it will be adopted. While tokenization does represent a shift in data management, it can align very well with normal DevOps practices.
With so many cloud applications being built as microservices and in PaaS environments, the days of file encryption and operating system controls are long gone. Security teams need solutions that integrate with modern applications and are cloud-ready and API-friendly. Deploying agents, SDKs, and other legacy software artifacts simply isn't possible on many cloud platforms, and that's a good thing.
Tokenization enables teams to protect data natively in the cloud, using services aligned with cloud deployment and orchestration.
Extract greater value from data
Ultimately, tokenization is about extracting value from secured data, instead of relegating it to locked down data that is “off-limits.” Tokens create opportunities to leverage secure data – whether it’s payment information, or personal health information – in usable and operational ways, without compromising the safety of the customer or the data.
A Breadth of Options
Perhaps the only downside to this growing popularity of tokenization is the breadth of options available to organizations wishing to adopt it. The new breed of tokenization solutions offers a rich variety of options, all of which must be evaluated to understand how each fits within your architecture and roadmap.
Fortanix, for example, offers a tokenization solution tightly integrated with their key management platform. Newcomer Sotero has focused on seamless integration, offering a nearly “transparent” solution for data tokenization. Data security powerhouse Thales brings a unified data security platform that offers many controls including file encryption and tokenization, just two of nearly a dozen integrated products. Google has even developed tokenization capabilities native to its Cloud Data Loss Prevention service. There are many other vendor solutions in addition to these.
Integrating tokenization for your applications can be daunting, given the many options available. When evaluating data tokenization, our advice is this:
- Develop a strong data-flow plan that maps out how data is ingested, processed, stored, and then used. Tokenization likely touches on each of these stages of data management.
- Thoroughly understand how tokenization impacts each of those stages. For example, when ingesting data, how will you tokenize it? When running reports, do you need to detokenize, or will the reports run just fine with tokenized data?
- Evaluate your options. As we indicated, there are very compelling solutions in the space, but some will integrate better with your environment and better meet your requirements than others.
- Get help if you need it. Sidechain has experience with many tokenization solutions, and we have built tokenization implementations from the ground up. If you need guidance, our experts can de-risk your projects, and ensure it’s implemented right the first time.
If you want to discuss your tokenization efforts, we’re happy to be a sounding board and offer our suggestions. Contact us to set up a time to chat.