The word “cloud” is sometimes overused in IT—and lately, it’s been tossed around more than a football during a tailgating party. Be that as it may, organizations still want to implement cloud-based initiatives. But securing assets once they’re in the cloud is often easier said than done.
In May, more than half the respondents to a survey from Symantec and the Cloud Security Alliance (CSA) said they felt less than prepared to secure public cloud services. This article explains some of the core concepts of cloud security, drawing on expert commentary along with additional resources and links.
Regulatory requirements (a.k.a. Data Residency)
Data residency (sometimes known as data sovereignty) is a potential area of confusion for many organizations. One of the basic principles of cloud computing is the ability to compute without borders. So if an organization needs to worry about where its data resides, or how the latter travels from point A to point B, then the idea of borderless anything is shot to pieces, right?
Every day, someone somewhere uses data residency as leverage in a cloud-based security sales pitch, knowing full well that the fear of regulatory fines and other legal issues related to data governance creates a potential pain point for an organization, one that the prospective vendor is only too willing to fix.
“The first critical step is to look inward at your own business model to understand the organization and how it performs business, where, and with whom… this is critical in understanding whether your cloud computing strategy will even require you to think about data residency,” explained Rafal Los, Security Evangelist for HP Software.
As such, he added, organizations must consider data storage, backups, geographical computing, networking and connectivity logistics… all of which blend together to deliver IT services. From that foundation, an organization’s cloud strategy should include a solid understanding of the legal and regulatory environment in which it operates. Doing so allows business leaders and IT teams to understand whether data residency rules even apply to a particular cloud-based project.
Los also cited the need to “always perform proper due-diligence on any vendors or service providers your organization may be doing business with.” Doing so allows an organization to “understand not only supply-chain complexity, but also whether they are aware of the rules and regulations under which your organization is required to operate.”
He also mentioned encryption as a consideration when dealing with data residency issues: “As in Canada with the protection of medical records of Canadian citizens, data residency rules are often strictly enforced unless the data is encrypted such that it can only be read within the physical boundaries of the nation or state within which they are governed.” Data encryption is important not only for security, but also privacy.
In the U.S., the protection of patient records and medical data is a hot topic, especially given a widespread desire for faster and more distributed access to personal medical information. How are healthcare organizations dealing with cloud initiatives while managing data residency issues?
“When it comes to public cloud, if you look at all of the available public cloud providers and compare them to what the HIPAA Security and Privacy rules say you have to be able to attest to in order to be compliant—it’s really clear that public cloud and electronic protected health information (ePHI) don’t coexist,” Martin Fisher, Director of Information Security at WellStar Health System, said in an interview.
For the curious, WellStar Health System is a not-for-profit integrated health network in Atlanta, Ga., with five hospitals, more than 13,000 employees, and roughly 100 locations encompassing urgent care centers and doctors’ offices. Its current cloud-initiative focuses on messaging, in particular the migration of email to the cloud. Doing so will allow WellStar to reassign resources that were once focused on email management to other projects.
When it comes to moving PHI to the cloud, many healthcare organizations tend to avoid it altogether, as it would mean that the cloud provider becomes a business associate. “When you sign a Business Associate agreement, there’s a level of liability that the business associate accepts,” Fisher explained. “They openly acknowledge they have to operate within the HIPAA security rule like any covered entity. Understandably, none of the current cloud providers are willing to do that.”
The fact that none of the current cloud providers are willing to host ePHI due to regulatory concerns and liabilities only reinforces the points made previously by Rafal Los, raising the question of what else a cloud provider won’t allow.
Several marketplace vendors offer products designed to deal with residency and regulatory concerns. Hewlett-Packard’s Converged Cloud, for example, can tie public, managed, and private cloud deployments into traditional IT infrastructures, which in turn are operated and controlled from a central point.
Vendors that focus heavily on cloud-security aspects include Voltage Security, which is noteworthy for helping Heartland Payment Systems develop E3, a payment system with end-to-end protection of cardholder data (and helps merchants remain outside of scope for PCI audits). E3 was developed after Heartland suffered a disastrous data breach, and uses Voltage’s SecureData and FPE (Format Preserving Encryption) technology.
There’s also CipherCloud, which focuses on the upper tier of the SMB (small- to midsize-business) market. Its security offerings work with Salesforce, Force.com, Amazon (EC2 & S3), Box.net, as well as Gmail and Office 365 (via API). The main product on offer is the CipherCloud Gateway, which provides tokenization along with format- and function-preserving encryption.
As most vendors correctly point out, the easiest way to deal with data-centric regulatory requirements is to remove the data itself from scope. With HIPAA, for example, there are two ways to remove PHI from scope: tokenization and encryption. The same principle applies to PCI-DSS as well. Once properly encrypted or tokenized, the data falls outside the scope of the regulatory rule.
This seems like a simple solution, but it’s not. Not everything can be taken out of scope, and in some cases the difference between tokens and encryption is a big one. In addition, some organizations operate under the false assumption that hashing counts as encryption, which it certainly does not. Data protection is one of the key elements of residency and regulatory compliance, and we’ll explore that in the next section.
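The distinction can be made concrete with a toy sketch (the vault, function names, and sample card number below are invented for illustration; a production token vault would itself be hardened and audited). Tokenization swaps the real value for a random surrogate that is reversible only through the vault, while a hash is one-way and, despite appearances, does not qualify as encryption:

```python
import hashlib
import secrets

# Toy token vault: maps random surrogates back to the real values.
# In practice this mapping lives in a hardened, access-controlled system.
_vault: dict[str, str] = {}

def tokenize(pan: str) -> str:
    """Replace a card number with a random surrogate; the real
    value exists only inside the (heavily secured) token vault."""
    token = secrets.token_hex(8)
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Only the vault can map a token back to the original value."""
    return _vault[token]

def hash_pan(pan: str) -> str:
    """One-way digest -- NOT encryption: it cannot be reversed, and
    unsalted hashes of low-entropy data are trivially brute-forced."""
    return hashlib.sha256(pan.encode()).hexdigest()

pan = "4111111111111111"
t = tokenize(pan)
assert t != pan and detokenize(t) == pan   # reversible only via the vault
assert len(hash_pan(pan)) == 64            # fixed-size digest, no way back
```

Note how the sketch also previews Shaul’s later point: all the sensitive values pile up inside the token vault, which then needs its own layer of protection.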
As mentioned, dealing with residency issues within the cloud often means encrypting data. Yet even if residency issues do not exist, data needs to be protected at all costs. A recent study from Symantec suggests that digital assets (data in all of its various types and levels of sensitivity) account for half of an organization’s value. On average, an SMB holds about 563 TB of it, while an enterprise holds 90 petabytes or more.
All of this data needs to be accounted for and protected. The go-to method is naturally some level of encryption—and that’s where things can get messy.
“There is almost a mysticism that surrounds data encryption. Very few people can wrap their heads around the math that makes crypto do its thing,” said Josh Shaul, CTO of Application Security Inc. and author of one of the most widely referenced books on Oracle security.
When it comes to data encryption, there’s plenty of published research on cryptology and implementation methods. That’s led to the widespread belief that, if data is encrypted, then it must be protected. But as attackers get more creative, most administrators don’t realize something is broken until it’s too late. As a result, corporate data can end up stolen and published to the Web.
But it takes a whole lot more than a solid cryptographic algorithm to make a crypto system provide security, Shaul added. For example, consider key management. If an organization isn’t protecting its encryption keys, then even the strongest algorithm can be rendered worthless. Another problem in data encryption is the perception that it’s hard to implement.
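Key management is easiest to see in the envelope-encryption pattern many systems use: each record is encrypted under its own data key (DEK), and only a wrapped copy of that DEK, encrypted under a master key-encryption key (KEK), is stored alongside the ciphertext. The sketch below is illustrative only; the XOR keystream “cipher” is a toy stand-in for a real algorithm such as AES, and the names are invented:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy cipher: XOR data against a SHA-256 counter keystream.
    Illustration only -- use a vetted cipher (e.g. AES-GCM) in practice."""
    out = bytearray()
    for i, start in enumerate(range(0, len(data), 32)):
        block = data[start:start + 32]
        ks = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(block, ks))
    return bytes(out)

# Envelope encryption: each record gets its own data key (DEK),
# and only the wrapped DEK is stored next to the ciphertext.
kek = secrets.token_bytes(32)          # key-encryption key, kept in an HSM/KMS
dek = secrets.token_bytes(32)          # per-record data key
record = b"sensitive customer record"

ciphertext = keystream_xor(dek, record)
wrapped_dek = keystream_xor(kek, dek)  # the DEK is never stored in the clear

# Decryption requires the KEK first; stealing the database alone is useless.
recovered = keystream_xor(keystream_xor(kek, wrapped_dek), ciphertext)
assert recovered == record
```

This is Shaul’s point in miniature: an attacker who exfiltrates the ciphertext and the wrapped DEK still has nothing, but if the KEK sits unprotected next to the data, the strongest algorithm in the world buys you nothing.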
“Ask almost anyone on the operations side of the IT shop, and they’ll tell you that encryption slows application and database performance, causes delays in development and testing, and can put SLAs for critical applications at risk… But the truth is, for many of them, their views on crypto were built through (painful) experience,” he said.
Most of this pain comes from the poor understanding of when and how to deploy encryption, and Shaul freely admits that some less-than-great products built on this mindset have shipped to market in the past. Ironically, perception issues can lead to inconsistent or non-existent deployment of protection.
In Shaul’s opinion, this explains why attackers have been so successful in hijacking sensitive data from organizations. Many of those victims spent years building up perimeter protection in the form of firewalls and IDS/IPS systems. These technologies have their place and offer tremendous value as a layer of protection, but that’s all they are: a layer.
“To solve this problem, we need to take a different approach,” he said. “We’ve got to stop chasing our tail in the never-ending battle to protect the network perimeter. I’m not advocating dumping the firewalls or AV—it’s just time to move the focus to the stuff that really matters: the systems that store and process our most sensitive data.”
That means making it more difficult for attackers as they get closer to an organization’s most valuable information: protection not only at the perimeter, but also within the data center.
Encryption vs. Tokenization
As mentioned, most organizations look to encryption or tokenization to remove data from regulatory scope. This line of thinking didn’t come out of the blue; vendors leverage it in their sales pitches. Sometimes (if the vendor is able to deliver it), an organization may even be pitched a hybrid of the two options. But which presents more value to the organization?
To be fair, neither option inherently holds more of a value proposition. Ultimately, their value to the organization will depend on implementation. Tokenization is great for reducing PCI scope, whereas systems using data encryption to protect cardholder data remain in scope for PCI.
“Tokenization can pull a tremendous amount of sensitive info into the token server, necessitating a whole new level of security for that system,” Shaul said. “That’s something organizations need to be prepared to deal with. Encryption is applicable to a far broader range of security problems, but the security it offers is only as good as the systems used to store and manage the encryption keys.”
Organizations need to be prepared to deal with the differences between the two options and respective implementations. Also worth mentioning is the fact that no matter what the sales pitch says, neither option is easy or cheap.
“The most important thing to remember when you’re storing or processing sensitive data in the cloud is that you are still fully responsible for the security of the data, and you are fully accountable if that data is lost or stolen,” Shaul concluded. “Even if your cloud provider offers some security services or indemnifies you for losses resulting from a breach, if your data is stolen, it’s still your problem.”
In addition to Voltage and CipherCloud, other vendors can assist in organizations’ encryption needs:
Thales is a bit of a one-stop-shop type of vendor, with fingers in all areas of security. While this isn’t a bad thing, it can make understanding all the available offerings a bit of a time-consuming chore. But it does have a solid reputation for database encryption.
Vormetric was one of the first vendors to support Intel’s AES-NI for faster execution of encryption / decryption operations on DB2, MSSQL, Oracle, Informix, and MySQL. It can address the encryption and key-management needs of an organization of almost any size. However, like Thales, Vormetric isn’t a one-size-fits-all shop; research is needed to ensure that its offerings can fit within an organization’s overall business goals for the long term.
DAM (Database Activity Monitoring)
Vendors with true DAM offerings are not as common as one would think. Yet when layered atop a given encryption deployment, DAM applications can help administrators sleep easier at night.
GreenSQL isn’t the largest or the flashiest DAM vendor on the market, but it stands out by offering two things: a genuine DAM product and a straightforward approach to database security. It also incorporates a few extras into the process, and the basics are free to use. It currently supports the MSSQL, MySQL, and PostgreSQL platforms.
Like GreenSQL, Application Security Inc. is a firm that adds additional layers of security to any encryption implementation. Its main product, dbProtect, allows enterprise operations to discover sensitive data within databases on the corporate network—a crucial first step, considering how you can’t protect data you don’t know exists in the first place. From there, database vulnerabilities can be addressed and mitigated, access controlled, and future activity monitored.
Data Access (Internal & External)
So far, the notion that data must be protected in the cloud has been covered with regard to the technical and legal aspects. Clearly, encryption is a must-have. But all that newly protected data must be accessed at some point—either from within, by an employee, or from without, by a customer or business partner.
While data should be protected at rest and in motion, access is unavoidable—it’s what’s supposed to happen, after all. So what can an organization do to ensure that all access is authorized?
The key here is Identity Management (IdM), which covers such controls as passwords and end-user permissions. Layered properly, these controls work seamlessly with any available data protection method. Implemented incorrectly, they become a nightmare to manage and offer little value to the organization.
“If you can’t do those things well, it seriously impacts everything else you do. HIPAA says that every time someone interacts with PHI—from a view, modify, or whatever perspective—you need to accurately log the individuals who did that,” said WellStar Health System’s Martin Fisher.
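The per-access trail Fisher describes can be sketched as a simple audit wrapper (a minimal illustration; the function names, log format, and record IDs are invented, and a real system would write to tamper-evident storage rather than an in-memory list):

```python
import functools
from datetime import datetime, timezone

# In-memory stand-in for a tamper-evident audit store.
AUDIT_LOG: list[dict] = []

def audited(action: str):
    """Record who touched which record, when, and how -- the kind of
    per-interaction trail HIPAA's logging requirement implies."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(user: str, record_id: str, *args, **kwargs):
            AUDIT_LOG.append({
                "ts": datetime.now(timezone.utc).isoformat(),
                "user": user,
                "record": record_id,
                "action": action,
            })
            return fn(user, record_id, *args, **kwargs)
        return inner
    return wrap

@audited("view")
def view_chart(user: str, record_id: str) -> str:
    # Placeholder for the actual data access.
    return f"chart {record_id}"

view_chart("dr_smith", "PHI-1042")
assert AUDIT_LOG[0]["user"] == "dr_smith"
assert AUDIT_LOG[0]["action"] == "view"
```

The point of the decorator approach is that the logging cannot be forgotten at individual call sites: every path to the data passes through the same audited entry point.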
When asked whether their organization monitored privileged account access, 43 percent of the respondents to a Cyber-Ark survey either didn’t know or said no. Of those who did monitor privileged access, 52 percent said they could circumvent those controls.
Any IdM system needs to offer the ability to create strong policies, present detailed and accurate logging, and be easy to manage. If it’s too hard to operate, it will be ignored; if it’s a chore to administer, it will be poorly configured and end up bypassed, accidentally or otherwise.
A system that fulfills those requirements isn’t as easy to find as it sounds. One of the things that can make good IdM problematic is the nature of business itself. Specialized applications and infrastructures can’t rely on cookie-cutter solutions and one-size-fits-all offerings. Before an IdM can be implemented, a clear understanding of the organization’s goals and needs is required. Another thing to consider within the planning stage of an IdM program is separation of duties.
At the very least, a prospective vendor needs to prove how its product will enable your organization to define and enforce roles within the network (both forward-facing and backend roles), while allowing those roles to be adjusted on the fly.
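The role-definition and on-the-fly adjustment requirement boils down to basic role-based access control (RBAC). A minimal sketch, with invented role names and permissions, shows why the data-driven approach matters: changing who can do what is a configuration change, not a code change:

```python
# Minimal RBAC sketch: roles map to permission sets, users map to roles.
# All names here are hypothetical examples.
ROLES: dict[str, set[str]] = {
    "nurse":     {"view_chart"},
    "physician": {"view_chart", "modify_chart"},
}
ASSIGNMENTS: dict[str, str] = {"jdoe": "nurse"}

def allowed(user: str, permission: str) -> bool:
    """Check a permission through the user's role, never directly."""
    role = ASSIGNMENTS.get(user)
    return role is not None and permission in ROLES.get(role, set())

def grant_role(user: str, role: str) -> None:
    """Adjusting roles 'on the fly' is a data change, not a redeploy."""
    ASSIGNMENTS[user] = role

assert allowed("jdoe", "view_chart")
assert not allowed("jdoe", "modify_chart")
grant_role("jdoe", "physician")
assert allowed("jdoe", "modify_chart")
```

Real IdM products layer policy engines, directories, and audit trails on top of this core, but a vendor that cannot demonstrate this basic role lifecycle cleanly is unlikely to manage the complicated cases well.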
Protecting data as it is accessed from an internal or external source can also include DLP (Data Loss Prevention) measures. When combined with IdM solutions, a solid DLP offering can help the organization locate sensitive data that may need protecting, while blocking protected data from unauthorized access and distribution.
When it comes to DLP, Symantec is one of the largest vendors. However, RSA and McAfee offer DLP as well, and they’re just as established.
Any organization examining a DLP offering must judge how well it inspects SSL traffic, and how it detects (inspects) network traffic for exfiltration methods, such as leveraging a networked Multi-Function Printer to fax sensitive data.
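At its core, the content-inspection side of DLP is pattern matching plus validation. A toy sketch (illustrative only; real DLP engines handle many data types, encodings, and protocols) shows the common trick of pairing a loose regex for card-number shapes with a Luhn checksum to cut false positives:

```python
import re

def luhn_ok(digits: str) -> bool:
    """Luhn checksum -- the quick validity test DLP engines use to
    separate real card numbers from random digit runs."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Loose shape match: 13-16 digits, optionally separated by spaces/hyphens.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def scan(payload: str) -> list[str]:
    """Return card numbers found in outbound content."""
    hits = []
    for m in CARD_RE.finditer(payload):
        digits = re.sub(r"[ -]", "", m.group())
        if luhn_ok(digits):
            hits.append(digits)
    return hits

assert scan("fax this: 4111 1111 1111 1111 asap") == ["4111111111111111"]
assert scan("order #12345678901234") == []  # fails Luhn, ignored
```

The catch, as the paragraph above notes, is getting the payload in the first place: if the product can’t decrypt SSL traffic or see the data stream headed to a networked fax or printer, the cleverest detector never gets a chance to run.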
IdM solutions can come from a number of places. Each vendor below offers some level of IdM.
Cyber-Ark has three different tools that will help protect data. The first is its Sensitive Information Management suite, followed by the Privileged Session Management and Privileged Identity Management suites.
CloudPassage’s major offering is called Halo, and it’s a SaaS product. It covers IdM needs in addition to other things, including logging and change management.
BeyondTrust is a well-known privilege management company, and its PowerBroker series works with all major platforms. The company has IdM as well as DLP offerings, controllable from a single management panel.
In addition to encryption, IdM, and other layered defenses, there is still more to consider when it comes to implementing cloud initiatives and protecting digital assets. For example, each of the various layers will generate logs, lots and lots of logs. Therefore, it’s wise to look into log management.
Log management (SIEM) can be complex and expensive, depending on implementation. The vendor should understand the client organization just as well as the organization understands the solution under consideration. All the major vendors (Symantec, RSA, HP) offer SIEM in some fashion. However, also consider products from Alert Logic, LogRhythm, LogLogic, Trustwave, and eIQ.
If applications access any data at any point of the customer’s experience, then the data and the application must be protected. There are Web Application Firewalls for that (WhiteHat Security, Barracuda, Dell), as well as several other solutions, such as external code reviews (IOActive, Armorize).
Gearing up for a cloud deployment can (and should) lead to an examination of Incident Response plans. Depending on the type of cloud deployment being considered, a separate IR plan may be needed. This determination can only come from reviewing existing IR and business continuity plans, so it’s worth the time and effort.
“The key thing when you start talking about private cloud or whoever, is making sure that in whatever contract you have, you one: have a right to audit; and two: that the vendor or provider has an obligation to respond in the event of a declared incident,” Fisher said.
When you call a vendor with normal tech support needs, he added, you’re going to get some level of priority: “But you need to make sure that your contract and your SLA reflects the fact that if you call them and give a secret word (for example: ‘security meltdown in progress’) that it’s a different SLA in place, and they’re going to provide the resources and the cooperation necessary for you to respond to it.”
It’s possible, with a good deal of effort and hard work, to secure your organization’s cloud deployments. Just remember to do as much research as possible into a given vendor’s offerings. How do they compare to others in the same space? How well do the offerings align with the organization’s current needs and expectations, and how will they scale as the business grows or shrinks?
In the end, you may find yourself moving away from the traditional vendor/client roles toward a deeper, more stable partnership that could prove far more beneficial in the long run.