As often happens in the computer industry, nomenclature is unwieldy and flexible as technologists, sales & marketing, and the rest of the world clash.
My case in point is the phrase "data loss prevention," or DLP. In other articles, I have talked about DLP as a technology: it analyzes the content of a document or message, determines whether that content references concepts that are confidential or protected in nature, and applies rules or reporting to handle the content. As the concept of DLP was developed over the last decade, the industry struggled to find an appropriate phrase to define it: content monitoring & filtering, content analysis, deep packet inspection, and others were used, but the industry and analysts settled on data loss prevention.
Many companies are marketing "data loss prevention" in relation to their technologies, but not in the context of analyzing document content. Instead, their approaches include building a wall around all corporate data (such as on a mobile device, or in a cloud-based document-sharing service), or providing some regular expression matching for message content. This is all well and good, but I would suggest these technologies fall under the larger strategy of information protection rather than being specifically about "data loss prevention."
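As a sketch of that regular-expression approach: flag message content that matches a sensitive pattern. The pattern and function below are hypothetical examples for illustration, not any vendor's actual rules.

```python
import re

# Illustrative pattern: something that looks like a U.S. Social Security number.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def flag_message(body: str) -> bool:
    """Return True if the message body appears to contain an SSN."""
    return SSN_PATTERN.search(body) is not None

print(flag_message("My SSN is 123-45-6789"))   # True -- pattern matches
print(flag_message("Call me at 555-1234"))     # False -- no match
```

Note how shallow this is compared to true content analysis: it matches a shape of text, with no understanding of whether the document itself is confidential.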
This goes to the heart of the matter: when we build true data loss prevention systems, the intent is to protect confidential information rather than just bits and bytes of raw data. Under the fundamentals of information theory, data is just bits and bytes, but information is found where there is entropy, or value, in the data. This is what distinguishes data loss prevention technology from other data protection technologies, and perhaps the better phrase for the technology would be "information loss protection."
Practically, though, we are probably stuck with the labels that have been adopted. So, I suppose we can accept a variety of technologies under the strategy of data loss prevention, including the technology of data loss prevention itself. Unfortunately, this will continue to be confusing to those inside and outside of the industry and troublesome for sales and marketing.
Monday, November 14, 2011
Friday, August 12, 2011
Five Stages of Cloud Acceptance
Denial: We'll never put anything in the cloud because of security/reliability/performance/etc.
Anger: You already put WHAT in the cloud? How are we going to do backups/switch providers/manage identity/etc.???
Bargaining: OK, we'll move X into the cloud if/when the cloud becomes secure/reliable/etc.
Depression: The CFO/CEO/etc. wants us to start using cloud to save money/reduce costs/expand functionality. We can't use the cloud. What about my job running the data center? What about our bandwidth? What about PCI DSS/HIPAA/GLBA?
Acceptance: It works, I can do more to enable my company's business, and reduce capital expenditures. Let's put everything into the cloud!
Seriously, I have had some of these reactions myself. I hear some of these reactions from people when we talk about using cloud services and realize there truly is a road to acceptance for many people.
Changing Face of "Spam" Email
As a network engineer involved in bringing up some of the first Internet connections in the upper Midwest in the late 1980s and early 1990s, I also managed email systems in the 1990s as spam email started becoming a nuisance. In the past decade, spam has become more than a nuisance: email systems must have effective spam filters to keep email usable for end users.
There is an interesting trend I see now - I am getting a fair bit of relevant business-related marketing email in my inbox. The amount of "online pharmacy" spam is way down, but I still get a fair amount of complete junk, including a lot of Cyrillic and Mandarin spam that is completely unintelligible to me. Fortunately, my company's spam filter, with up-to-date SpamAssassin rule lists and a good blacklist, is doing a good job discarding and classifying the useless spam while allowing through the reasonable marketing queries (I think).
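For readers unfamiliar with how SpamAssassin rules work, a local custom rule looks something like this (the rule name, pattern, and score here are made-up examples for a site's local.cf, not part of the stock rule set):

```
body     LOCAL_ONLINE_PHARMACY   /online pharmacy/i
score    LOCAL_ONLINE_PHARMACY   3.0
describe LOCAL_ONLINE_PHARMACY   Body mentions an online pharmacy
```

Each matching rule adds its score to the message; messages whose total crosses the configured threshold get classified as spam.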
A few years back, the sales team at my employer emailed potential customers asking if they could set up meetings to introduce the company's software - not an unusual email message, especially nowadays. One particular recipient hit the roof and replied with a rant worthy of a response to the first massive Usenet spam from the green card lawyers back in the day.
Are people's attitudes changing about spam? Is there an increasing acceptance of reasonable marketing-type contact via email?
Thursday, August 11, 2011
Security Technology Musings
Each security technology that comes along has its set of "use cases" -- that is, it improves confidentiality, integrity, or availability for certain uses. Trying to apply that security technology outside of its useful situations results in either a false sense of security or complete failure.
For example, full disk encryption is a useful security technology intended to keep the entire contents of a disk drive relatively safe from an attacker who might steal the physical disk drive (or the system in which it is installed, such as a laptop). However, when the computer is in operation, full disk encryption has nothing to do with whether files can be accessed -- that is the function of the access control technology built into the operating system.
When we began building Data Loss Prevention (DLP) some years ago, my idea was that content analysis (looking at the textual content of a document) was a powerful way to determine whether a document should be shared outside of an organization. However, the documents that would be visible to the DLP system for analysis would depend on a number of factors: logical placement of the DLP functionality in an organization's computing system, whether the DLP system would be able to see documents as plaintext, and how an adversary might try to circumvent the system.
As we have further developed DLP technology and the industry has settled on standard implementations (data-in-motion, data-at-rest, data-in-use), customers have become comfortable with the functionality and capability of DLP systems. We're finding that DLP is a very useful tool for our customers -- helping significantly reduce exposure of confidential information and improving standing in risk & compliance audits. It's become one part of the security management arsenal.
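As a greatly simplified sketch of the content-analysis idea described above: score a document's text against weighted terms and allow sharing only below a threshold. The terms, weights, and threshold are invented for illustration; real DLP systems use far richer analysis (document fingerprints, classifiers, contextual rules).

```python
# Hypothetical weighted terms and threshold, for illustration only.
CONFIDENTIAL_TERMS = {
    "confidential": 3,
    "internal use only": 5,
    "social security": 4,
}
THRESHOLD = 5

def may_share(text: str) -> bool:
    """Return True if the document scores below the confidentiality threshold."""
    lowered = text.lower()
    score = sum(w for term, w in CONFIDENTIAL_TERMS.items() if term in lowered)
    return score < THRESHOLD

print(may_share("Quarterly newsletter for all customers"))          # True
print(may_share("CONFIDENTIAL: internal use only - merger plans"))  # False
```

Even this toy version shows the dependency noted above: the system can only score documents it can actually see as plaintext.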
Friday, August 5, 2011
Are Anti-Virus and a Firewall Enough?
I thought after all the commotion from the many significant data breaches of the past several months that data security would be top-of-mind at nearly every company. Perhaps people outside the information security industry have become tired of the breach news, or perhaps the lesson didn't sink in. Maybe more likely is the idea that "we haven't been hit yet, so we don't need more security yet."
Computer viruses were such a big problem in the late '80s and '90s (and still are today) that companies became accustomed to buying anti-virus software.
The Internet was such a wild and woolly place that companies didn't dare connect their LANs to the 'net without a firewall of some sort to keep the outside world from instantly pwning everything.
People in the information security industry know these two main tools, anti-virus and firewalls, have significant limitations. Anti-virus tools have limited effectiveness in the era of morphing malware. Firewalls are often configured to allow HTTP/HTTPS (web traffic) and SMTP (email traffic) without any limits, and everyone always has browsers and email clients running. The result is that attackers have a fairly easy time exploiting problems with browsers, email programs, and the users themselves.
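For illustration, a permissive ruleset of the sort described above might look like this (hypothetical iptables commands, shown as an example of the problem rather than a recommendation):

```
# Pass web and mail traffic with no further inspection or limits
iptables -A FORWARD -p tcp --dport 80  -j ACCEPT   # HTTP
iptables -A FORWARD -p tcp --dport 443 -j ACCEPT   # HTTPS
iptables -A FORWARD -p tcp --dport 25  -j ACCEPT   # SMTP
```

Anything riding over those ports - including malware command-and-control and exfiltrated data - sails straight through.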
Today, organizations need deeper defenses to handle these problems. Intrusion Detection/Prevention Systems (IDS/IPS), Data Loss Prevention (DLP), patch management, web filtering, and Security Information & Event Management (SIEM) are the important systems to have in place in addition to firewalls and anti-virus.
Web servers need a Web Application Firewall (WAF) in front of them to protect against attacks on the applications they run. If you have a good hosting provider, a WAF may already be protecting your web server.
If you don't have these systems in place, you can prioritize based on an analysis of your organization's risks.
Thursday, July 21, 2011
Web Servers as an Attack Vector
For a long time in computer security, we have focused on protecting workstations, and rightly so. Viruses, worms, remote access Trojans, and other malware have targeted the end-user workstation, and unfortunately, the attacks continue to be quite successful. A number of recent high-profile data leaks have used workstations as the initial point of attack.
However, several other high-profile data leaks have involved attacks on web servers. Citigroup, Barracuda, and now Pacific Northwest National Laboratory (PNNL) were attacked through web servers. This makes me a bit nervous -- I do like to make sure a public-facing web server is hardened and running fully patched software, but there are several techniques attackers can use to find and exploit any holes in the server.
One of the problems I saw disclosed today, CVE-2011-2688, involves a SQL injection vulnerability in mod_authnz_external, an Apache authentication module. It is worrisome that such a well-known attack succeeds against a security-critical component that may be in use on many web servers. Many other attacks, including parameter tampering, are commonly used against web applications as well.
Web servers and the web applications running under them are proving to be all too vulnerable. With high-value data accessible in a web server, such as customer accounts at an online banking website, any exploitable vulnerability in the web server or web application can result in significant loss. As the events at PNNL illustrated, even a web server that may not be high-value can still be an entry point for an attacker into more valuable networks and systems.
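To illustrate the SQL injection class of flaw discussed above, here is the vulnerable pattern next to its fix. The schema, credentials, and payload are invented for the example; this is not the actual mod_authnz_external code.

```python
import sqlite3

# In-memory database standing in for an authentication backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def check_password_unsafe(name: str, password: str) -> bool:
    # VULNERABLE: attacker-controlled input is spliced into the SQL text.
    query = "SELECT 1 FROM users WHERE name = '%s' AND password = '%s'" % (name, password)
    return conn.execute(query).fetchone() is not None

def check_password_safe(name: str, password: str) -> bool:
    # Parameterized query: input is passed as data, never as SQL.
    query = "SELECT 1 FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone() is not None

# A classic injection payload bypasses the unsafe check but not the safe one.
payload = "' OR '1'='1"
print(check_password_unsafe("alice", payload))  # True  (authentication bypassed)
print(check_password_safe("alice", payload))    # False (payload treated as a literal)
```

The fix is one line of discipline - bind parameters instead of concatenating strings - which is exactly why it is so discouraging to keep seeing this bug in security-critical components.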
It seems that web servers need backstops. We need to be able to filter and/or monitor requests coming into a web server, and to filter and/or monitor data returned by a web server. And, we need to be able to do this in the cloud with web servers that automatically scale. Something to think about.
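As a minimal sketch of that filtering backstop: inspect an incoming request's path and query string against known attack patterns before the web server ever processes it. The handful of patterns below are examples only; real WAFs use much larger rule sets such as the OWASP ModSecurity Core Rule Set.

```python
import re

# A few example signatures for common web attacks (illustrative, not complete).
ATTACK_PATTERNS = [
    re.compile(r"(%27)|(')|(--)", re.IGNORECASE),   # SQL injection tokens
    re.compile(r"<\s*script", re.IGNORECASE),       # cross-site scripting
    re.compile(r"\.\./"),                           # directory traversal
]

def allow_request(path_and_query: str) -> bool:
    """Return True if the request matches no known attack pattern."""
    return not any(p.search(path_and_query) for p in ATTACK_PATTERNS)

print(allow_request("/account?view=statement"))        # True
print(allow_request("/account?id=1' OR '1'='1"))       # False
print(allow_request("/files?name=../../etc/passwd"))   # False
```

Scaling this kind of inspection alongside auto-scaling cloud web servers is the open problem noted above.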
Wednesday, July 6, 2011
Cloud Computing and the Insider Threat
Something that hasn't been top-of-mind for me, but remains a threat nonetheless, is that the scope of the "insider threat" changes when the cloud is used for computing and storage.
One of the significant data loss vectors is the "insider threat" where a trusted insider -- either unintentionally or maliciously -- leaks protected information in violation of policy or regulations. In traditional datacenters, the trusted insiders are usually the organization's employees and contractors -- the organization should be able to physically and logically account for every individual that has access to the organization's computers and data. The insider threat is one vector that data loss prevention (DLP) is often deployed to help mitigate.
The situation changes in cloud computing, though. An organization that makes use of cloud computing services, whether SaaS, PaaS, or IaaS, is now using computers and storage that can be accessed by more individuals than just the organization's employees and contractors -- the cloud provider actually owns the servers, networks, and storage and employs personnel and contractors that have administrative access to those components. Now the "insider threat" has suddenly expanded to include a whole new group of people beyond just the original organization's employees.
One mitigation technique used to protect data stored in the cloud from any insider is to encrypt the data. Depending on the operating system used, it may be possible to set up volume encryption or folder encryption on which sensitive data can be stored securely. Unfortunately, encryption key management is not easy -- the best (or only) solution to this problem in the cloud seems to be a key management server that authenticates and authorizes access to encryption keys, carefully configured and monitored.
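The key flow behind that arrangement is often called envelope encryption: each volume gets its own data key, and the key management server holds a master key that wraps the data keys, so only wrapped keys are ever stored alongside the cloud data. The sketch below shows only the key flow; the KeyManagementServer class is hypothetical, and the XOR "wrap" is a toy placeholder for a real algorithm such as AES key wrap (RFC 3394) - do not use it as actual cryptography.

```python
import secrets

class KeyManagementServer:
    """Hypothetical KMS: holds the master key; clients never see it directly."""

    def __init__(self):
        self._master_key = secrets.token_bytes(32)

    def wrap(self, data_key: bytes) -> bytes:
        # Toy placeholder wrap: XOR with the master key (NOT real cryptography).
        return bytes(a ^ b for a, b in zip(data_key, self._master_key))

    def unwrap(self, wrapped: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(wrapped, self._master_key))

kms = KeyManagementServer()
data_key = secrets.token_bytes(32)      # per-volume or per-folder key
wrapped = kms.wrap(data_key)            # only the wrapped key is stored in the cloud
assert kms.unwrap(wrapped) == data_key  # recovered at mount time to decrypt the volume
```

The point of the pattern is that a cloud provider's insider who copies the storage gets only ciphertext and wrapped keys; the master key never leaves the (carefully monitored) key management server.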
Another problem with insiders in the cloud is watching for confidential data in motion. DLP would be a solution to this problem in an organization's datacenter, but the situation is more complex in a cloud environment because of the lack of DLP systems in cloud provider networks and the difficulty of separating individual cloud customers' traffic for DLP analysis. This is a problem we're looking into at Palisade Systems.