For a long time in computer security, we have focused on protecting workstations, and rightly so. Viruses, worms, remote access Trojans, and other malware have targeted the end-user workstation, and unfortunately, these attacks continue to be quite successful. A number of recent high-profile data leaks used workstations as the initial point of attack.
However, several other high-profile data leaks have involved attacks on web servers. Citigroup, Barracuda, and now Pacific Northwest National Laboratory (PNNL) were attacked through web servers. This makes me a bit nervous -- I do like to make sure a public-facing web server is hardened and running fully patched software, but attackers have several techniques for finding and exploiting any holes that remain.
One of the problems disclosed today, CVE-2011-2688, involves a SQL injection attack against mod_authnz_external, an Apache authentication module. It is worrisome that such a well-known class of attack succeeds against a security-critical component that may be in use on many web servers. Many other attack techniques, including parameter tampering, pose similar risks to web applications.
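To show the class of bug behind an authentication SQL injection (a generic illustration, not mod_authnz_external's actual code), here is a minimal sketch in Python: the vulnerable version builds its query by string interpolation, so a crafted password bypasses the check, while the parameterized version does not.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_vulnerable(user, password):
    # DANGER: building SQL by string interpolation lets attacker-controlled
    # input rewrite the query. A password of  ' OR '1'='1  turns the WHERE
    # clause into a tautology and bypasses authentication entirely.
    query = f"SELECT 1 FROM users WHERE name = '{user}' AND pw = '{password}'"
    return conn.execute(query).fetchone() is not None

def login_safe(user, password):
    # Parameterized queries keep user input out of the SQL grammar,
    # the standard fix for this class of flaw.
    return conn.execute(
        "SELECT 1 FROM users WHERE name = ? AND pw = ?", (user, password)
    ).fetchone() is not None

print(login_vulnerable("alice", "' OR '1'='1"))  # True -- attacker gets in
print(login_safe("alice", "' OR '1'='1"))        # False -- rejected
```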
Web servers and the web applications running under them are proving to be all too vulnerable. With high-value data accessible in a web server, such as customer accounts at an online banking website, any exploitable vulnerability in the web server or web application can result in significant loss. As the events at PNNL illustrated, even a web server that may not be high-value can still be an entry point for an attacker into more valuable networks and systems.
It seems that web servers need backstops. We need to be able to filter and/or monitor both the requests coming into a web server and the data it returns. And we need to be able to do this in the cloud, with web servers that automatically scale. Something to think about.
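For a rough idea of what such a backstop could look like, here is a minimal sketch of a request-filtering layer as WSGI middleware. The two patterns are hypothetical stand-ins; a real web application firewall would use a maintained ruleset (along the lines of ModSecurity's), not a couple of regexes.

```python
import re
from urllib.parse import unquote_plus

# Hypothetical signatures -- illustrative only.
SUSPICIOUS = [
    re.compile(r"'\s*(or|and)\s", re.I),  # crude SQL injection probe
    re.compile(r"<\s*script", re.I),      # crude cross-site scripting probe
]

class RequestFilter:
    """WSGI middleware that rejects requests matching known-bad patterns."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        # Decode the query string so simple URL-encoding doesn't hide a probe.
        probe = unquote_plus(environ.get("QUERY_STRING", ""))
        if any(p.search(probe) for p in SUSPICIOUS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Request blocked\n"]
        return self.app(environ, start_response)
```

The same wrapper pattern could inspect responses on the way out, which is where the data-leak monitoring would go.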
Thursday, July 21, 2011
Wednesday, July 6, 2011
Cloud Computing and the Insider Threat
Something that hasn't been top-of-mind for me, but remains a threat nonetheless, is that the scope of the "insider threat" changes when the cloud is used for computing and storage.
One of the significant data loss vectors is the "insider threat" where a trusted insider -- either unintentionally or maliciously -- leaks protected information in violation of policy or regulations. In traditional datacenters, the trusted insiders are usually the organization's employees and contractors -- the organization should be able to physically and logically account for every individual that has access to the organization's computers and data. The insider threat is one vector that data loss prevention (DLP) is often deployed to help mitigate.
The situation changes in cloud computing, though. An organization that makes use of cloud computing services, whether SaaS, PaaS, or IaaS, is now using computers and storage that can be accessed by more individuals than just the organization's employees and contractors -- the cloud provider actually owns the servers, networks, and storage and employs personnel and contractors who have administrative access to those components. Now the "insider threat" has suddenly expanded to include a whole new group of people beyond just the original organization's employees.
One mitigation technique for protecting data stored in the cloud from any insider is to encrypt the data. Depending on the operating system, it may be possible to set up volume or folder encryption in which sensitive data can be stored. Unfortunately, encryption key management is not easy -- the best (or only) solution to this problem in the cloud seems to be a key management server that authenticates requests and authorizes the release of encryption keys, with that server itself carefully configured and monitored.
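As a minimal sketch of the encrypt-before-storing idea, using Python's cryptography package (the storage call is a hypothetical placeholder, and in practice the key would come from a key management server rather than being generated locally):

```python
from cryptography.fernet import Fernet

# Assumption: in a real deployment this key is fetched from a key management
# server after authentication; generating it here keeps the sketch runnable.
key = Fernet.generate_key()
f = Fernet(key)

plaintext = b"customer account records"
ciphertext = f.encrypt(plaintext)

# Only ciphertext leaves the organization; a cloud insider who reads the
# stored object sees an opaque token, not the data.
# cloud_store.put("backups/accounts.enc", ciphertext)  # hypothetical call

assert f.decrypt(ciphertext) == plaintext
```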
Another problem with insiders in the cloud is watching for confidential data in motion. DLP would be a solution to this problem in an organization's own datacenter, but the situation is more complex in a cloud environment: DLP systems are generally not available inside cloud provider networks, and separating individual customers' traffic for DLP analysis is difficult. This is a problem we're looking into at Palisade Systems.
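To give a flavor of the content inspection at the core of DLP, here is a toy scan over outbound content. The patterns are illustrative only; real DLP engines add checksum validation (such as Luhn for card numbers), context, and many more data categories.

```python
import re

# Toy detectors for two common categories of protected data.
DETECTORS = {
    "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan(payload: str) -> list[str]:
    """Return the categories of sensitive data found in outbound content."""
    return [name for name, rx in DETECTORS.items() if rx.search(payload)]

print(scan("invoice for 4111 1111 1111 1111"))  # ['credit_card']
print(scan("SSN on file: 123-45-6789"))         # ['ssn']
```

The hard part in the cloud is not the matching itself but getting a vantage point where one customer's traffic can be inspected in isolation.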