Over time, organizations have grappled with shifts in security resulting from changes in where and how data is stored. As PCs entered the workplace, we changed from storing data in a monitored and managed data center to a distributed, networked environment. We are still dealing with effects of this change as we implement policy and controls to identify, manage/protect, and monitor data stored on workstations and servers.
Alan Shimel reported on an interesting idea that General Alexander has proposed: use cloud computing to retake control of U.S. military data by centralizing it in a well-managed cloud. The essence of the approach is to store data in the cloud and access it through thin clients. With strong monitoring and controls implemented in the cloud, data can be protected yet remain accessible wherever and whenever it is needed.
One of my concerns with cloud computing has been that users can sign up for cloud services and store confidential data anywhere in the world without the vetting necessary for corporate services (legal, governance & compliance, data lifecycle management, loss protection, etc.). If an organization (especially one the size of the U.S. military) can create a single, central cloud environment and manage access and data within it, then my view of cloud computing shifts from one fraught with dangers of data loss to one where the cloud can enable strong data security.
Tuesday, March 29, 2011
Tuesday, March 22, 2011
Data Theft as a Business
I had the pleasure of sitting in on Kevin Poulsen's session at RSA 2011: From White Hat to Black - The Curious Case of Cybercrime Kingpin, Max Vision. I also need to read his book, Kingpin, to dig even further into this very interesting story.
A brief recap of the story: after a stint in the joint, Max Vision needed a way to make money to live. Unable to get steady, good-paying work in spite of his skills, he hooked up with a fellow who bankrolled Max's equipment and space needs. Max used borrowed/stolen WiFi access to break into point-of-sale systems (among other things) and steal credit card data. In a twist of irony, Max also hacked criminal credit card sharing sites and stole fresh credit card data from other criminals. Selling this card info, and selling forged cards created using card data, netted Max and his partner significant sums of money.
A significant point in the story about Max Vision is how the cyber criminal underground has developed and how the economics of data theft have become profitable.
This is just one significant example of data theft; other theft continues, including theft of money from bank customer accounts, skimming at ATMs, and the recently disclosed theft of something (exactly what is still secret) from RSA itself.
Many experts acknowledge that data loss is not a question of if, but when. Criminals motivated by economic gain have become a significant threat, which is all the more reason to implement technologies and policies such as access management to reduce exposure, discovery and endpoint protection to catalog and protect data, and access logging and data loss prevention to control and monitor the use of data.
Thursday, March 17, 2011
Cloud Computing and Data Loss Prevention Implementation
I have been studying the Cloud Security Alliance's Security Guidance and other resources for the past few weeks with a focus on placement of Data Loss Prevention (DLP) capabilities.
At this time, it seems that the typical position for DLP in public clouds is in conjunction with other resources in private or public Infrastructure as a Service (IaaS). Data-in-motion DLP systems in a cloud can be positioned logically adjacent to servers deployed in a cloud to monitor and protect information on those servers. Data-at-rest and data-in-use DLP agents can be deployed on servers in the cloud to catalog and protect data on those servers. However, there is nothing significantly new or better in these DLP implementation approaches than what is currently available for traditional servers.
What would be useful in cloud implementations is an API or specification to allow DLP interaction in all three major service models: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). VMware's vShield family of products looks like a step in this direction for the IaaS model: the vShield Endpoint product looks as though it could enable data-at-rest DLP, but the network-oriented vShield products do not appear to provide direct access to network data streams to enable data-in-motion DLP.
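To make the idea concrete, here is a minimal sketch of what such a vendor-neutral hook might look like. Every name in it (`CloudDLPHook`, `read_object`, `quarantine`) is hypothetical; no current cloud provider exposes this interface, and a real specification would need to cover data-in-motion taps and agent deployment as well.

```python
from abc import ABC, abstractmethod

class CloudDLPHook(ABC):
    """Hypothetical interface a cloud provider could expose to DLP tools.
    All names here are illustrative, not part of any real product."""

    @abstractmethod
    def read_object(self, object_id):
        """Return stored object content for data-at-rest scanning."""

    @abstractmethod
    def quarantine(self, object_id, reason):
        """Block further access to an object that violated policy."""

class InMemoryDLPHook(CloudDLPHook):
    """Toy in-memory 'provider' used only to exercise the interface."""

    def __init__(self, objects):
        self.objects = dict(objects)
        self.quarantined = {}

    def read_object(self, object_id):
        return self.objects[object_id]

    def quarantine(self, object_id, reason):
        self.quarantined[object_id] = reason

def scan_for_pattern(hook, object_ids, pattern):
    """A provider-agnostic scanner: read each object through the hook and
    quarantine any whose content contains the sensitive pattern."""
    for oid in object_ids:
        if pattern in hook.read_object(oid):
            hook.quarantine(oid, "matched pattern: " + pattern)
```

The point of the abstraction is that the scanner logic never touches provider internals, so the same DLP engine could run against any IaaS, PaaS, or SaaS platform that implemented the hook.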
I am looking forward to engaging with cloud computing vendors to see if we can create a generalized specification for access to cloud systems for DLP management and remove some of the fuzzy haze enveloping the data.
Updated: Christopher Hoff blogged about the lack of security functionality in cloud services (IDS/IDP, WAF, DLP, etc) about a year and a half ago. Do we dare hope that cloud providers are becoming any more interested in security than they were then?
Tuesday, March 15, 2011
My PC in the Cloud
Something I have been thinking about for a couple of years now is when or whether I'll be able to have everything I have today on my personal PC in the cloud.
My vision is to have all of my applications and data instantly available no matter where I go. I'm not talking about GotoMyPC or Back To My Mac. I want my computer hosted in a reliable cloud so I don't have to manage my own backups or worry about my own hardware (and its failures), and so I can adjust RAM and hard disk space dynamically and take snapshots. Basically, I want all the benefits of the cloud that I can get for corporate infrastructure, but on a personal / family basis. I want the ease of access from thin clients (netbooks, tablets, and anything else) from anywhere on the Internet.
I realize that applications like video editing may not work so well in a cloud environment, and now that I have made the switch to Mac, there are licensing issues as well for the operating system I would like to use. I know many of the applications I use on my PC are available as SaaS offerings, but I don't want to get into a situation where my various forms of data are locked up in a variety of SaaS offerings hosted by a wide variety of companies, making it difficult to manage all the different access controls and account for all the different kinds of data stored in different places.
Security and reliability would become major factors for me if my PC were in the cloud. I depend on firewalls, access controls, and strong passwords to secure my personal systems. In the cloud, I would also need the assurance of privacy, encryption, audit trail, jurisdiction control, portability (if I were to need to change providers), and data lifecycle management -- everything about which I have concerns from a business perspective, but am even more worried about when it comes to my personal data.
Saturday, March 12, 2011
Virtualization and Security
As I am planning to lead my MIS445 class into virtualization and cloud computing in the coming weeks, I am pulling together a list of the fundamental technologies that have come together to enable cloud computing:
- High-Speed Internet
- Public Key Cryptography
- Commodity multi-core CPUs
- Storage-Area Networks
- High-Performance Virtual Machine Hypervisors
- Virtual LANs
- Virtual Private Networks
Monday, March 7, 2011
101 Ways to Pwn a Network: DHCP
The past few class periods, as I have been teaching my MIS445 Networks & Security class about TCP, IP, DHCP, DNS, and routing, I have been digging into some of the threats at these different levels of the network.
Of course, I have mentioned things like DNS cache attacks, but the attack that really generated a lot of discussion was rogue DHCP servers. This attack requires insider access to a network (not hard on an unsecured wireless network or an open university network), but it really makes life difficult for the network administrator when these things pop up.
More often than not, rogue DHCP servers are not maliciously placed in a network. But what about a DHCP server that is maliciously added to a network to pwn all the outbound traffic? Devices like Pwn Plug with the addition of a DHCP server and a passive traffic capture capability would be a heck of a way to listen in on interesting conversations in a network.
When talking to my class, I mentioned tools that can help find rogue DHCP servers. I have heard of dhcpfind and dhcpexplorer, but I haven't used those tools myself. It would be nice to find a tool that runs under Linux or Mac OS X.
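The core of such a tool is simple enough to sketch. In practice you would broadcast a DHCPDISCOVER and sniff the DHCPOFFER replies (which requires raw-socket privileges on the local segment); the detection step is then just comparing the servers that answered against the administrator's authorized list. This is a minimal sketch of that comparison, with the capture step assumed to have already happened:

```python
def find_rogue_dhcp_servers(observed_offers, authorized_servers):
    """Return the DHCP server IPs seen on the wire that are not on the
    administrator's authorized list.

    observed_offers: iterable of server IPs extracted from DHCPOFFER
    packets (e.g., gathered by a sniffer on the local segment).
    authorized_servers: IPs of the legitimate DHCP servers.
    """
    return sorted(set(observed_offers) - set(authorized_servers))
```

Any host that answers a discover but isn't on the list is either a misconfigured device or exactly the kind of malicious box described above.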
Friday, March 4, 2011
Tagging Your Data
If you are protecting sensitive, unstructured data that doesn't follow well-defined formats like personal health information or personal financial information, a common approach for Data Loss Prevention (DLP) systems is to create fingerprints of the data and then check those fingerprints against outbound data or data stored on workstations and servers.
However, fingerprinting tends to require active effort on the part of individuals and administrators to make sure the appropriate data has always been fingerprinted. These are steps the typical end user or administrator doesn't always have time to perform.
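For readers unfamiliar with how fingerprinting works, here is a minimal sketch of one common scheme: hashing overlapping word windows ("shingles") of a document, so that even a copied fragment of the sensitive text can be matched in outbound traffic. Real DLP products use more sophisticated and robust variants; the window size and use of SHA-256 here are illustrative choices.

```python
import hashlib

def fingerprint(text, shingle_words=8):
    """Hash each overlapping window of shingle_words words.
    The resulting set of hashes is the document's fingerprint."""
    words = text.split()
    shingles = (" ".join(words[i:i + shingle_words])
                for i in range(max(1, len(words) - shingle_words + 1)))
    return {hashlib.sha256(s.encode()).hexdigest() for s in shingles}

def contains_sensitive_fragment(outbound_text, stored_fingerprints,
                                shingle_words=8):
    """True if any shingle of the outbound text matches a hash registered
    from a fingerprinted sensitive document."""
    outbound = fingerprint(outbound_text, shingle_words)
    return not outbound.isdisjoint(stored_fingerprints)
```

The maintenance burden is visible even in the sketch: someone has to run `fingerprint` over every sensitive document, and keep doing so as documents are created and edited.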
I have seen a couple of technologies that could help with this situation. They involve a simple pop-up dialog when a user saves or emails a document, and the user is quickly asked whether the data is sensitive. If so, the document or email is tagged with an appropriate label or watermark, and subsequent use or transmission of the document or email can be tracked by DLP systems.
I am intrigued by this approach for a couple of reasons. First, it helps keep data appropriately tagged (assuming a compliant user base, which works in the common case -- people generally want to do the right thing). Second, it involves end users in the decisions about what data is sensitive, and helps keep users aware of the security implications of their work.
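The attraction of tagging, compared with the fingerprinting sketched above, is how cheap the downstream check becomes. Here is a minimal sketch of the idea; the label format and function names are my own invention, not any vendor's implementation:

```python
# Illustrative label a save-time dialog might embed in a document.
SENSITIVE_TAG = "X-Sensitivity: Confidential"

def tag_document(body, user_says_sensitive):
    """Simulate the pop-up dialog: if the user marks the document as
    sensitive, prepend a label that DLP systems can match on later."""
    if user_says_sensitive:
        return SENSITIVE_TAG + "\n" + body
    return body

def dlp_allows_transmission(body):
    """A data-in-motion check is now a cheap string match on the tag,
    rather than content analysis of the whole document."""
    return not body.startswith(SENSITIVE_TAG)
```

Of course, a real system would embed the label in file metadata or a watermark rather than the document body, so that it survives format conversions and can't be trivially stripped.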
I am interested to get feedback on how others feel about this approach.
Wednesday, March 2, 2011
Losing Your Data in the Cloud
In a high-profile event Monday, Google Mail disappeared for a large number of users. Google has since recovered the missing email from backups and blamed the issue on a software problem.
This is a great learning moment: if your data is stored in the cloud, does the provider adequately back up the data? In this case, even though Google keeps multiple copies of data, the software problem caused all online copies of the data to be lost. Fortunately for its users, Google had offline copies of the data and was able to restore from backups.
As a Google Mail user, I was not even aware of the policies and procedures Google has for managing the data in the Google Mail system. I, as millions of others, had signed up for Google Mail because of the free service with lots of storage. I have stayed with Google Mail because of the price and functionality, and it has been reliable for the many years I have used it.
The due diligence and risk management issues are being documented by the Cloud Security Alliance as part of its best practices for evaluating cloud providers. As we've just seen, it is important to evaluate and understand a cloud provider's security and reliability plans and practices before trusting important data to that provider.