Wednesday, December 21, 2011

Web Server Security Checklist


Here is a quick checklist of items I have found to be important in securing and monitoring the security of outward-facing web servers.

Architecture

Typical components surrounding a web server include an external firewall to protect the web server itself from attacks and an internal firewall to protect the internal corporate systems in case the web server is breached.

Hardening

Operating systems and web server software packages often come with additional components that may not be necessary. Rather than leaving unused but potentially vulnerable software available on a web server, it is wise to disable and/or remove any unused software.

Ensure directories and files have appropriate access permissions - does the web server process really need read access to the entire filesystem?
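
As a quick spot check, something like the short script below can flag world-writable files under the document root. It is only a sketch -- it assumes a Unix-like server with content under /var/www -- and a real review would also look at ownership and what the web server's service account can read.

```python
# A minimal sketch (assuming a Unix-like server with content under /var/www)
# that walks the document root and reports world-writable files.
import os
import stat

WEB_ROOT = "/var/www"   # assumption: adjust to your server's document root

def find_world_writable(root):
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.lstat(path).st_mode
            except OSError:
                continue
            if mode & stat.S_IWOTH:     # world-writable bit is set
                print("world-writable:", path)

if __name__ == "__main__":
    find_world_writable(WEB_ROOT)
```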

Remove default system accounts. Ensure accounts with access to the server have appropriate passwords.

If you have a host-based firewall on the web server, limit access to administrative functions (SSH or remote terminal services). Limit outbound network connections from the server to only necessary sites and/or protocols.

Patching & Updating

It is amazing how many web servers I have found that are running operating systems, web server software, or web applications that are long outdated and likely to have substantial vulnerabilities. It's important to stay abreast of known vulnerabilities and vendor patches, and have a working plan to evaluate, apply, and test patches for all the software on the web server as well as the other servers and network devices associated with the web server.

I subscribe to the SANS @Risk Consensus Security Alert mail list to stay informed of vulnerabilities and patches in major operating systems and applications.

Web application firewall

Even the best-run and maintained systems can have latent vulnerabilities hiding in the software and/or configuration. Web application firewalls can help protect against attacks such as SQL injection which are otherwise all too commonly successful.

I have made use of the mod_security Apache plug-in module and rules to protect web servers. Many commercial web application firewall devices are available, and even cloud-based web application firewall services are available.
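
mod_security rules themselves are written in Apache configuration syntax; purely to illustrate the kind of inspection a WAF rule performs, here is a small Python sketch that flags request parameters containing classic SQL injection fragments. This is not how mod_security is configured -- it just shows the idea of examining inputs before they reach the application.

```python
# Illustrative only: a naive check of request parameters for classic SQL
# injection fragments -- the kind of inspection a WAF rule performs. Real
# WAFs such as mod_security use far more robust parsing and curated rules.
import re

SQLI_PATTERN = re.compile(r"(union\s+select|or\s+1\s*=\s*1|;--)", re.IGNORECASE)

def looks_like_sql_injection(params):
    """Return True if any parameter value matches a crude SQLi signature."""
    return any(SQLI_PATTERN.search(value) for value in params.values())

print(looks_like_sql_injection({"id": "42"}))           # False
print(looks_like_sql_injection({"id": "1 OR 1=1"}))     # True
```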

Penetration testing

It's not a bad idea to have a third party examine your web site using penetration testing techniques to check for potential network, operating system, web server, and web application vulnerabilities and misconfigurations.

I have not used it, but I understand the BackTrack bootable Linux CD provides a nice collection of tools to perform penetration testing. Otherwise, there seem to be quite a few consultants willing to perform penetration testing.

Log monitoring

Of course, a busy web site can generate a large amount of log data every day. Tools like awstats can be useful to build an understanding of typical usage loads, top pages, and user demographics.

I have also found it interesting to look at failed requests (4xx responses): they reveal what approaches attackers are using against the site, and reviewing them helps confirm that defenses are working properly.
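
As a rough illustration of that kind of review, assuming Apache-style combined-format access logs (the log path in the sketch is an assumption), a short script can tally which paths draw the most 4xx responses:

```python
# A minimal sketch for summarizing failed requests, assuming combined-format
# access logs (the log path below is an assumption). It counts the paths
# that most often drew 4xx responses.
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) \S+" (?P<status>\d{3})')

def top_4xx_paths(logfile, limit=20):
    counts = Counter()
    with open(logfile, errors="replace") as f:
        for line in f:
            m = LOG_LINE.search(line)
            if m and m.group("status").startswith("4"):
                counts[m.group("path")] += 1
    return counts.most_common(limit)

if __name__ == "__main__":
    for path, n in top_4xx_paths("/var/log/apache2/access.log"):
        print(f"{n:6d}  {path}")
```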

Auditing

If a system seems to be running OK, why bother looking for trouble?

  • What if you have outdated administrative accounts, some of which probably have poor passwords?
  • What if a piece of software was installed at some point that unexpectedly opened access permissions in the filesystem?
  • What if, during a hasty period of diagnosing and resolving a significant issue, permissions were changed in the filesystem or in the web server configuration and never were restored?
  • Or, what if an attacker has gained access to the server and is siphoning data into a hidden directory for later download?

Make time to audit your web server regularly: look for unexpected changes in files, permissions, or access; check logs; and verify installed software and patches.
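
One low-tech way to spot unexpected file changes is to keep a hash baseline and compare against it on a schedule. The sketch below illustrates the idea only -- the paths and usage are assumptions, and dedicated file-integrity tools do this job far more thoroughly:

```python
# A rough sketch of a hash baseline for spotting unexpected file changes.
# Paths and usage are illustrative; dedicated file-integrity tools do this
# far more thoroughly (ownership, permissions, scheduling, alerting).
import hashlib
import json
import os
import sys

def hash_tree(root):
    digests = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digests[path] = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue
    return digests

def compare(baseline_file, root):
    with open(baseline_file) as f:
        old = json.load(f)
    new = hash_tree(root)
    for path in sorted(set(old) | set(new)):
        if old.get(path) != new.get(path):
            print("changed, added, or removed:", path)

if __name__ == "__main__":
    # usage: audit.py baseline <root> <baseline.json>
    #        audit.py check    <root> <baseline.json>
    cmd, root, baseline = sys.argv[1:4]
    if cmd == "baseline":
        with open(baseline, "w") as f:
            json.dump(hash_tree(root), f)
    else:
        compare(baseline, root)
```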

Data loss monitoring and prevention

Data loss monitoring and prevention systems should have a place in high-stakes web services. These systems can monitor the type and quantity of data that is coming out of a web server or the database, and raise alerts or block results that violate rules. These systems can be put in place either in front of the web server or the database server to monitor requests and responses.

Monday, November 14, 2011

Data Loss Prevention: Technology or Strategy?

As often happens in the computer industry, nomenclature is unwieldy and flexible as technologists, sales & marketing, and the rest of the world clash.

My case in point is the phrase "data loss prevention" or DLP. In other articles, I have talked about DLP as a technology -- in that it is used to analyze the content of a document or message, determine whether the content references something confidential or protected in nature, and apply rules or reporting to handle the content. As the concept of DLP developed over the last decade, the industry struggled to find an appropriate phrase to define it: content monitoring & filtering, content analysis, deep packet inspection, and others were used, but the industry and analysts settled on data loss prevention.

Many companies are marketing "data loss prevention" in relation to their technologies, but not in the context of analysis of document content. Instead, their approaches include building a wall around all corporate data (such as on a mobile device, or in a cloud-based document-sharing service), or providing some regular expression matching for message content. This is well and good, but I would suggest these technologies fall under the larger strategy of information protection rather than being specifically about "data loss prevention".

This goes to the heart of the matter: when we build true data loss prevention systems, the intent is to protect confidential information rather than just bits and bytes of raw data. Under the fundamentals of information theory, data is just bits and bytes, but information is found where there is entropy, or value, in the data. This is what distinguishes data loss prevention technology from other data protection technologies, and perhaps the better phrase for the technology would be "information loss protection."

Practically, though, we are probably stuck with the labels that have been adopted. So, I suppose we can accept a variety of technologies under the strategy of data loss prevention, including the technology of data loss prevention itself. Unfortunately, this will continue to be confusing to those inside and outside of the industry and troublesome for sales and marketing.

Friday, August 12, 2011

Five Stages of Cloud Acceptance

Denial: We'll never put anything in the cloud because of security/reliability/performance/etc.

Anger: You already put WHAT in the cloud? How are we going to do backups/switch providers/manage identity/etc.???

Bargaining: OK, we'll move X into the cloud if/when the cloud becomes secure/reliable/etc.

Depression: The CFO/CEO/etc. wants us to start using cloud to save money/reduce costs/expand functionality. We can't use the cloud. What about my job running the data center? What about our bandwidth? What about PCI DSS/HIPAA/GLBA?

Acceptance: It works, I can do more to enable my company's business, and reduce capital expenditures. Let's put everything into the cloud!

Seriously, I have had some of these reactions myself. I hear some of these reactions from people when we talk about using cloud services and realize there truly is a road to acceptance for many people.

Changing Face of "Spam" Email

As a network engineer involved in bringing up some of the first Internet connections in the upper Midwest in the late 1980s and early 1990s, I also managed email systems in the 1990s as spam email started becoming a nuisance. In the past decade, spam has been more than a nuisance - email systems must have effective spam filters to keep email usable for end users.

There is an interesting trend I see now - I am getting a fair bit of relevant business-related marketing email in my inbox. The amount of "online pharmacy" spam is way down, but I still get a fair amount of complete junk, including a lot of Cyrillic and Mandarin spam that is completely unintelligible to me. Fortunately, my company's spam filter, with up-to-date SpamAssassin rule lists and a good blacklist, is doing a good job discarding and classifying the useless spam while allowing through the reasonable marketing queries (I think).

A few years back, the sales team at my employer emailed potential customers asking if they could set up meetings to introduce the company's software - not an unusual email message, especially nowadays. One particular recipient hit the roof and replied with a rant worthy of a response to the first massive Usenet spam from the green card lawyers back in the day.

Are people's attitudes changing about spam? Is there an increasing acceptance of reasonable marketing-type contact via email?

Thursday, August 11, 2011

Security Technology Musings

Each security technology that comes along has its set of "use cases" -- that is, it improves confidentiality, integrity, or availability for certain uses.  Trying to apply that security technology outside of its useful situations results in either a false sense of security or complete failure.

For example, full disk encryption is a useful security technology intended to keep the entire contents of a disk drive relatively safe from an attacker who might steal the physical disk drive (or the system in which it is installed, such as a laptop).  However, when the computer is in operation, full disk encryption has nothing to do with whether files can be accessed -- that is the function of the access control technology built into the operating system.

When we began building Data Loss Prevention (DLP) some years ago, my idea was that content analysis (looking at the textual content of a document) was a powerful way to determine whether a document should be shared outside of an organization.  However, the documents that would be visible to the DLP system for analysis would depend on a number of factors: logical placement of the DLP functionality in an organization's computing system, whether the DLP system would be able to see documents as plaintext, and how an adversary might try to circumvent the system.

As we have further developed DLP technology and the industry has settled on standard implementations (data-in-motion, data-at-rest, data-in-use), customers have become comfortable with the functionality and capability of DLP systems. We're finding that DLP is a very useful tool for our customers -- helping significantly reduce exposure of confidential information and improving standing in risk & compliance audits.  It's become one part of the security management arsenal.

Friday, August 5, 2011

Are Anti-Virus and a Firewall Enough?

I thought after all the commotion from the many significant data breaches of the past several months that data security would be top-of-mind at nearly every company. Perhaps people outside the information security industry have become tired of the breach news, or perhaps the lesson didn't sink in. Maybe more likely is the idea that "we haven't been hit yet, so we don't need more security yet."

Computer viruses were such a big problem in the late 80's and 90's (and still today) that companies became accustomed to buying anti-virus software.

The Internet was such a wild and woolly place that companies didn't dare connect their LANs to the 'net without a firewall of some sort to keep the outside world from instantly pwning everything.

People in the information security industry know these two main tools, anti-virus and firewalls, have significant limitations.  Anti-virus tools have limited effectiveness in the era of morphing malware. Firewalls often are configured to allow HTTP/HTTPS (web traffic) and SMTP (email traffic) without any limits, and everyone always has browsers and email clients running. The result is that attackers have a fairly easy time exploiting problems with browsers, email programs, and the users themselves.

Today, organizations need deeper defenses to handle these problems. Intrusion Detection Systems (IDS/IPS), Data Loss Prevention (DLP), patch management, web filtering, and Security Information & Event Management (SIEM) are the important systems to have in place in addition to firewalls and anti-virus.

Web servers need to have a Web Application Firewall (WAF) in front of them to protect against attacks on the applications running on the web servers. If you have a good hosting provider for your web server, you may already have a WAF protecting your web server.

If you don't have these systems in place, you can prioritize based on an analysis of your organization's risks.

Thursday, July 21, 2011

Web Servers as an Attack Vector

For a long time in computer security, we have been focused on protecting workstations, and rightly so.  Viruses, worms, remote access Trojans, and other malware have targeted the end-user workstation, and unfortunately, the attacks continue to be quite successful.  A number of recent high-profile data leaks have occurred using workstations as the initial point of attack.

However, several other high-profile data leaks have involved attacks on web servers as the point of entry.  Citigroup, Barracuda, and now Pacific Northwest National Laboratory (PNNL) were attacked through web servers.  This makes me a bit nervous -- I do like to make sure a public-facing web server is hardened and running software that is fully patched, but there are several techniques attackers can use to find and take advantage of any holes in the server.

One of the problems I saw disclosed today, CVE-2011-2688, involves a SQL injection attack against the mod_authnz_external module, an Apache authentication module.  It is worrisome that a well-known attack succeeds against this security-critical component, which may be in use on many web servers.  Many other attacks, including parameter tampering, target web applications as well.

Web servers and the web applications running under them are proving to be all too vulnerable.  With high-value data accessible in a web server, such as customer accounts at an online banking website, any exploitable vulnerability in the web server or web application can result in significant loss. As the events at PNNL illustrated, even a web server that may not be high-value can still be an entry point for an attacker into more valuable networks and systems.

It seems that web servers need backstops.  We need to be able to filter and/or monitor requests coming into a web server, and to filter and/or monitor data returned by a web server.  And, we need to be able to do this in the cloud with web servers that automatically scale.  Something to think about.

Wednesday, July 6, 2011

Cloud Computing and the Insider Threat

Something that hasn't been top-of-mind for me, but remains a threat nonetheless, is that the scope of the "insider threat" changes when the cloud is used for computing and storage.

One of the significant data loss vectors is the "insider threat" where a trusted insider -- either unintentionally or maliciously -- leaks protected information in violation of policy or regulations. In traditional datacenters, the trusted insiders are usually the organization's employees and contractors -- the organization should be able to physically and logically account for every individual that has access to the organization's computers and data.  The insider threat is one vector that data loss prevention (DLP) is often deployed to help mitigate.

The situation changes in cloud computing, though.  An organization that makes use of cloud computing services, whether SaaS, PaaS, or IaaS, is now using computers and storage that can be accessed by more individuals than just the organization's employees and contractors -- the cloud provider actually owns the servers, networks, and storage and employs personnel and contractors that have administrative access to those components.  Now the "insider threat" has suddenly expanded to include a whole new group of people beyond just the original organization's employees.

One mitigation technique used to protect data stored in the cloud from any insider is to encrypt the data.  Depending on the operating system used, it may be possible to set up volume encryption or folder encryption on which sensitive data can be securely stored.  Unfortunately, encryption key management is not easy -- it seems the best (or only) solution to this problem in the cloud is to use a key management server to authenticate and authorize encryption keys, and then configure and monitor that key management server carefully.

Another problem with insiders in the cloud is watching for confidential data in motion.  DLP would be a solution to this problem in an organization's datacenter, but the situation is more complex in a cloud environment because of the lack of DLP systems in cloud provider networks and the difficulty of separating individual cloud customers' traffic for DLP analysis.  This is a problem we're looking into at Palisade Systems.

Monday, June 27, 2011

Fully-Functional Data Loss Prevention

Since Data Loss Prevention (DLP) became a known technology in the computer security arena a few years ago, a number of vendors of existing non-DLP security products added basic DLP-like features to enable detection of some common private or confidential information.  However, a complete DLP implementation involves more than just regular expressions to match patterns in text in, say, email messages.

Certainly, email is a significant vector by which data loss occurs.  More generally, the DLP industry terms data traversing the network as Data in Motion.  However, there are many more protocols than just email, not the least of which include web-based email services, such as Google Mail, and social media services, such as Facebook, that could also be data loss vectors.  A complete DLP implementation will likely be able to work with a number of common network protocols to manage Data in Motion.

DLP also manages data in two other important situations, Data in Use and Data at Rest.  Data in Use DLP can manage data used on a workstation, such as monitoring data being copied to a USB flash drive.  Data at Rest DLP can inventory and manage the private and confidential data stored on workstations' and servers' hard drives.

The ways in which most DLP systems are able to discover protected information extend far beyond basic regular expressions.  Common approaches include pre-packaged sets of terms, database fingerprints, file fingerprints, special code to match data like credit card numbers, and more. I previously wrote an article on Classes of Protected Information and DLP that goes into much more detail on this topic.

In addition to managing protected data in the scenarios of Data in Motion, Use, and Rest, and using multiple approaches to finding protected data, DLP systems also offer sophisticated configuration, reporting, alerting, and case management services.  There may be situations where certain groups of users are allowed to work with certain kinds of confidential information while others are not -- a DLP system might be configured to monitor such information use for the privileged users and block use by other users.  The depth of reporting and alerting capabilities offered by a DLP system can make a DLP installation more useful by providing information ranging from summaries to detailed violation information as needed for management and compliance reports.  Finally, DLP case management tools can enable rolling up multiple incidents into a consolidated case that can be managed as necessary to resolution.

In summary, a DLP system is a significant addition to an organization's data security arsenal.

Tuesday, June 7, 2011

It's 10:00pm - Do You Know Where Your Data Is?

Data can be stored in so many places and be so vulnerable to loss or exposure.  The obvious risk and probability of loss for protected data stored on devices like laptops often motivates security staff to make improvements in this area.  Many people have an "a-ha moment" when they see how Data Loss Prevention (DLP) discovery agents can find and report confidential or protected data stored in unexpected places.

It's good practice to inventory where and how confidential / protected data is stored, create policy that defines where and how such data should be stored, then move towards the goal defined by the policy and monitor progress.   (Helpful side benefits of this process include improving your backup and archive coverage of protected data, reducing duplication of data, and assisting your business continuity planning.)

The initial inventory of protected data can be overwhelming -- data can be dispersed over all the personal workstations and laptops in the entire company and in the oddest nooks and crannies of servers.  But it's good to know where your organization stands with regard to protected data, and what your biggest points of risk might be.  If you found confidential financial data being stored on laptops that don't have disk encryption, maybe that's your prime starting point.  If you found multiple copies of confidential data stored on a server, maybe it's just a matter of consolidating the data and keeping employees better informed about what location to use on the server for that data.

When it comes to writing your protected data storage policies, keep flexibility in mind.  Mobility is a big factor in employee computing today, so if important data on laptops is common, then maybe a disk encryption solution for laptops is needed rather than disrupting employees' work by requiring them not to keep data on laptops.

When your protected data storage policy is defined, then it's time to move toward it.  Education will be important so employees understand why and how this process is happening.  Some time & effort will be required to implement the changes, and perhaps some new software will be required for encryption.

As progress is made, DLP discovery software can be used to measure and monitor the progress, and watch for significant deviations from the policy that need to be addressed.

Friday, June 3, 2011

Cloud Computing and Protecting Confidential Information

A couple of months ago, I talked about the implementation of DLP in cloud computing environments.  Since then, I have seen a few examples of security-oriented firms, such as Tripwire and enStratus, working with cloud computing vendors to provide internal compliance and validation.

Meanwhile, we have seen several large-scale data breaches, including numerous attacks on Sony, that involve attacks through web servers.

A significant use case for cloud computing is to provide scalable web services, so we have an interesting and significant security intersection between deployments of web servers (often with vulnerabilities) in the cloud, and the need for web application firewall (WAF), data loss prevention (DLP), and intrusion detection/prevention (IDS/IPS) to protect the web servers and the information to which they provide access.

There are some difficult problems with protecting outward-facing cloud-based web servers, though.  It might not be feasible to scale WAF, DLP, and IDS/IPS systems alongside the web servers.  It may be challenging to be able to monitor and/or intercept web traffic -- especially SSL web traffic -- to protect against attacks and data loss.

A solution to this problem might be to incorporate WAF, DLP, and IDS/IPS technology into the web servers themselves, so as the web servers are scaled, the protection automatically scales also.

Wednesday, May 25, 2011

Insidious Insiders: Bank of America

When I talk or write about inappropriate confidential information disclosure, I often point out that data loss prevention (DLP) systems most commonly help reduce the everyday mistakes by well-intentioned employees just trying to do their jobs. A DLP system also helps discover a malicious insider gathering or passing confidential information to outsiders. Regardless of intent, a good DLP system can help administrators notice a trend of confidential leaks and help build a case file for action with regard to a problematic insider.

A story I saw today describes a problem at Bank of America that has been under investigation for a while: an apparently malicious employee, who had access to "personally identifiable information such as names, addresses, Social Security numbers, phone numbers, bank account numbers, driver's license numbers, birth dates, e-mail addresses, family names, PINs and account balances," allegedly passed this information to criminals. The estimated resulting direct financial loss is $10 million.  Indirect losses, including employee time spent investigating the problem, the cost of credit report monitoring for affected customers, revisiting policies and controls, and diminished brand value, may be significant as well.

A DLP system is one of the best practices that a business can put into place to help track and prevent data breach events. If you have a DLP system in place, make sure it is correctly configured, installed in the correct locations in your network, servers, and clients, and monitored. (It is highly likely that Bank of America has a DLP system in place, but I have no knowledge as to whether information from a DLP system helped with the investigation of this case.)


Other best practices for protection of information include:
  • Limiting the amount and scope of information available to employees to that necessary to do their jobs. Often, employees are given increasing access to information over their tenure, and it's a good idea to review access to make sure potential for problems is limited.
  • Logging information access and reviewing the logs for unusual patterns. A Security Event Manager (SEM, also known as SIEM) can help with this by making it possible to centrally manage and review information from servers.
  • Limiting network access for workstations and servers. Servers should generally not be using protocols like Internet Relay Chat or accessing random web sites. A network protocol manager or firewall can be configured to prevent unexpected network use. Unexpected use of web sites or network protocols from servers might be indicative of an intrusion that should be investigated.
With good practices and vigilance, you can reduce the risk posed by malicious intent.

Friday, May 20, 2011

Classes of Protected Information and DLP

Data Loss Prevention (DLP) systems have to deal with a variety of formats of data and identify protected data in those formats.  In general, protected information falls into these formats:
  • Unstructured text - as found in text documents - including various types of information:
    • Corporate proprietary information or trade secrets
    • Personal health records
    • Personal financial records
    • Personal identifying information
  • Structured data - as found in spreadsheets, tables, database output, and CSV files
To deal with these different formats of protected information, a variety of approaches are used in a DLP system.

For corporate proprietary information, document fingerprinting is the predominant approach to identifying parts or complete copies of proprietary documents.  This requires the administrator to register proprietary documents with the DLP system, and then the DLP system can match fragments or wholesale copies of the proprietary documents.
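
Implementations vary by vendor, but one common flavor of fingerprinting hashes overlapping chunks ("shingles") of a registered document and then looks for matching chunk hashes in outbound content. A toy sketch of the idea -- the chunk size and threshold are arbitrary, not any particular product's algorithm:

```python
# A toy illustration of chunk-based document fingerprinting: hash overlapping
# word "shingles" of a registered document, then see what fraction of a
# candidate text's shingles match. Real DLP fingerprinting is considerably
# more sophisticated (normalization, sampling, file-format handling).
import hashlib

def shingles(text, k=8):
    words = text.lower().split()
    return {hashlib.sha1(" ".join(words[i:i + k]).encode()).hexdigest()
            for i in range(max(len(words) - k + 1, 1))}

def register(document_text):
    return shingles(document_text)      # store only the hashes, not the text

def match_fraction(fingerprints, candidate_text):
    candidate = shingles(candidate_text)
    return len(fingerprints & candidate) / max(len(candidate), 1)

if __name__ == "__main__":
    secret = "full text of a registered proprietary document would go here " * 5
    outbound = "an outbound email body to be checked goes here"
    print("match fraction:", match_fraction(register(secret), outbound))
```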

Another approach that can be used for proprietary documents is to embed tags in the documents, such as "Company Confidential", and then add a simple rule to the DLP system to watch for that tag.  However, this depends on corporate users applying the correct tags to the documents, and is easy for a malicious insider to circumvent, for example, by simply removing the tag before transmitting the document to an unauthorized recipient.

For data like personal health information (PHI) or personal financial information (PFI), several approaches (or a combination of approaches) are typically used.  A combination of search terms can be used to determine whether data contains information referring to a particular individual or group of individuals, plus whether the data contains significant information about those individuals.  For example, an email message from a bank containing a customer's account number, name, and account balance might be considered information protected under the Gramm-Leach-Bliley Act (GLBA).

Another approach to PHI and PFI is to use information from a corporate database, such as account numbers and customer names, in the DLP system to search for matches.  If an account number and associated customer name turns up in an email message, the message might be considered to contain information protected under GLBA.

A third approach, specific to personal financial information, is to look for credit card information.  Credit card numbers follow a standard format and are assigned in specific ways, so it is possible to look at a sixteen-digit number and determine with a high degree of accuracy whether it is likely a VISA or MasterCard credit card number.
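
A minimal sketch of that kind of check, using the Luhn check digit built into card numbers plus the common VISA (4) and MasterCard (51-55) prefixes:

```python
# A small sketch of credit card candidate checking: the Luhn check digit plus
# issuer prefixes (4 = VISA, 51-55 = MasterCard) eliminate most random
# sixteen-digit numbers. Production detectors add surrounding-context checks.
def luhn_ok(digits: str) -> bool:
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:              # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def looks_like_card(number: str) -> bool:
    digits = "".join(c for c in number if c.isdigit())
    if len(digits) != 16 or not luhn_ok(digits):
        return False
    return digits.startswith("4") or digits[:2] in {"51", "52", "53", "54", "55"}

print(looks_like_card("4111 1111 1111 1111"))   # True -- the well-known VISA test number
print(looks_like_card("1234 5678 9012 3456"))   # False -- fails the Luhn check
```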

For personal identifying information, one approach is to look for national identification numbers, state driver's license numbers, or account numbers.  In the United States, the Social Security Number (SSN) is often used (and abused) for identification and authentication in financial and health contexts, and as such has gained status as a protected piece of information.  Unfortunately, the SSN format was developed without check digits or embedded validators, so it is easy for a DLP system to mistake a number in the form 123-45-6789 for an SSN.
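
A rough sketch of why that is: an SSN rule is essentially a pattern plus a few never-issued ranges, so surrounding context (words like "SSN" or "Social Security") has to do much of the work in practice.

```python
# Illustrative only: with no check digits, an SSN detector is little more than
# a pattern plus a few never-issued ranges, so strings that merely look like
# SSNs will match. The exclusions shown here are illustrative, not exhaustive.
import re

SSN_RE = re.compile(r"\b(\d{3})-(\d{2})-(\d{4})\b")

def possible_ssns(text):
    hits = []
    for area, group, serial in SSN_RE.findall(text):
        if area in ("000", "666") or area.startswith("9"):
            continue                    # areas never issued as SSNs
        if group == "00" or serial == "0000":
            continue                    # group/serial values never issued
        hits.append(f"{area}-{group}-{serial}")
    return hits

print(possible_ssns("Order 123-45-6789 has shipped"))   # ['123-45-6789'] -- a false positive
```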

As for structured data, DLP systems can identify protected contents in a couple of ways.  One is to write rules for the DLP system that match the format of data typically used in a company, such as forms that are often used for things like customer orders.  Another approach is to use information from a corporate database, such as account numbers and customer names, in the DLP system to search for matches.

These formats and approaches cover the majority of the ways I have seen protected information stored and transmitted, and the ways in which DLP systems can help identify and protect that data.

Tuesday, May 17, 2011

Bouncing Through the Cloud

A Bloomberg report over the weekend referenced an unnamed source as saying that Amazon cloud resources were used in the breach of the Sony Playstation Network.  Specifically, Amazon's cloud infrastructure was not compromised, but instead used as a "relay" for the attacker to hide his/her origin.

An article on Reuters makes an (IMO) unsubstantiated claim that the attack on Sony spells doom for cloud computing.  My response is that, whether or not cloud computing had anything to do with this, Sony simply had vulnerable software and apparently had insufficient controls and management in place to detect and respond to security issues.  Poor security and controls are mostly unrelated to cloud technologies -- yes, there is a possibility of attacks on the hypervisor in shared infrastructure, among other things -- but none of the recent significant breaches has involved vulnerabilities in cloud computing.

What I see as a more significant exposure in cloud computing is the extent to which confidential data is being stored in the public or hybrid cloud and being provided via cloud-based servers to end users over the Internet without sufficient monitoring and controls in place.  The glaring security deficiencies in cloud computing right now are the lack of visibility and the lack of security functionality that we have in private data centers, including network traffic analysis, intrusion detection systems (IDS), data loss prevention (DLP) systems, and audit and logging systems.

We're working at Palisade Systems to improve the security controls available in cloud computing. Palisade has virtual DLP appliances available for VMware cloud environments, and will have more good cloud security products coming up.

Wednesday, May 11, 2011

Virtualization and Data Loss

Well, it had to happen to me eventually.  A physical server running VMware ESXi crashed and I lost a set of virtual servers that I had moved to it.

It seemed to result from a power hiccup.  Nearly everything important in the server room is on a UPS, except for this system.

This failure mode was new to me: VMware ESXi would not finish its boot, but complained about an invalid file (sorry, exact filename escapes me) and stopped.  (It looked an awful lot like a Windows boot failure I've seen in the past where a corrupted registry hive file prevented Windows from booting!)  I had to perform a VMware ESXi recovery installation, and that resulted in the ominous warning that one of my filesystems had an invalid partition table.

This particular VMware server has two VMFS filesystems on it (two separate hard drives to improve I/O performance for the VMs), and the second of the two filesystems was toast.

I hadn't considered the virtual machines on this VMware server to be irreplaceable, but they were valuable.  It took a couple of days of work to rebuild one of the lost VMs.  Another of the lost VMs caused a troublesome cascaded failure: it provided an infrequently-used web proxy whose loss caused unexpected software update failures elsewhere, and that took some time to diagnose as well.

In summary: I wish I had enough disk space everywhere to have backups of all the virtual machines, and I wish I had a good way to use apcupsd (or equivalent) to shutdown ESXi servers nicely on power failures.

Wednesday, May 4, 2011

Data Loss Prevention and Mobility

At Palisade we are often asked how to protect data from loss when your employees and/or partners all have access to your corporate private/privileged data through handy little gadgets like iPhones.

The problem we are finding is that gadget vendors have not provided the hooks into the devices that would let us do DLP on the gadgets directly.  In fact, software on iOS devices is intentionally quite isolated to prevent any application from accessing information that belongs to another application, such as email messages or stored PDFs.

Enter some pretty cool software from Whisper Systems for Android systems.  WhisperCore looks very intriguing:
WhisperCore integrates with the underlying Android OS to protect everything you keep on your phone. This initial beta features full disk encryption and basic platform management tools for Nexus S phones. WhisperCore presents a simple and unobstrusive interface to users, while providing powerful security and management APIs for developers. 
I will be looking into this more deeply :-)  Maybe this would encourage Apple to provide hooks for similar software in iOS.

Friday, April 29, 2011

The Bigger They Are...

Rumblings started a week ago as the Sony Playstation Network went offline, and stayed offline.  I wasn't initially very concerned about this, but have been encouraged to look into it now that more information is available, and I have become much more concerned.

From the ominous note at http://us.playstation.com/news/consumeralerts/#us:
Although we are still investigating the details of this incident, we believe that an unauthorized person has obtained the following information that you provided: name, address (city, state, zip), country, email address, birthdate, PlayStation Network/Qriocity password and login, and handle/PSN online ID. It is also possible that your profile data, including purchase history and billing address (city, state, zip), and your PlayStation Network/Qriocity password security answers may have been obtained. If you have authorized a sub-account for your dependent, the same data with respect to your dependent may have been obtained. While there is no evidence at this time that credit card data was taken, we cannot rule out the possibility. [emphasis supplied]
 If you will recall, I was concerned about the identity theft / social engineering dangers from the Epsilon data breach.  This breach is much more serious because of the scope of information lost: everything necessary for successful identity theft, plus the potential for online identity takeover and even the possibility of credit card disclosure.  Reports have placed the record count at between 70 and 80 million!

The quantity of the confidential information involved here is stunning, and for an attacker to be able to obtain this volume of information in the matter of a couple of days seems extreme.  It would seem prudent for a company with this size and scope of a database to be using database access monitoring and data loss prevention systems.  It will be interesting to find out whether they actually did have essential business intelligence, monitoring, and policy enforcement systems in place.

Wednesday, April 27, 2011

Surprising Data Loss Vectors

With the 2011 Verizon Business Data Breach Investigation Report and breach after breach after breach recently in the news, you might be thinking that information security is all about malevolent actors right now.  The "black hats" seem to have become very good at targeting, infiltrating, and extracting valued data from desirable targets.

An article yesterday by Ellen Messmer at Network World spotlights another important issue in information security today: business partners sharing information insecurely.  When Lutheran Life Communities (LLC) installed Palisade Data Loss Prevention (DLP) systems, it found that business partners were transmitting personal health information (PHI) to LLC insecurely.  LLC has chosen a practical response: warning business partners of the problem.

In my experience, this is not an isolated problem.  In the past, the DLP vendor community has highlighted the "insider problem" where employees -- usually just trying to do their jobs -- end up using poor business practices and cause frequent exposures of personal financial information (PFI) and/or personal health information, the two most highly-regulated types of personal identifying information (PII).  However, in numerous DLP installations I've observed, I have seen data inbound into organizations containing PFI and PHI violations, such as unwary customers sending credit card information in unsecured email messages into companies to request purchases.  I have also seen medical facilities where PHI was unexpectedly being transferred insecurely in and out of the organization, just as LLC noted in Ellen's article.

We have become aware of the risks of data loss.  Governments have begun enforcing data protection requirements.  We have developed policies and tools that have significantly raised the standards for protecting confidential information.  Let's put these tools and policies to good use.

Tuesday, April 19, 2011

Verizon Data Breach Investigations Report (DBIR) 2011

The Verizon Data Breach Investigations Report (DBIR) 2011 is hot off the virtual presses!  Rich Mogull has a great first-pass analysis of the DBIR here.

The data is showing amazing variance year-over-year: the number of lost records has plummeted, but the number of breaches is growing.  Attacks on small to medium businesses are rising, and it appears criminals are focusing their efforts on quality rather than quantity of data.  Cyber crime has become a serious business, and criminals are following the money and paths of least resistance.

I attended the Verizon Business 2010 Data Breach Report session by Brian Sartin at RSA2011 and some of his key insights were:
  • Crimes are becoming commoditized and repeated
  • The number of records taken by criminals has dropped year-over-year since 2008 -- I am seeing the records that ARE taken are more targeted and valuable!
  • In 2010, internal agents involved in breaches jumped significantly -- including recently-terminated employees
  • 90% of cases involved data stored in places management were unaware of (e.g., unmanaged servers) -- speaks to the need for DLP discovery and endpoint

I think most of these trends have continued in 2011, so I'm curious to see what the DBIR has to say.

Guy

Tuesday, April 12, 2011

Barracuda Data Breach

Barracuda Networks, a computer security company whose ads you can't miss if you ever visit an airport, fell victim to a security breach over the weekend.  Barracuda has plenty of company in the computer security industry -- RSA and Comodo were also recent victims of security breaches.

Of course, people have been quick to excoriate security companies for security failures.  Alan Shimel and Bill Brenner have written good articles about the folly of thinking that this couldn't happen to any company.  Based on the number of significant breaches in the past few months, security companies may be targets right now for attacks.

Something to commend about the recent breaches: companies have been fairly responsible in reporting what has happened.  It can't be pleasant to announce a breach, but it is important to own up to what happened, and we can all learn lessons from what was vulnerable and how vulnerabilities were exploited.

Something else good: companies have been able to determine what happened and how using data from their monitoring and logging systems.

I hope that one of the lessons we learn from these breaches is to layer security technologies and compartmentalize subsystems so that failure of any one point does not result in exposure of the entire system.  Unfortunately, today we often have such complex systems that it is hard to make sure we have sufficient layers to manage the risks.

Wednesday, April 6, 2011

Chokepoints in a Network

A recent post on the Firemon blog got me thinking again about the arguments for and against firewalls.  Public cloud computing (IaaS and PaaS) has changed the situation -- strong firewalls sometimes can't be placed in front of every single server, and this seems to align with what I know of the Jericho Forum's positions on network security.  I still like firewalls as a tool where possible, and here is why.

When implementing servers, even systems that do not face public networks, one of the hardening steps I like to take is to implement as much access control and monitoring as I can.  Among the things I do are enabling on-host packet filtering so that only necessary network services are exposed, ensuring that only certain user groups are allowed to authenticate to the system, enabling logging and monitoring, and turning off unnecessary services.  This is good security posture at the individual host level, but it is only one or two layers of the security onion.

Implementing firewalls and DMZ areas in a network enforces security boundaries and forces network designers to think about vulnerabilities and security profiles of different systems involved in a datacenter.  By the nature of firewalls, this enforces chokepoints in a network architecture.  Systems with different services and security profiles ought to be isolated in an organization's network for better control, monitoring, and management.

Tuesday, April 5, 2011

Epsilon Data Breach

There have been a number of data loss events in the past month, but the Epsilon data breach disclosed over the weekend has been most interesting.

Epsilon manages email-based marketing services for a number of large companies, so it had name and email address information for customers of the client companies.  This information was obtained by attackers.  While some have said the nature of the information means the breach is not significant, my immediate response to my peeps on Facebook was:
Yow -- this could enable some serious spear-phishing in the future :-(
Whoever has this information from Epsilon could simply use it for targeted spam.  More troubling, the attacker could spend some time working over the data with tools like MapReduce and mine profiles for customers to enable very targeted phishing email attacks.

In what I would say is a good, proactive response, Epsilon and its clients have been very quick to contact affected customers about the issue and let people know about the dangers of the information leak.  If there are any positive results from this breach, it should improve the security awareness of the average consumer and make companies think even more seriously about data loss prevention & database access monitoring.

Friday, April 1, 2011

From Clusters to Clouds

Over a decade ago, I was researching cluster computing -- tying together a large number of commodity computers to work together on a single, large task using a high-speed network as the interconnect.  The environment was fundamentally similar to what we see in cloud-computing data centers today -- large numbers of commodity computers tied together with high-speed networks to provide a service.

At the time, I was wondering what security issues would arise in clusters.  Since a cluster was typically dedicated to a single user's task back in the day, it wasn't clear to me what access control and data security would be necessary beyond making sure the assigned user was the only one with access to the cluster at a time.  By the time schedulers arrived to divide clusters so multiple jobs could run at once, I wasn't involved in cluster computing anymore and didn't see what was happening with security as clusters evolved into grids and clouds.

Today, the security issues are quite visible in what has become cloud computing.  Using an outside vendor's infrastructure (cloud) has become compelling for a number of reasons - scalability, elasticity, capital cost reductions, and more.  Visibility, compliance, jurisdiction, patch management, and other issues have become prominent in such a shared environment.

Looking back, as mainframes grew from single-user, batch processing systems to multiuser, timesharing systems, mainframe operating systems gained controls and visibility to enable secure multiuser operation.  Nowadays, groups like the Cloud Security Alliance are driving to improve the security and viability of cloud computing.

Tuesday, March 29, 2011

Re-Creating the Datacenter Via the Cloud

Over time, organizations have grappled with shifts in security resulting from changes in where and how data is stored.  As PCs entered the workplace, we changed from storing data in a monitored and managed data center to a distributed, networked environment.  We are still dealing with effects of this change as we implement policy and controls to identify, manage/protect, and monitor data stored on workstations and servers.

In an interesting perspective, Alan Shimel reported on an idea that General Alexander has proposed to use cloud computing to re-take control of U.S. military data by centralizing it in a well-managed cloud.  The essence of the approach is to store data in the cloud and use thin clients to access the data.  By implementing strong monitoring and controls in the cloud, data can be protected yet accessed wherever and whenever needed.

One of my concerns with cloud computing has been that users can sign up for cloud services and store confidential data anywhere in the world without having gone through the vetting necessary for corporate services (legal, governance & compliance, data lifecycle management, loss protection, etc).  If it is conceivable for an organization (especially the size of the U.S. military) to create a single, central cloud environment and manage access & data to that cloud, then that turns my perspective of cloud computing from being fraught with dangers of data loss into one where the cloud can enable strong data security.

Tuesday, March 22, 2011

Data Theft as a Business

I had the pleasure of sitting in on Kevin Poulsen's session at RSA 2011: From White Hat to Black - The Curious Case of Cybercrime Kingpin, Max Vision.  I also need to read his book, Kingpin, to dig even further into this very interesting story.

A brief recap of the story: after a stint in the joint, Max Vision needed a way to make money to live.  Unable to get steady, good-paying work in spite of his skills, he hooked up with a fellow that bankrolled Max's equipment and space needs.  Max used borrowed/stolen WiFi access to break into point-of-sale systems (among other things) and steal credit card data.  In a twist of irony, Max also hacked criminal credit card sharing sites and stole fresh credit card data from other criminals.  Selling this card info, and selling forged cards created using card data, netted Max and his partner significant sums of money.

A significant point in the story about Max Vision is how the cyber criminal underground has developed and how the economics of data theft have become profitable.

This is just one significant example of data theft; other theft continues, including theft of money from bank customer accounts, skimming at ATMs, and the recently-disclosed theft of something (exactly what is still secret) from RSA itself.

Many experts acknowledge it's not a question of if, but when, data loss will happen.  Criminals motivated by economic factors have become a significant threat, and this is even more reason to implement technologies and policies like access management to reduce exposure, discovery & endpoint protection to catalog and protect data, and access logging & data loss prevention to control and monitor use of data.

Thursday, March 17, 2011

Cloud Computing and Data Loss Prevention Implementation

I have been studying the Cloud Security Alliance's Security Guidance and other resources for the past few weeks with a focus on placement of Data Loss Prevention (DLP) capabilities.

At this time, it seems that the typical position for DLP in public clouds is alongside other resources in private or public Infrastructure as a Service (IaaS).  Data-in-motion DLP systems in a cloud can be positioned logically adjacent to servers deployed in the cloud to monitor and protect information on those servers.  Data-at-rest and data-in-use DLP agents can be deployed on servers in the cloud to catalog and protect data on those servers.  However, there is nothing significantly new or better in these DLP implementation approaches than is currently available for traditional servers.

What would be useful in cloud implementations is an API or specification to allow DLP interaction in all three major service models: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).  VMware's vShield family of products looks like a step in this direction for the IaaS model: the vShield Endpoint product looks like it could potentially enable data-at-rest DLP, but the network-oriented vShield products do not appear to provide direct access to network data streams to enable data-in-motion DLP.

I am looking forward to engaging with cloud computing vendors to see if we can create a generalized specification for access to cloud systems for DLP management and remove some of the fuzzy haze enveloping the data.

Updated:  Christopher Hoff blogged about the lack of security functionality in cloud services (IDS/IDP, WAF, DLP, etc) about a year and a half ago.  Do we dare hope that cloud providers are becoming any more interested in security than they were then?

Tuesday, March 15, 2011

My PC in the Cloud

Something I have been thinking about for a couple of years now is when or whether I'll be able to have everything I have today on my personal PC in the cloud.

My vision is to have all of my applications and data instantly available no matter where I go.  I'm not talking about GotoMyPC or Back To My Mac.  I want my computer hosted in a reliable cloud so I don't have to manage my own backups, worry about my own hardware (and its failures), manage my RAM and hard disk space dynamically, and have snapshots.  Basically, I want all the benefits of the cloud that I can get for corporate infrastructure, but on a personal / family basis.  I want the ease of access from thin clients (netbooks, tablets, and anything else) from anywhere on the Internet.

I realize that applications like video editing may not work so well in a cloud environment, and now that I have made the switch to Mac, there are licensing issues as well for the operating system I would like to use.  I know many of the applications I use on my PC are available as SaaS offerings, but I don't want all of my various forms of data locked up in a variety of SaaS offerings hosted by a wide variety of companies, making it difficult to manage all the different access controls and account for all the different kinds of data stored in different places.

Security and reliability would become major factors for me if my PC were in the cloud.  I depend on firewalls, access controls, and strong passwords to secure my personal systems.  In the cloud, I would also need the assurance of privacy, encryption, audit trail, jurisdiction control, portability (if I were to need to change providers), and data lifecycle management -- everything about which I have concerns from a business perspective, but am even more worried about when it comes to my personal data.

Saturday, March 12, 2011

Virtualization and Security

As I am planning to lead my MIS445 class into virtualization and cloud computing in the coming weeks, I am pulling together a list of the fundamental technologies that have come together to enable cloud computing:
  • High-Speed Internet
  • Public Key Cryptography
  • Commodity multi-core CPUs
  • Storage-Area Networks
  • High-Performance Virtual Machine Hypervisors
  • Virtual LANs
  • Virtual Private Networks
Is there anything that you would add to the list?

Monday, March 7, 2011

101 Ways to Pwn a Network: DHCP

The past few class periods, as I have been teaching my MIS445 Networks & Security class about TCP, IP, DHCP, DNS, and routing, I have been digging into some of the threats at these different levels of the network.

Of course, I have mentioned things like DNS cache attacks, but the attack that really generated a lot of discussion was rogue DHCP servers.  This attack requires insider access to a network (not hard on an unsecured wireless network or an open university network), but it really makes life difficult for the network administrator when these things pop up.

More often than not, rogue DHCP servers are not maliciously placed in a network.  But what about a DHCP server that is maliciously added to a network to pwn all the outbound traffic?  Devices like Pwn Plug with the addition of a DHCP server and a passive traffic capture capability would be a heck of a way to listen in on interesting conversations in a network.

When talking to my class, I mentioned tools that can help find rogue DHCP servers.  I have heard of dhcpfind and dhcpexplorer, but I haven't used those tools before.   It would be nice to find a tool that runs under Linux or MacOS.
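
In the meantime, a rough Scapy sketch in Python (needs root privileges; the interface name is an assumption) can do the basic job on Linux or Mac OS X -- broadcast a DHCPDISCOVER and list every server that answers:

```python
# A rough sketch using Scapy to find DHCP servers on the local segment:
# broadcast a DHCPDISCOVER and print every server that answers.
# Needs root privileges and the scapy package; "eth0" is a placeholder.
from scapy.all import Ether, IP, UDP, BOOTP, DHCP, srp, get_if_hwaddr, conf

def find_dhcp_servers(iface="eth0", timeout=5):
    conf.checkIPaddr = False            # offers come back from the server's IP
    hw = get_if_hwaddr(iface)
    discover = (
        Ether(src=hw, dst="ff:ff:ff:ff:ff:ff")
        / IP(src="0.0.0.0", dst="255.255.255.255")
        / UDP(sport=68, dport=67)
        / BOOTP(chaddr=bytes.fromhex(hw.replace(":", "")), xid=0x12345678)
        / DHCP(options=[("message-type", "discover"), "end"])
    )
    answered, _ = srp(discover, iface=iface, timeout=timeout,
                      multi=True, verbose=False)
    for _, reply in answered:
        print("DHCP server:", reply[IP].src, "offered", reply[BOOTP].yiaddr)

if __name__ == "__main__":
    find_dhcp_servers()
```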

Friday, March 4, 2011

Tagging Your Data

If you are protecting sensitive, unstructured data that doesn't follow well-defined formats like personal health information or personal financial information, a common approach for Data Loss Prevention (DLP) systems is to create fingerprints of the data and then check the fingerprints against outbound data or data stored on workstations and servers.

However, fingerprinting tends to require active effort on the part of individuals and administrators to make sure the appropriate data has always been fingerprinted.  These are steps the typical end user or administrator doesn't always have time to perform.

I have seen a couple of technologies that could help with this situation.  They present a simple pop-up dialog when a user saves or emails a document, quickly asking whether the data is sensitive.  If so, the document or email is tagged with an appropriate label or watermark, and subsequent use or transmission of the document or email can be tracked by DLP systems.

I am intrigued by this approach for a couple of reasons.  First, it helps keep data appropriately tagged (assuming a compliant user base, which works in the common case -- people generally want to do the right thing).  Second, it involves end users in the decisions about what data is sensitive, and helps keep users aware of the security implications of their work.

I am interested to get feedback on how others feel about this approach.

Wednesday, March 2, 2011

Losing Your Data in the Cloud

In a high-profile event Monday, Google Mail disappeared for a large number of users.  Google has since recovered the missing email from backups and blamed the issue on a software problem.

This is a great learning moment: if your data is stored in the cloud, does the provider adequately backup the data?  In this case, even though Google keeps multiple copies of data, the software problem caused all online copies of the data to be lost.  Fortunately for its users, Google had offline copies of the data and was able to restore from backups.

As a Google Mail user, I was not even aware of the policies and procedures Google has for managing the data in the Google Mail system.  I, like millions of others, signed up for Google Mail because it was a free service with lots of storage.  I have stayed with Google Mail because of the price and functionality, and it has been reliable for the many years I have used it.

The Cloud Security Alliance is documenting these due diligence and risk management issues as part of its best practices for evaluating cloud providers.  As we've just seen, it is important to evaluate and understand a cloud provider's security and reliability plans and practices before trusting important data to that provider.

Wednesday, February 23, 2011

Security and Cloud Computing

Two of the big take-aways from the RSA Conference last week:

1) Cloud computing (in all its forms) presents substantial new challenges to an organization's data security and risk management plans.  A speaker in one of the sessions made an interesting point (sorry, I don't have the speaker's name in my notes): organizationally, we've been through a similar sea change before.  When PCs invaded businesses roughly 25 years ago, data that had been carefully kept in a centralized computing infrastructure spread out onto personally-managed, unsecured personal computers.  Just as it was "easy" for employees to bring personal computers into an organization then, it is now "easy" for employees to sign up for cloud computing services and start storing protected information outside the organization's control.

We need better ways for organizations to know where in the cloud their information resides, who is putting data into the cloud, and who is accessing it -- and to manage the risk to that information.

2) Cloud computing offers very handy new ways to deliver security functionality to customers.  Web application firewalls, data loss prevention, email anti-virus and anti-spam, and other technologies provided as cloud services offer convenient new capabilities for customers, and new market opportunities for providers.

As a result, I think that delivering security functionality as cloud services will help make it easier to provide security for mobile devices, particularly laptops at this point.  I hope we can drive smart phones and tablets towards better security through cloud offerings as well.

Monday, February 21, 2011

Back from RSA 2011 Conference

I'm back from the RSA 2011 Conference.  What an incredible opportunity to meet and speak with others in the industry and find out what is happening across the entire spectrum of security needs, policy, and products.  Near term, I'm planning to read more of the information published by the Cloud Security Alliance and become more familiar with the state of security in cloud computing.

Personally, it was very interesting to hear luminaries including Whitfield Diffie, Bruce Schneier, Ron Rivest, Adi Shamir, Len Adleman, and Dickie George talk about the foundations of cryptography and how their work has enabled modern computing and information security, especially cloud computing.  Great stuff.

Friday, February 11, 2011

WikiLeaks and Business Data

With all the buzz around the exposure of significant amounts of confidential data on the WikiLeaks web site over the past few months, attention has turned to the role of Data Loss Prevention (DLP) in protecting information.

Especially for small and medium businesses, the focus is on giving employees access to everything they need to get work done.  Access security is baked into operating systems and networks with things like accounts, groups, and firewalls, but the facts for small and medium businesses are that 1) employees have to be generalists, so most employees have access to almost everything, 2) access management and monitoring get little, if any, attention, and 3) the emphasis is on getting the job done, and most employees have no idea of the exposures they create by using common tools (e.g., email) to transfer confidential information.

With all these limitations working against good protection of information, it's even more important for small to medium businesses to implement Data Loss Prevention systems.  DLP can help train employees to use better practices for protecting information by responding to well-intentioned but dangerous activities with "sorry, this was blocked" responses, and DLP can help prevent malicious exposures too.  All this can help avoid a "WikiLeaks" moment that can really harm a business.

Thursday, February 10, 2011

Java Security

When Java was new, one of the language's touted features was its safety: programs ran in its virtual machine inside a "sandbox" to thwart malicious code.  Running security-sensitive code under a trusted monitor is a useful idea, and it's an approach Adobe is adopting for its applications, which have been the target of attacks lately.

However good the sandbox approach is, Java has had quite a few exploitable vulnerabilities.  Brian Krebs has uncovered exploit packs available in the criminal underground that target Java.  For these exploit packs to be useful to attackers, there needs to be a large enough installed base for the exploits to be effective.  Java's ubiquity and its number of vulnerabilities, along with the lack of automatically-installed updates (and don't get me started about the updater always wanting to install a browser toolbar!), mean there is a wide base of computers on the Internet that can be successfully attacked.

A surprising denial-of-service vulnerability was brought to light this week in Java: parsing a particular floating-point value from string data (or even compiling that value in a Java program) sends the conversion routine into an endless loop.  Today, Oracle widely publicized the patch for the problem.  I hope this particular event brings the issue of keeping Java up-to-date to the fore so we can reduce the number of computers vulnerable to the "bad guys".

Wednesday, February 9, 2011

RSA 2011 Conference

I'm heading to the RSA 2011 Conference next week.  I'm planning to hit it hard with lots of sessions, meetings, and fun to learn more about protecting data, especially in cloud and mobile environments with emphasis towards small and medium enterprises that can really make great use of cloud and mobile offerings.

See you there!

Tuesday, February 8, 2011

Cloud Computing

The term "cloud computing" has meanings so wide-ranging that it is difficult to pin down.  It can mean Infrastructure as a Service (IaaS), like Amazon's cloud (public cloud) or a rack of VMware servers in a company's data center (private cloud).  It can mean Software as a Service (SaaS), like Google Mail and Google Docs services.  Then there are Platform as a Service (PaaS) offerings, such as easy-to-build websites such as GoDaddy or Network Solutions offerings.

Many of these offerings involve storing data or moving data outside of the protected domain of a company's internal network.  Even for data kept in an internal private cloud, security and compliance issues can be complicated by storage and transfer of data between systems that used to be physically separated and more "visible" to analysis by firewalls, intrusion detection (IDS/IPS) systems, and data loss prevention (DLP) systems.

I have worked with a number of companies deploying security solutions into private clouds, and am planning to teach my students about management and security issues in cloud computing this semester.  I am also researching putting security systems, such as DLP systems, into public clouds to provide Software as a Service offerings for easier accessibility and scalability.  As with the range of definitions for cloud computing offerings, the range of security issues involved in cloud computing can be overwhelming.

As I attend the RSA Conference 2011 next week, I plan to dig deeper into security, compliance, and legal issues in cloud computing.  It will be great to compare notes with others who are concentrating full-time on these cloud computing issues, and I plan to bring back lots of technical and operational guidance for both my students and the people I work with.

Monday, February 7, 2011

Firesheep

There's an extension for the Firefox web browser called Firesheep.  For those who install it, it allows passive capture of other users' web site cookies off the local network.  Why is it a big deal?

For anyone on an unencrypted WiFi network -- at a coffee shop, airport, or anywhere else -- it means a "bad guy" with Firesheep can easily steal their web site cookies and use them to access those private web sites.  Your Facebook, Google Mail, or other personal web accounts could be compromised.
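A quick way to see why a particular site is exposed (my own sketch, not a Firesheep feature): fetch the site over HTTPS and look for cookies that lack the Secure flag -- those are the cookies a browser will also send over plain HTTP, where a passive sniffer can grab them.  The host name below is a placeholder.

    # Sketch: list cookies a site sets without the Secure flag; such cookies can be
    # sent over plain HTTP and captured by a passive sniffer like Firesheep.
    import http.client

    def insecure_cookies(host):
        conn = http.client.HTTPSConnection(host, timeout=10)
        conn.request("GET", "/")
        resp = conn.getresponse()
        flagged = []
        for name, value in resp.getheaders():
            if name.lower() == "set-cookie" and "secure" not in value.lower():
                flagged.append(value.split(";", 1)[0])
        conn.close()
        return flagged

    print(insecure_cookies("www.example.com"))  # placeholder host name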

Solutions?

1. Only use WiFi networks encrypted with WPA or WPA2.  Usually, this involves using a WPA-PSK or WPA2-PSK password.
2. If you must use an unencrypted WiFi network, only access web sites over SSL (Secure Sockets Layer).  Some web sites don't provide SSL-enabled access for the entire session, though.
3. Use a VPN (such as through your corporate network) when working from a WiFi network.
4. Only use a wired network.

Friday, February 4, 2011

Intro

Welcome to Info Loss.  I'm Guy Helmer, CTO of Palisade Systems and lecturer at Iowa State University in the College of Business.

My professional focus is on keeping data and systems safe.  Over the past two decades I have researched and engineered information systems that get work done while keeping information secure.  I'm teaching students about software development and network systems at ISU using these same principles, and I'm building systems to help small and medium businesses protect their data.

I'll use this blog to discuss topics, big and small, relating to data protection as the world of computing and networks continues to grow and evolve.