Friday, April 29, 2011

The Bigger They Are...

Rumblings started a week ago as the Sony PlayStation Network went offline, and stayed offline.  I wasn't initially very concerned, but I was encouraged to look into it as more information became available, and I have become much more concerned.

From Sony's ominous note:
Although we are still investigating the details of this incident, we believe that an unauthorized person has obtained the following information that you provided: name, address (city, state, zip), country, email address, birthdate, PlayStation Network/Qriocity password and login, and handle/PSN online ID. It is also possible that your profile data, including purchase history and billing address (city, state, zip), and your PlayStation Network/Qriocity password security answers may have been obtained. If you have authorized a sub-account for your dependent, the same data with respect to your dependent may have been obtained. While there is no evidence at this time that credit card data was taken, we cannot rule out the possibility. [emphasis supplied]
If you will recall, I was concerned about the identity theft and social engineering dangers of the Epsilon data breach.  This breach is much more serious because of the scope of the information lost: everything necessary for successful identity theft, plus the potential for online identity takeover and even the possibility of credit card disclosure.  Reports have placed the record count between 70 and 80 million!

The quantity of confidential information involved here is stunning, and for an attacker to be able to extract this volume of data in a matter of days seems extreme.  It would seem prudent for a company with a database of this size and scope to be using database access monitoring and data loss prevention systems.  It will be interesting to find out whether Sony actually had essential business intelligence, monitoring, and policy enforcement systems in place.

Wednesday, April 27, 2011

Surprising Data Loss Vectors

With the 2011 Verizon Business Data Breach Investigation Report and breach after breach after breach recently in the news, you might be thinking that information security is all about malevolent actors right now.  The "black hats" seem to have become very good at targeting, infiltrating, and extracting valued data from desirable targets.

An article yesterday by Ellen Messmer at Network World spotlights another important issue in information security today: business partners sharing information insecurely.  When Lutheran Life Communities (LLC) installed a Palisade data loss prevention (DLP) system, it discovered that business partners were transmitting personal health information (PHI) to LLC insecurely.  LLC has chosen a practical response: warning business partners about the problem.

In my experience, this is not an isolated problem.  The DLP vendor community has long highlighted the "insider problem," where employees -- usually just trying to do their jobs -- follow poor business practices and frequently expose personal financial information (PFI) and/or personal health information, the two most highly regulated types of personally identifiable information (PII).  However, in numerous DLP installations I have observed inbound data containing PFI and PHI violations, such as unwary customers emailing credit card information to companies, unencrypted, to request purchases.  I have also seen medical facilities where PHI was unexpectedly being transferred insecurely both in and out of the organization, just as LLC noted in Ellen's article.
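To make the inbound-violation scenario concrete, here is a minimal sketch of the kind of content inspection a DLP system applies to message bodies: find digit runs that look like payment card numbers, then confirm them with the Luhn checksum to cut false positives.  The function names and the regex are my own illustration, not any vendor's actual implementation.

```python
import re

# Candidate card numbers: 13-16 digits, optionally separated by spaces/dashes.
CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(digits):
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text):
    """Flag substrings that look like valid payment card numbers."""
    hits = []
    for match in CANDIDATE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_ok(digits):
            hits.append(digits)
    return hits
```

Run against an unencrypted purchase-request email, this would flag the card number while ignoring ordinary digit runs like phone numbers.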

We have become aware of the risks of data loss.  Governments have begun enforcing data protection requirements.  We have developed policies and tools that have significantly raised the standards for protecting confidential information.  Let's put these tools and policies to good use.

Tuesday, April 19, 2011

Verizon Data Breach Investigations Report (DBIR) 2011

The Verizon Data Breach Investigations Report (DBIR) 2011 is hot off the virtual presses!  Rich Mogull has a great first-pass analysis of the DBIR here.

The data shows striking year-over-year variance: the number of lost records has plummeted, but the number of breaches is growing.  Attacks on small and medium businesses are rising, and it appears criminals are focusing on quality rather than quantity of data.  Cyber crime has become a serious business, and criminals are following the money along the paths of least resistance.

I attended Brian Sartin's Verizon Business 2010 Data Breach Report session at RSA 2011, and some of his key insights were:
  • Crimes are becoming commoditized and repeated
  • The number of records taken by criminals has dropped year-over-year since 2008 -- the records that ARE taken appear to be more targeted and valuable!
  • In 2010, internal agents involved in breaches jumped significantly -- including recently-terminated employees
  • 90% of cases involved data stored in places management was unaware of (e.g., unmanaged servers) -- which speaks to the need for DLP discovery and endpoint coverage

I think most of these trends have continued in 2011, so I'm curious to see what the DBIR has to say.


Tuesday, April 12, 2011

Barracuda Data Breach

Barracuda Networks, a computer security company whose ads you can't miss if you ever visit an airport, fell victim to a security breach over the weekend.  Barracuda has plenty of company in the computer security industry -- RSA and Comodo were also recent victims of security breaches.

Of course, people have been quick to excoriate security companies for security failures.  Alan Shimel and Bill Brenner have written good articles about the folly of thinking that this couldn't happen to your company.  Given the number of significant breaches in the past few months, security companies may well be prime targets for attack right now.

Something to commend about the recent breaches: companies have been fairly responsible in reporting what has happened.  It can't be pleasant to announce a breach, but it is important to own up to what happened, and we can all learn lessons from what was vulnerable and how vulnerabilities were exploited.

Something else good: companies have been able to determine what happened and how using data from their monitoring and logging systems.

I hope that one of the lessons we learn from these breaches is to layer security technologies and compartmentalize subsystems so that failure of any one point does not result in exposure of the entire system.  Unfortunately, today we often have such complex systems that it is hard to make sure we have sufficient layers to manage the risks.

Wednesday, April 6, 2011

Chokepoints in a Network

A recent post on the Firemon blog got me thinking again about the arguments for and against firewalls.  The rise of public cloud computing (IaaS and PaaS) has changed the situation -- a strong firewall can't always sit in front of every single server, which seems to align with what I know of the Jericho Forum's positions on network security.  I still like firewalls as a tool where possible, and here is why.

When implementing servers, even systems that do not face public networks, one of the hardening steps I like to take is to implement as much access control and monitoring as I can.  Among other things, I enable on-host packet filtering so that only necessary network services are exposed, allow only certain user groups to authenticate to the system, enable logging and monitoring, and disable unnecessary services.  This is good security posture at the individual host level, but it is only one or two layers of the security onion.
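The "only necessary services are exposed" check lends itself to a simple audit.  Here is a minimal sketch; the allowed-port list is hypothetical, and a real audit would pull the observed listeners from `netstat`/`ss` output rather than take them as an argument.

```python
# Approved TCP listeners for this (hypothetical) host:
# SSH for administrators and HTTPS for the application.
ALLOWED_TCP_PORTS = {22, 443}

def unexpected_listeners(observed_ports):
    """Return the set of listening ports not on the approved list."""
    return set(observed_ports) - ALLOWED_TCP_PORTS

# A host found listening on 22, 443, and 3306 would be flagged for
# the stray database listener:
# unexpected_listeners([22, 443, 3306]) -> {3306}
```

Running a check like this periodically turns the hardening step into an ongoing monitoring control rather than a one-time configuration.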

Implementing firewalls and DMZ areas in a network enforces security boundaries and forces network designers to think about the vulnerabilities and security profiles of the different systems in a datacenter.  By their nature, firewalls create chokepoints in a network architecture.  Systems with different services and security profiles ought to be isolated within an organization's network for better control, monitoring, and management.
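The chokepoint idea can be sketched as a default-deny policy table: model the zones and the flows each firewall boundary permits, and reject anything not explicitly allowed.  The zone names, ports, and rules below are invented purely for illustration.

```python
# Flows a firewall chokepoint permits between zones: (src, dst) -> ports.
# Anything not listed is denied by default.
ALLOWED_FLOWS = {
    ("internet", "dmz"):      {443},   # public HTTPS into the DMZ web tier
    ("dmz",      "app"):      {8443},  # DMZ web tier to the app tier
    ("app",      "database"): {5432},  # app tier to the database tier
}

def flow_permitted(src_zone, dst_zone, port):
    """Default-deny check: a flow passes only if a rule names it."""
    return port in ALLOWED_FLOWS.get((src_zone, dst_zone), set())
```

Note what the model makes obvious: there is no rule from "internet" to "database" at all, so a compromised public-facing host still has to traverse another chokepoint to reach the data.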

Tuesday, April 5, 2011

Epsilon Data Breach

There have been a number of data loss events in the past month, but the Epsilon data breach disclosed over the weekend has been the most interesting.

Epsilon manages email-based marketing services for a number of large companies, so it had name and email address information for customers of the client companies.  This information was obtained by attackers.  While some have said the nature of the information means the breach is not significant, my immediate response to my peeps on Facebook was:
Yow -- this could enable some serious spear-phishing in the future :-(
Whoever has this information from Epsilon could simply use it for targeted spam.  More troubling, the attacker could spend some time working over the data with tools like MapReduce, mining customer profiles to enable very targeted phishing email attacks.
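To see why even "just" names and email addresses are dangerous, consider a toy MapReduce-style pass over leaked records: group them by the brand each address was registered with, and you have ready-made target lists for brand-specific phishing.  The records and field layout below are invented for the example.

```python
from collections import defaultdict

# Invented sample records: (customer name, email, client brand).
records = [
    ("Alice Example", "alice@example.com", "BigBank"),
    ("Bob Sample",    "bob@sample.net",    "BigBank"),
    ("Carol Demo",    "carol@demo.org",    "MegaRetail"),
]

def group_by_brand(rows):
    """Map each record to its brand key, then reduce into per-brand lists."""
    groups = defaultdict(list)
    for name, email, brand in rows:          # "map" step: emit (brand, record)
        groups[brand].append((name, email))  # "reduce" step: collect by key
    return dict(groups)
```

Every entry in the "BigBank" bucket is someone who demonstrably banks there and can be greeted by name -- exactly the ingredients of a convincing spear-phishing message.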

In what I would call a good, proactive response, Epsilon and its clients have been very quick to contact affected customers about the issue and let people know about the dangers of the information leak.  If there are any positive results from this breach, they should be improved security awareness among average consumers and companies thinking even more seriously about data loss prevention and database access monitoring.

Friday, April 1, 2011

From Clusters to Clouds

Over a decade ago, I was researching cluster computing -- connecting a large number of commodity computers with a high-speed network interconnect to work together on a single, large task.  The environment was fundamentally similar to what we see in cloud-computing data centers today: large numbers of commodity computers tied together with high-speed networks to provide a service.

At the time, I wondered what security issues would arise in clusters.  Since a cluster was typically dedicated to a single user's task back in the day, it wasn't clear to me what access control and data security would be necessary beyond making sure the assigned user was the only one with access to the cluster at a time.  Once schedulers arrived to divide clusters so that multiple jobs could run at once, I was no longer involved in cluster computing and didn't see how security evolved as clusters became grids and clouds.

Today, the security issues are quite visible in what has become cloud computing.  Using an outside vendor's infrastructure (cloud) has become compelling for a number of reasons -- scalability, elasticity, capital cost reductions, and more.  Visibility, compliance, jurisdiction, patch management, and other issues have become prominent in such a shared environment.

Looking back, as mainframes grew from single-user, batch processing systems to multiuser, timesharing systems, mainframe operating systems gained controls and visibility to enable secure multiuser operation.  Nowadays, groups like the Cloud Security Alliance are driving to improve the security and viability of cloud computing.