Tom Olzak

Posts Tagged ‘ePHI’

Yes, sensitive data on QA and Development servers is still sensitive

In Access Controls, Business Continuity, Data Security, Network Security, Security Management on August 18, 2009 at 11:48

Any organization with an effective software development lifecycle (SDLC) builds QA and development environments to test new or upgraded systems.  Testing, whether unit (developer) or user acceptance (UAT), requires data that closely resembles production data, including all of its dependencies.  The fastest way to make this happen is to copy production data into the test and development databases.  However, the perceived sensitivity of data in these non-production environments is often… well… wrong.

I like to practice data-centric security.  This means security controls focus on protecting sensitive data and on how critical systems access that data.  So if someone moves a customer database, for example, to a development server, the data should be protected with the same controls used to protect it in production.  Organizations often take a system-centric approach instead, assuming that servers, workstations, and data outside the production environment don’t require the same level of trustworthiness.

Research commissioned by enterprise applications vendor Micro Focus and carried out by the Ponemon Institute surveyed 1,350 application development staff at UK and US firms with turnover between $10m (£6.1m) and $20bn-plus.

The past 12 months have seen data breaches at 79 per cent of respondents, with the same amount using live production data in application development and testing. But just 30 per cent of firms mask this data during the process.

Application testing takes place on at least a weekly basis at 64 per cent of companies, with 90 per cent claiming it happens once a month or more. A mere seven per cent of respondents said data protection procedures were more rigorous during development and testing than during normal production.

Source: Lax data masking hits four in five firms, Sam Trendall, CRN, 18 August 2009

Granted, the purpose of the study was ostensibly to promote a data masking solution.  But it demonstrates the need for better focus on non-production data stores.  In other words, data in QA and development systems must be managed with the same rigor as that residing in production.  And if extending security controls to these systems is not feasible, then data masking is necessary.
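For teams that do need production-shaped data in QA, even a simple masking pass over sensitive columns beats copying rows verbatim.  The sketch below illustrates the idea in Python; the column names and masking rules are hypothetical, and a real deployment would use a purpose-built masking tool rather than this:

```python
import hashlib
import random

# A minimal data-masking sketch: before production rows are loaded into a
# QA or development database, replace sensitive fields with fake values.
# Column names ("ssn", "name", "phone") are illustrative assumptions.

def mask_row(row: dict) -> dict:
    """Return a copy of `row` with sensitive fields masked."""
    masked = dict(row)
    # Deterministic pseudonym: the same input always maps to the same token,
    # so referential integrity across tables is preserved in the test data.
    token = hashlib.sha256(row["ssn"].encode()).hexdigest()
    masked["ssn"] = f"XXX-XX-{token[:4]}"
    masked["name"] = f"Customer-{token[:6]}"
    # Preserve the format but randomize digits where cross-table
    # consistency is not required.
    masked["phone"] = "555-" + "".join(random.choice("0123456789") for _ in range(4))
    return masked

prod_row = {"id": 42, "name": "Jane Doe", "ssn": "123-45-6789", "phone": "614-555-0123"}
test_row = mask_row(prod_row)
```

Deriving pseudonyms from a hash of the original value keeps masked keys consistent across tables, so joins in the test environment still behave the way they do in production.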

Blame the auditors: What a concept!

In Business Continuity, Data Security, Network Security, PCI DSS, Risk Management, Security Management on August 13, 2009 at 08:02

I had never thought of this.  After a breach, just blame the auditors.  Wait.  The reason I hadn’t thought of it is because passing a compliance audit IS NOT ASSURANCE OF SECURITY.  But some still don’t get it.

In an interview with CSO’s Bill Brenner, Heartland Payment Systems’ CEO, Robert Carr, blamed his QSA auditors for a recent (huge) breach.  Because they said his organization was PCI compliant, he felt secure.  Wow.  Security by checklist once again.

Rich Mogull, in an open letter to Carr, makes several excellent points about reliance on compliance instead of solid security practices.  He concludes his letter with,

But, based on your prior public statements and this interview, you appear to be shifting the blame to the card companies, your QSA, and the PCI Council. From what’s been released, your organization was breached using known attack techniques that were preventable using well-understood security controls.

As the senior corporate officer for Heartland, that responsibility was yours.

Source: An Open Letter to Robert Carr, CEO of Heartland Payment Systems, Rich Mogull, 12 August 2009

Rich’s letter is a good read, and it should be circulated widely among security professionals and senior executives. 

Among other things, this is another case where an organization is falling back on a completed checklist representing compliance with the PCI standard, a bare minimum set of security requirements.  But whether you are HIPAA, GLBA, or PCI compliant, checking off on recommended practices doesn’t equal security.

Each of us is responsible for placing compliance activities within the proper context: guidelines within a broader security program.  No regulatory or industry standards can protect our critical infrastructure or sensitive data.  Only an aware, thinking human who actually cares about security—and understands how standards apply within his or her unique environment—can do that.

Send secure email free, including attachments

In Data Security, Email on July 7, 2009 at 18:48

The other day (or once upon a time, whatever), I tried to use Gmail to send an attachment encrypted with SecureZIP.  I was quickly reminded by the Google email service that it didn’t allow encrypted attachments.  So I tried our restaurant’s Yahoo mailbox.  Same result.  I needed to send a secure attachment, and I didn’t want to sign up for a for-fee service to do so.  So I searched the Web for a free secure mail service.  I found two which show promise: Lockbin.com and SendInc.com.

Lockbin.com was simple to use.  After accepting the user agreement and entering a CAPTCHA string, I was presented with the text entry form shown below.  Since the connection established with the site was encrypted (HTTPS), anything I entered and sent was safe from unauthorized sets of eyes.

Lockbin Text Entry

I entered a short test message and clicked Continue.  The next window (below) prompted for a password to lock the message until picked up by the recipient.  The password, or “Secret Word,” has to be sent to the person receiving the message via standard email, phone call, text message, etc.  I entered a password and clicked Continue.

linkbinword 

Finally, I was prompted for my name, my email address, and the recipient’s email address.  I was also shown how the alert message would look when it arrived in the destination mailbox.  The text was not editable at this point.  When I clicked Continue again, the message was sent.

Since I had sent the test message to one of my addresses, an alert quickly appeared in my mailbox (shown below) letting me know I had a secure message to retrieve.  To read the message, I clicked the link as instructed.  This opened a secure session with Lockbin.com.  After entering the password I provided when I sent the message, I was shown the message text.  Simple, but not quite what I needed.

lockbinMail

There are two potential issues with Lockbin.  First, the sent email is deleted from the Lockbin server as soon as the recipient opens it.  If the person you correspond with doesn’t understand this, you might find yourself resending it.

Second, Lockbin doesn’t support attachments.  This is OK if what you want to share is a small list of private data.  However, I needed to send a complete document.  So on to SendInc.

Like Lockbin, SendInc is a free secure email service which requires no downloads.  But unlike the first solution, SendInc is a better fit for home office or small business use.

With SendInc, I can send up to twenty messages per day.  This would be a serious limitation for larger businesses, but it’s fine for my needs.  And although there is a send limit, I can receive an unlimited number of secure messages.  The best thing about SendInc, however, is that I can include attachments up to 10 MB.

With Lockbin, no account is necessary.  This is not true with SendInc, probably because of an eventual for-fee service for users who need more than 20 outgoing secure messages per day.  SendInc knew immediately after I entered my email address that I didn’t have an account, and I was presented with an account activation form.  Once the form was complete, I entered an activation code sent to me as the final step.  Now I was ready to enter the test message, as shown below.

SendMailEntry

After entering my test message and attaching a 5 MB Word document, I clicked the send button at the bottom of the form.  The email was immediately processed, and I received a notification in my Gmail account.  The following image shows the contents of the alert.

Sendincreceived

Again, I simply clicked the provided link to establish a secure session with SendInc.  However, the Gmail account I sent the message to was not registered with SendInc.  So I was required to activate an associated account with a form similar to the one I completed when activating the sending account, as shown in the following image. 

SendActivate

Once both accounts were activated, I was able to send and receive secure messages with them by supplying the relevant passwords.  Once processed, messages are not retained by SendInc.

Both of these solutions work as advertised.  Neither is perfect, and I wouldn’t use them to share national defense secrets.  But I don’t deal with national security issues.  For quick messages without an attachment, Lockbin is certainly easier to use.  For attachments, there is always SendInc.

Beware Regulatory Hysteria

In Data Security, Government, HIPAA, Policies and Processes, Privacy on June 13, 2009 at 09:18

Regulatory Hysteria: Knee-jerk overreaction to new regulations, often placing individual privacy at risk.

For years, since before HIPAA and SOX, organizations have often overreacted to government mandates.  Some of the blame falls on accountants and security consultants who don’t understand the law, are trying to make a few extra bucks, or are simply covering their own butts. In other cases, organizations simply suffer from what I call regulatory hysteria.  Whatever the reason, overreacting to regulatory requirements can sometimes put customers and employees at greater risk.

Sherri Davidoff writes about a recent incident in which she appears to have been personally involved.  The post, located at philosecurity.org, describes the effects of FACTA and its Red Flag Rules on patient privacy.

Sherri was apparently confronted with a notice of a new requirement to produce a photo ID when she visited her doctor.  Since she didn’t have one, the office staff wouldn’t process her for her appointment.  While she stood there, Sherri observed staff scanning patient driver’s licenses for filing in their computer system.  Sherri was upset that she was inconvenienced and about her doctor demanding additional personal information.  Was she justified?  Maybe.

First, the Red Flag Rules are designed to protect us from criminals who seek to steal our identities for financial gain, including using our health insurance.  Health insurance theft is a big and growing problem.  The rules also help ensure someone can’t receive care under your name and have those results placed in your records, with the possible result of you receiving harmful care based on invalid assumptions about your health.  They are a good idea, and Sherri should simply get a photo ID—although there are other ways to verify identity, and the doctor might try to be a little more flexible.

Scanning of licenses or other photo IDs, however, is another matter.  There is no requirement to scan and store proof of identity.  The requirement is to demonstrate documented processes to:

  • Verify a potential patient’s identity
  • Report possible identity theft

This particular case looks like butt-covering rather than reasonable and appropriate compliance with the law.  And even if Sherri did produce a photo ID, how much effort is actually taken by the office staff to verify the ID itself?  What training did the staff receive to help them identify fraudulent documents?  Do they even compare the photo—I mean actually look at it—with the person standing in the reception window?  These are more important considerations than getting a scanned copy of a photo ID.  Finally, does the office staff simply accept verbal confirmation of identity for future visits once a scanned ID is in the system?  I hope their scanner is better than most, or picture quality will be close to worthless.

The other issue Sherri wrote about was her concern about the office potentially storing additional information about her in their computer system.  If the office is HIPAA compliant, and ePHI is protected in accordance with the security rule, this shouldn’t be an issue.  If it isn’t, Sherri has bigger problems than not having a photo ID or having an ID scanned.

My problem with Sherri’s visit is different from hers.  There is apparent compliance with the Red Flag Rules.  However, compliance extends far beyond a simple scan of an ID.  If the office manager simply uses the scans as evidence that an ID was produced without requiring trained employees to follow an actual identity verification process, then there is no compliance—just the appearance of compliance.  I think Sherri should be more concerned with how the office staff verifies her identity during each visit, and whether they are actually compliant with the HIPAA security rule, than whether they require a photo ID.

System physical security should include mobile device asset management

In Access Controls, HIPAA, Physical Security, Piracy Legislation on May 27, 2009 at 21:43

Some organizations spend a lot of time worrying about administrative (policy) and logical (application and system) access controls without much concern for physical security.  I don’t mean the kind of physical security where you make sure your data center is locked.  I mean the kind of security which allows you to track who has your resources and ensures your organization takes the right steps to quickly mitigate impact.

For example, it doesn’t make much sense to lock the data center when unencrypted, unmanaged mobile devices travel across the country.  The sensitive information stored safely in the data center might as well be in the lobby.  This might seem a basic principle, but many organizations still don’t get it.  Take the US Department of the Interior, for example.  According to a report completed last month by the department’s inspector general, Western Region,

…13 computers were missing and… nearly 20 percent of more than 2,500 computers sampled could not be specifically located.  Compounded by the Department’s lack of computer accountability, its absence of encryption requirements leaves the Department vulnerable to sensitive and personally identifiable information being lost, stolen, or misused.

Source: Evaluation of the Department of the Interior’s Accountability of Desktop and Laptop Computers and their Sensitive Data, U.S. Department of the Interior, Office of the Inspector General, 24 April 2009.

So the IG could verify the loss of 13 unencrypted computers, but about 500 more were simply unaccounted for.  The reason?  Several of the agencies within the department had no process to track computer inventory.  The following is from a related InformationWeek article:

Despite policies mandated by the Federal Information Systems Management Act and other regulations, including rules that say computers should not be left unattended in plain view and that organizations should establish policies to protect their systems from unauthorized access, the Department of the Interior doesn’t require that any hardware that costs less than $5,000 — that would cover most PCs — be tracked in an asset management system, and the current tracking system doesn’t have proper backing, according to the report.

Source: Department Of The Interior Can’t Locate Many PCs, J. Nicholas Hoover, InformationWeek, 27 April 2009

Most of us agree that encryption is a necessary part of any mobile device security strategy.  But why worry about tracking laptops?  Isn’t encryption enough to render the data on a lost or stolen laptop inaccessible?  Well, it depends.

Many organizations do not use strong passwords.  The reasons vary, including:

  • Users tend to write complex passwords down, leaving them easily accessible
  • Password reset calls constitute a high percentage of help desk calls, rising exponentially as password complexity increases

In other words, strong passwords are often seen as weaker and more costly to the business than simple passwords.  And password complexity tends to remain the same when an organization implements full disk encryption, raising concern about the real effectiveness of scrambling sensitive information.  The complexity of the password and the configuration of the login policy (i.e., history, failed login attempt, etc.) are factors in the strength of any encryption solution.  In any case, encryption solutions should be supplemented to some degree—depending on the organization—by a mobile device physical management process, including,

  • Mobile device assignment process which includes recording employee name and date of assignment
  • Clearly documented mobile device usage and protection policy signed by each employee before he or she receives a mobile device
  • Periodic, random verification that the assigned user still has physical control of the device
  • Strict employee termination process which includes receipt of assigned devices
  • Documented device end-of-life process, including
    • recording receipt of device
    • recording of device disposition, in accordance with the organization’s media sanitization and reuse policy
  • Tested and documented device loss process, including
    • process for reporting a mobile device lost or stolen
    • assessment of the probability of sensitive data breach and notification of affected individuals