“Just Internal” is No Longer the Case. What organizations need to do to protect themselves given the CURES Act.


By Mitch Parker, CISO, Indiana University Health

The CURES Act Final Rule’s provision requiring healthcare providers to give access to the health information in their electronic medical records without delay takes effect on April 5. This means that patients will be able to use their application of choice to access their medical records and store them on their devices. It also means that providers will have to open secured Application Programming Interfaces (APIs) for those applications to access this data. There are security requirements around these APIs, chiefly the use of OAuth 2.0 authorization and Transport Layer Security (TLS) version 1.2 or greater. These requirements help ensure that the transfer of data to our patients is secured.
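As a rough illustration of what those API requirements look like from the client side, the sketch below uses only Python's standard library. The endpoint, token, and `fetch_patient` helper are placeholders invented for this example, not any vendor's real interface; the point is the OAuth 2.0 bearer token and a connection that refuses anything older than TLS 1.2.

```python
import json
import ssl
import urllib.request

# Hypothetical FHIR endpoint and access token -- placeholders only.
FHIR_BASE = "https://fhir.example-hospital.org/api/FHIR/R4"
ACCESS_TOKEN = "oauth2-access-token-from-authorization-server"

# Enforce TLS 1.2 or greater, per the Final Rule's API provisions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

def fetch_patient(patient_id: str) -> dict:
    """GET a Patient resource using an OAuth 2.0 bearer token over TLS 1.2+."""
    req = urllib.request.Request(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
    )
    with urllib.request.urlopen(req, context=context) as resp:
        return json.loads(resp.read())
```

A server that only offered TLS 1.0 or 1.1 would fail the handshake outright rather than fall back to a weaker channel.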

However, we also need to look internally to ensure that patient data is protected from its point of origin to its ultimate destination on our patients’ devices. Most healthcare systems have been running numerous technologies since before HIPAA and the Security Rule were published as a Final Rule in 2003. The Security Rule was drafted long before encryption was pervasive. 45 CFR 164.312(e)(1), Standard: Transmission security, requires us to implement technical security measures to guard against unauthorized access to Electronic Protected Health Information (ePHI) that is being transmitted over an electronic communications network.

What this part of the rule means is that we must ensure the confidentiality, integrity, and availability (CIA) of ePHI as it is transmitted over a network. The implementation specifications for Transmission Security were marked as addressable. According to the Federal Register of February 20, 2003, addressable has three components:

  • If a given addressable implementation specification is determined to be reasonable and appropriate, the covered entity must implement it.

  • If a given addressable implementation specification is determined to be an inappropriate and/or unreasonable security measure for the covered entity, the covered entity may implement an alternative measure that accomplishes the same end as the addressable implementation specification.

  • A covered entity may decide that a given implementation specification is simply not applicable and that the standard can be met without the implementation of an alternative measure in place of the addressable implementation specification.

If the measure cannot be implemented, the decision and the risk mitigation steps taken must be documented. When the Security Rule was written, the use of secure channels was novel. While e-commerce sites used them, they were prohibitively expensive to set up. Malware that exfiltrated patient data was also not prevalent. It was not difficult to argue that implementing encryption was not reasonable and appropriate, and that keeping data on an internal network was an acceptable alternative.

The Health Information Technology for Economic and Clinical Health (HITECH) Act, passed as part of the American Recovery and Reinvestment Act (ARRA) of 2009, expanded HIPAA’s compliance requirements. According to Entrust, it requires the disclosure of data breaches of “unprotected” (that is, unencrypted) personal health records, including breaches by business associates, vendors, and related entities. One interpretation of the HITECH Act has been that if the data could be seen by others between its source and destination, it could be considered breached. Another is that if the data at rest is not protected using encryption algorithms approved by NIST, it can also be considered breached.

Today, we have three practical alternatives. Microsoft and other vendors have made secure channels pervasive and offer only secure versions of their Application Programming Interfaces by default. Secure Shell (SSH) is also now pervasive and ships by default in Windows 10; you can use it to “tunnel” insecure traffic over an encrypted channel. Virtual Private Networks (VPNs) are also common and can be configured easily, even within your own virtual environments.
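As a sketch of the SSH tunneling option, the snippet below builds an ssh(1) local-forward command for a legacy plaintext feed, here an HL7 listener on port 2575. The host name, service account, and `build_tunnel_command` helper are hypothetical, invented purely for illustration.

```python
import subprocess

# Hypothetical names -- placeholders for a real legacy interface host.
LEGACY_HOST = "svc-account@legacy-hl7.internal.example.org"
LOCAL_PORT = 2575   # where the sending application will now connect
REMOTE_PORT = 2575  # the legacy listener's plaintext port

def build_tunnel_command() -> list:
    """Build an ssh(1) local-forward command. Traffic sent to
    127.0.0.1:2575 travels inside the encrypted SSH channel and is
    delivered to the listener on the legacy host itself."""
    return [
        "ssh", "-N",                       # forward only, no remote shell
        "-o", "ExitOnForwardFailure=yes",  # fail loudly if the forward breaks
        "-L", f"127.0.0.1:{LOCAL_PORT}:localhost:{REMOTE_PORT}",
        LEGACY_HOST,
    ]

# subprocess.Popen(build_tunnel_command()) would start the tunnel;
# the sending application is then pointed at 127.0.0.1:2575 instead
# of the legacy host's plaintext port.
```

The legacy application itself is untouched; only its network path changes, which is what makes tunneling attractive as an interim control while those systems are upgraded.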

Legacy file transfer mechanisms such as older Health Level 7 (HL7) implementations, File Transfer Protocol (FTP), Server Message Block (SMB), Network File System (NFS), internal e-mail, and older versions of Secure Shell do not adequately protect patient data. These protocols do not protect the confidentiality, integrity, or availability of data. They also often force organizations to run older applications and services that cannot be configured to support newer standards such as FHIR. The old adage of “if it ain’t broke, don’t fix it” does not apply to the protection of patient data from threats to confidentiality, integrity, or availability. Upgrading these transfer mechanisms, applications, and services will put you in a better position to provide customers what they need as part of the CURES Act.
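As one hedged example of such an upgrade, a plaintext FTP drop can often be replaced with FTPS (FTP over explicit TLS) using Python's standard library alone. The host name and `upload_result_file` helper below are placeholders, not a specific product's interface; the sketch assumes the receiving server has been configured to accept TLS.

```python
import ssl
from ftplib import FTP_TLS

# Hypothetical interface server -- placeholder name only.
HOST = "interfaces.internal.example.org"

# Require TLS 1.2 or greater on the connection.
tls_context = ssl.create_default_context()
tls_context.minimum_version = ssl.TLSVersion.TLSv1_2

def upload_result_file(path: str, user: str, password: str) -> None:
    """Replace a plaintext FTP drop with FTPS (explicit TLS)."""
    with FTP_TLS(context=tls_context) as ftps:
        ftps.connect(HOST, 21)
        ftps.login(user, password)  # credentials now sent over TLS
        ftps.prot_p()               # encrypt the data channel too
        with open(path, "rb") as f:
            ftps.storbinary(f"STOR {path}", f)
```

Note the `prot_p()` call: with FTPS, securing the control channel alone still leaves file contents in the clear, so the data channel must be explicitly protected as well.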

What you and your organization need to do is look at any unencrypted traffic to and from those legacy systems and find a way to encrypt it in transit using one of these three methods. Either you or your vendors need to address this for your applications. Given the environment that we are in, the consequences under HITECH for having unencrypted information flying around, and the implicit encryption and security requirements in the CURES Act Final Rule, it’s time to revisit what’s unencrypted on our internal networks. “Just internal” is just no longer the case.
