Note: On April 14, 2001, the first bit of HIPAA regulation, the HIPAA Privacy Rule, became effective. It was not enforceable for 2 more years, and was followed by the Security Rule, HITECH, the Breach Notification Rule, etc., but the “s*#t got real” 20 years ago today (I started this blog on March 8, 2002). And so, today begins a series of 20 big blog posts celebrating and explaining 20 big ideas, facts, stories, or peculiarities about HIPAA. It’s an opportunity for me to pull back and highlight some major themes and lessons I’ve learned playing in this space for the last 21 years.
Chapter 1: “Who is that behind those Foster Grants (with the huge stack of copies of the Federal Register)?”
Facebook got sued under the Telephone Consumer Protection Act, which puts restrictions on spam callers (according to my cell phone, not nearly enough restrictions!!), for sending texts to members. But the court noted that Facebook used numbers it already knew from members it already had, not numbers that were randomly or sequentially generated. In other words, the TCPA rules don't apply if there's some sort of intelligence or customer-related information that determines who gets the call or text.
This helps providers who want to use patients' cell phone numbers already on file to send text reminders, so long as they don't charge for the texts, allow opt-out/unsubscribe, and don't use texting for billing, advertising, or marketing. That's nice, but the key is you must still comply with HIPAA, and there's a lot of good reasons to say that texting might not be HIPAA-compliant at all.
I don't think texting automatically violates HIPAA, but some patients do. Texting definitely isn't as secure as using encrypted email or a portal -- most people set their phones so they can see the name of the sender and the first line of the most recently received text before unlocking the phone. That means random people picking up the phone can see who sent the text and some of the content. Obviously, that's problematic.
If you're considering this, think about ways to limit your HIPAA exposure. Have the patients sign a consent anyway, making sure they understand the risks before agreeing to accept texts. Allow them to opt out at any time. Make the "sender" name as generic as possible, especially if the provider name is obviously connected to a particular disease. Make sure the first line or two is a generic greeting; the less PHI visible, the better.
There's more, obviously. Definitely something you want competent HIPAA counsel to help you with. So, text me, OK?
Legislation and regulations generally require certain behaviors; the threat of fines, jail time, and lawsuits is often enough to spur compliance. But sometimes, in order to obtain specific behavior over and above the minimum requirements, legislatures will give benefits in addition to penalties, adding a carrot along with the stick.
Utah has just done so with regard to companies that suffer a data breach. If the data holder creates, maintains, and complies with a reasonable cybersecurity program, including safeguards in a framework at an appropriate scale for the data holder, that can serve as a defense for a suit relating to a data breach.
Utah and Ohio now have such laws; I'd expect a few states (particularly red ones) to adopt similar legislation in the coming years.
Didn't I just say 17? How about number 18?
Access Case #18: OCR continues its string of settlement actions with covered entities that fail to give patients proper access to their medical records. Again, it's a covered entity that got in trouble for failing to give access, got "technical assistance" from OCR (meaning a complaint was filed, they were contacted by OCR and OCR told them how to fix the problem), but failed a second time to give access to the same patient. This time, the winning covered entity is Arbour Hospitals, a behavioral health provider in Massachusetts; Arbour's "prize" is a fine of $65,000.
As you probably know, on December 10 HHS proposed some HIPAA revisions (which I briefly noted) to clear up permissions for the use and sharing of PHI for population health issues, as well as some changes to the NoPP requirements (it wasn't published in the Federal Register until January 21, 2021). However, since December 10 was the Trump Administration and January 21 was the Biden Administration, the new administration put a hold on the regulations. Now, they have extended the comment period from its stated termination, March 22, to May 6. So, if you got comments, but are lazy, you now have some extra time.
On this date in 2002, I posted this item as the first post ever on HIPAAblog. True to form, my blogging laziness has been unsurpassed. Just how lazy a blogger am I? I am too lazy to stop.
That said, the next two months will mark the 20th anniversary of the publication of the first final version of the Privacy Rule. The true original was published December 28, 2000, in the dying days of the Clinton administration, but was included in the set of regulations suspended for 90 days by the Bush administration. The slightly revised regulations were published March 27, 2001, with an effective date of April 14, 2001 (but a 2-year enforcement moratorium until April 14, 2003). So, March/April 2021 really mark the 20th anniversary of the Era of HIPAA.
So, in honor of my 19 years of blogging, and HIPAA's 20th birthday, over the next 2 months I plan a series of 20 long(ish) posts highlighting HIPAA's history, progress, and mutation, its successes and failures, and its ongoing relevance. I haven't got all 20 posts in my head yet, so there might be some predictions, too (no promises, though).
So, keep checking in. As I've said often on this blog, more to come.
Better Late than Never: Sometime in 2017, a hacker got into Gore Medical Management's information systems and stole files containing PHI of 79,000 patients. They didn't find out until November 2020, when the FBI notified them it found the files on an unrelated computer. Two bits of good news, one bit of bad news: First, Gore had already discovered the technical issue and corrected it (apparently it just didn't know files had been taken). Second, the information did not contain any clinical information. However, it did contain social security numbers.
HHS officially announced yesterday that it will waive penalties for use of online scheduling tools for distributing Covid-19 vaccines. This is consistent with other actions during the pandemic, and really indicative of the way HIPAA works -- there are very few things that "you just can't do," because almost everything is a facts-and-circumstances analysis.
Annual breach reports due this week: If you are a HIPAA covered entity and suffered a "small" (<500 affected people) breach of unsecured PHI during 2020, you need to report the incident to OCR this week if you haven't done so already.
When a covered entity suffers a HIPAA data breach, the patient must be notified without unreasonable delay, and no later than 60 days. If the breach is big, involving 500 or more people, the covered entity must also notify OCR and major media in the area at the same time; if it's less than 500, only the patient needs to be notified immediately, and there's no requirement to notify the newspapers at all. OCR still needs to be notified, but the covered entity is required to notify OCR of all of its small breaches at the same time: during January or February of the next calendar year. The filing is pretty easy, it's mostly fill-in-the-blank and menu-driven choices. Thus, if you had any small breaches in 2020, you need to report them by the end of this week.
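The timeline rules above can be sketched as a toy decision function. To be clear, this is a simplified illustration and not legal advice: "without unreasonable delay" is modeled only as the 60-day outer limit, and the small-breach OCR deadline is modeled as 60 days after the calendar year ends (the January/February window described above). All the function and field names are mine, not anything official.

```python
from datetime import date, timedelta

def breach_notification_duties(discovered: date, affected: int) -> dict:
    """Toy sketch of the HIPAA Breach Notification Rule timelines.

    Simplifications (NOT legal advice): only the 60-day outer limit is
    modeled for patient notice, and the small-breach OCR report deadline
    is modeled as 60 days after the end of the calendar year.
    """
    duties = {
        # Affected individuals: without unreasonable delay, 60 days max.
        "notify_patients_by": discovered + timedelta(days=60),
    }
    if affected >= 500:
        # Large breach: OCR and major local media are on the same clock
        # as the patient notice.
        duties["notify_ocr_by"] = duties["notify_patients_by"]
        duties["notify_media_by"] = duties["notify_patients_by"]
    else:
        # Small breach: log it, then report to OCR within 60 days of
        # the end of the calendar year in which it was discovered.
        year_end = date(discovered.year, 12, 31)
        duties["notify_ocr_by"] = year_end + timedelta(days=60)
    return duties
```

For example, a 40-person breach discovered July 4, 2020 would owe patient notice by September 2, 2020, but the OCR report could wait until March 1, 2021; a 500-person breach discovered the same day would owe patient, OCR, and media notice all by September 2.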
Free ransomware protection: The program, which the Center for Internet Security has been offering to public hospitals, is now available for free to any US hospital. It's not an ironclad shield, but it does appear to block malicious domains that are often associated with phishing and other malware.
21st Century Cures Act impact on HIPAA documentation. The Cures Act imposes a lot of general rules designed to prevent information blocking. I just happen to be revising some standard HIPAA documentation (hint: if you're a member ot the Texas Medical Association and use the HIPAA forms provided by them, some slightly revised documents will be rolling out sometime in 2021), and thought it might be a good idea to point out that a couple of semi-hidden provisions of the Cures Act might trigger a good reason to revise some of your documents.
The underlying purpose of the Cures Act, for these purposes, is to prevent "information blocking." While HIPAA is about protecting PHI, it also allows (and sometimes requires) PHI to be shared when appropriate. Many EHR providers intentionally try to limit the ability of their EMRs to communicate with other EMRs (they want to put up hurdles to keep their customers from easily migrating to a competitor EMR), and some health care providers try to prevent patients from sending their PHI to other providers, who they consider competitors. That type of information blocking is the focus of recent rules from CMS and ONC.
There's an obvious tension between HIPAA's requirement to generally prevent uses and disclosures of PHI, and the Cures Act Rules prohibiting most activities that could be considered information blocking (data privacy is, by definition, information blocking). The ONC Cures Act Rule recognizes that nondisclosures prohibited by law (e.g., a general refusal to provide PHI to an unknown requestor due to HIPAA Privacy Rule prohibitions) are not information blocking, but the ONC Rule is careful to say that only applies to disclosures that are actually prohibited. Thus, if a provider withholds data merely because HIPAA permits it to do so, it will be in compliance with HIPAA, but could be in violation of the ONC data-blocking rule. It's tricky.
For health care providers, the general requirement is to not engage in activities that could be information blocking. At its most basic level, if a provider is granting patients access to their records in the manner required by HIPAA, it's unlikely they could be considered to be engaging in information blocking. But it's probably a good idea to make sure your documentation doesn't unintentionally commit you to activities that could be considered information blocking by a disgruntled patient.
Consider revising your BAA: Section 4006 of the Cures Act itself revised HITECH (which revised HIPAA), to include a requirement that might make you want to consider revising your standard form BAA. HITECH now says:
"If the individual makes a request to a business associate for access to, or a copy of, protected health information about the individual, or if an individual makes a request to a business associate to grant such access to, or transmit such copy directly to, a person or entity designated by the individual, a business associate may provide the individual with such access or copy, which may be in an electronic form, or grant or transmit such access or copy to such person or entity designated by the individual."
Due to this, you might consider amending the "Access" provision of your BAA to allow the business associate to make the disclosure of an individual's PHI directly to the individual or to the person indicated by the individual, if the individual approaches the business associate directly. Most BAAs simply require the business associate to provide the PHI to the covered entity upon request, and many require the business associate to communicate with the covered entity before providing the PHI to the patient. In fact, most business associates don't want to be responsible for making the decision about whether they should grant access to the patient. If you are a health care provider, you should consider revising your BAA to allow the business associate to make the disclosure directly, with a requirement that the business associate notify you if they have done so.
Consider revising your NoPP (all providers): The CMS and ONC Cures Act Rules prohibit covered entities from refusing to disclose PHI if doing so would be information blocking. In other words, if the covered entity is asked to disclose the information and refusing to do so is data blocking, then in fact the covered entity is now required by law (the Cures Act) to make the disclosure. While this might not have a real practical impact, you should consider revising the "required by law" section of your Notice of Privacy Practices to include a reference to disclosures required to avoid information blocking.
Consider revising your NoPP (Medicare/Medicaid hospitals): The CMS Cures Act Rule revises the Medicare/Medicaid Conditions of Participation (CoPs) for hospitals to require that the hospital automatically send electronic notifications upon a patient's admission to (including ER registration) or discharge or transfer from the hospital ("ADT Notice"). The ADT notice should be automatically sent to appropriate post-acute care service providers, as well as to the patient's primary care provider or group and any other provider designated by the patient. Since these notifications will happen automatically, and the patient might be surprised to hear that their primary care doctor (who maybe they didn't like that much anyway) found out they were admitted, or annoyed to get calls from post-acute providers seeking to provide the patient with services, it might be a good idea for hospitals to revise their NoPPs to warn the patients about these disclosures.
Food for thought.
Here are a couple of questions regarding a recent seminar I conducted for Lorman Education Services:
Q: The patient passes away, what do we need from family/life insurance policies in order to release records?
A: The answer depends on the specifics of state law, but the person who is "authorized to act on behalf of a deceased individual or the individual's estate" becomes the "personal representative," and has all rights the deceased person would have had if they were still alive. This is usually the executor or administrator of the estate, or the holder of letters testamentary. If a family member provides court papers that indicate he/she has been appointed executor, then the covered entity should treat that person as if he/she were the patient.
More enforcement discretion: HHS announces that OCR will exercise enforcement discretion with respect to providers who use on-line or web-based scheduling applications in good faith to schedule individual patients for the Covid vaccine. This is in line with earlier Covid-related regulatory relief. On-line and web-based scheduling platforms have been the source of some HIPAA breaches, and if they aren't set up right, can be problematic. Just like Zoom, FaceTime, and the like. But for the same reason, the benefits of speed, low contact, and easy accessibility in the time of Covid are worth the risks.
Interestingly, the same day that the Fifth Circuit kicked out a $4.3 million fine against MD Anderson, Excellus BC/BS in upstate NY agreed to a $5.1 million settlement with OCR. Granted, the Excellus breach was much, much bigger and lasted much longer, but it's still a little curious; I wonder if Excellus has "settler's remorse" now?
MD Anderson fought the law, and . . .
MD Anderson actually won. At least at the 5th Circuit. I'll want to read the opinion before I can predict whether OCR will appeal to the Supreme Court, but I think it's likely they will. So, keep in mind that I'm operating a little in the dark here, but would you like my initial take?
Here's the chronology:
Between 2011 and 2015, MD Anderson lost one laptop and two flash drives (actually, the laptop was stolen in a home burglary, and the flash drives were lost by an intern and a visiting research physician). The media contained research-related ePHI of 35,000 patients involved in Anderson's research projects. In 2006, Anderson had adopted policies requiring encryption of ePHI, but neither the laptop nor the flash drives were encrypted.
Anderson reported the incidents in 2012 and 2013, triggering an investigation by OCR. OCR stated that they tried to reach an informal resolution with Anderson over the course of their investigation, but were unable to do so. I don't have any inside detail, but it sounds like Anderson might've ignored or rebuffed OCR's outreach efforts, just as Children's Medical Center in Dallas did.
Since Anderson and OCR did not reach a settlement agreement, in March 2017, OCR issued a "Notice of Proposed Determination" in which it imposed a $4,348,000 fine for multiple HIPAA violations, including failure to encrypt (encryption itself is an addressable issue, not a required one, but given Anderson's 2006 policies, they internally addressed it and determined that it was necessary). Anderson challenged the proposed determination, which sent the matter to an Administrative Law Judge. Anderson's defenses included that encryption was not required (cf. their own policies), that the information was for research so not covered by HIPAA (it's still PHI, and Anderson is still a covered entity), that no known harm was determined to have come to any of the affected individuals (you still get a ticket even if your reckless driving doesn't cause any accidents), and that OCR lacked the authority to levy fines against state agencies (HIPAA specifically applies to Medicare and Medicaid, and OCR has fined plenty of governmental entities). They also argued that the fines were unreasonable (now, that's an argument I can buy). They later specifically argued that the fines violated the 8th Amendment to the Constitution, which specifically prohibits "excessive fines."
The ALJ upheld the penalty, in relatively harsh words, in June 2018. Anderson appealed inside the administrative law system, to the Departmental Appeals Board, which upheld the ALJ's award. Anderson also appealed to the federal court system, seeking a determination that OCR's fine was unreasonable and beyond the authority of OCR to impose. In April 2019, OCR issued guidance, and a Notification of Enforcement Discretion, indicating that it now believed that lower fine limits were applicable; Anderson appealed the DAB ruling to the Fifth Circuit, adding the fine limits to its arguments against the penalty.
In the HITECH Act, Congress authorized OCR to levy higher penalties; however, as with much of the language in the shoddily-drafted and hastily passed ARRA (also known as the Stimulus Bill [or "porkulus" if you're a deficit hawk], of which HITECH is a part), the penalty language is poorly drafted. While the Omnibus Rule (passed by Obama's HHS) included adoption of the apparent new higher limits, the Notice of Enforcement Discretion (passed by Trump's HHS) finally recognized this, and instituted a tiered system of penalties, based on culpability. While the Notice of Enforcement Discretion could be read as forward-looking only, its underlying rationale gave Anderson a good toe-hold to fight the fines against it (in my opinion, the only really good argument they had).
Ultimately, the Fifth Circuit determined that OCR's fine was "arbitrary, capricious, and contrary to law;" even OCR has acknowledged that it can no longer defend the portion of the fine in excess of $450,000, under the rationale in the Notice of Enforcement Discretion. The court did not rule on Anderson's argument that it is not a "person" under HIPAA because it is a state agency (if the court had sided with Anderson, that would've made an appeal to the Supreme Court by OCR much more likely).
Obviously, I'll chime back in once I read the actual ruling, if that changes any of the above.
The HIPAA Security Rule requires covered entities to adopt safeguards to protect PHI. To be specific, the Security Rule (mirroring general data privacy principles) requires covered entities to adopt three types of safeguards (administrative, physical, and technical) to protect three PHI qualities (confidentiality, integrity, and availability). Thus, a HIPAA covered entity must structure its operations so that PHI remains confidential, is not distorted, and is available when needed. If the covered entity uses a cloud provider to host its PHI and data operations, the covered entity must be sure the PHI (and its business operations with respect to the PHI) will be confidential, not manipulated, and available.
One of the biggest data security risks these days is ransomware. The primary problem with ransomware is that it impacts the availability of PHI (ransomware that exfiltrates data also hurts confidentiality, and the scrambling effect of ransomware technically is an integrity problem, but that's the least of your worries). When a basic ransomware attack occurs, it's a security risk because it prevents the covered entity from having access to its data, which prevents it from using the data in a way that helps the patient. It's an availability issue. That should seem obvious.
You are probably aware that last week there was trouble in DC blamed on Trump (I'm not going to get into a fight with you about whether Trump was to blame, nor whether this was a riot, an insurrection or a "mostly peaceful" protest that got out of hand), and Trump was kicked off Twitter. You may or may not be aware that Twitter is considered to be hostile to conservatives and solicitous of liberals (Ayatollah Khamenei's Twitter account remains active). You may or may not be aware that there are a couple of alternatives to Twitter, namely Parler and Gab, with Parler being the most preferred by conservatives angry with Twitter (full disclosure, I have both a Twitter and a Parler account, and may have a Gab account as well, I can't remember).
Importantly, Parler touts itself as a free speech site. It claims to take efforts to remove accounts that actually incite violence, but it does not regulate speech as heavily as Twitter or Facebook, much less does it clearly target conservative voices as Twitter and Facebook do. Thus, Parler has come into favor with conservative voices seeking an alternative to other outlets that certainly appear (if they are not actually) much more hostile to conservative viewpoints. As far as I know, nobody has accused Parler of actually promoting or espousing "bad" opinions (nor is there the least bit of evidence of that), just that Parler failed to police "bad" actors. Again, which may or may not be true.
What you may not be aware of is that Twitter, Amazon, and Apple appear to have initiated a concerted effort to knock Parler off the air. Specifically, Apple has kicked Parler out of the Apple app store (on the day that Parler was the most-downloaded app). But more relevant here, Amazon Web Services (AWS), the cloud hosting site run by Amazon where Parler's data was stored, kicked Parler off the system and locked the company out of its data. Amazon defended its decision to freeze out Parler, blaming Parler for abetting the trouble in DC. Parler is currently dark and non-operational, its business entirely halted while it tries to find another cloud provider to host it.
AWS has made a subjective value-based judgment that Parler is dangerous and should be shut down, because Parler is used by people that AWS deems to be dangerous. AWS has shut a large customer out of its operations because AWS does not approve of the customer's customers.
AWS could make the same determination regarding Parler's law firm. AWS could make the same determination regarding a law firm representing the people who post on Parler, who AWS has determined are so dangerous that Parler must be shut down for hosting them. AWS could make the same determination regarding a healthcare provider who provided care to those people. AWS could make the same determination regarding an insurer that offered health plans to those people.
It's not a far stretch to think that a healthcare system in a red state would be at risk of being shut out of AWS, because its patients are the types of people AWS associates with Parler. It's certainly not a stretch to think that AWS could shut down cloud access to a health plan for a gun manufacturer. Oil companies, Catholic charities, beef farmers, anyone not liberal is at risk.
"Oh, come on," you say, "these are odious people on Parler, all good people would agree they are terrible folks and deserve to be shunned." Well, wait until it happens to you. Once your vendors start making value judgments (and "picking sides," which is what they're doing), all bets are off.
There's no avoiding the obvious conclusion here: if you use AWS cloud services, you run the risk of AWS shutting you out of operations if AWS decides it does not like the patients or beneficiaries you serve.
Thus, as a HIPAA covered entity, you fail to ensure "availability" of PHI if you use AWS. HIPAA requires you to have reasonable safeguards to protect availability of PHI; if you are hosted by AWS and get shut out, your PHI is no longer available; it's not reasonable to not protect against that possibility.
Final result: using AWS may be a violation of HIPAA, because it poses an unreasonable risk to availability.
First OCR settlement of 2021 continues a trend started in 2020: fining a covered entity for failing to provide an individual with access to his/her medical records. HIPAA provides 6 primary rights to individuals, one of which is the right to access their PHI. This has been a focus for OCR, and we start 2021 with Banner Health paying $200,000 for failing to provide a patient with timely access to her PHI on 2 different occasions.
Keep in mind that the HHS and ONC data blocking rules provide a parallel obligation to provide patients with access to their PHI (at least for providers, and likely for plans that use electronic medical records as well). So, failing to provide access can get you in trouble a couple of different ways. Many of these access issues are process problems and not conscious efforts to keep data away from patients (although some might be punitive), but that won't matter when OCR decides to fine you.
Now's a good time to start looking at your medical records office and how seamlessly they get requested records out. Don't send out what you shouldn't, but don't hold back what you should be sending.
As I noted in early December, I had back surgery, and was out for a couple of weeks. Follow that up with the Christmas and New Years holidays, making for a couple of 3-day weeks, plus the fact that I've been less than optimal physically due to the back pain that precipitated the surgery, and I've fallen behind in my blogging duties. I'll try to catch up this month, and here's my first installment of "what I shoulda told you a couple of months ago." And don't worry, I'll report on the NPRM soon enough; suffice it to say, it's small potatoes, but if it becomes final, you'll have some paperwork to do.
Aetna Settles 3 breaches from 2017 for $1,000,000: These included (i) PHI-containing web services that were internet accessible without passwords or credentials; (ii) a mailing to HIV patients using window envelopes that allowed the words "HIV medication" to show through the window, and (iii) another mailing to research participants with the name and logo of the research study on the envelope. The resolution agreement and action plan, which includes an implementation report and 2 annual reports, are here.
Ransomware and other security incidents are on the rise, and disproportionately affecting healthcare entities: The FBI specifically warned of a wave of cyber-attacks directed at the healthcare industry. It's Russians using Ryuk. And the potential for such an attack being lethal was made clear in September when a German hospital suffered a hacking incident and a patient died as a result. All of this came out about the same time as news of the particular vulnerability of IoT-connected medical devices. As far as we know, other than in movies and spy novels, nobody's hacked the pacemaker or insulin pump of a corporate executive or politician and demanded ransom, but it's clearly possible.
PACS server vulnerabilities: I already discussed the PACS system issue, but if you want to read some inside baseball on this, here's a researcher discussing how he was able to access petabytes of x-ray, MRI, and CT images, without hacking, over the internet. It's not for the faint of heart -- he showed that not only could you view images and steal data, you could upload fake images to these PACS systems. Hat tip: Joel Lytle.
City of New Haven (CT) fined for failure to terminate former employee's access to PHI: The city's health department, which operates a clinic, didn't remove a former employee's credentials to access its medical records, and the employee snooped into about 500 files before being discovered. The PHI included STD test results (lovely). The city had not done a risk analysis (of course), and under the resolution agreement paid a fine of over $200,000.
That's enough for now, more later.
It's a relatively small fine ($36,000), but I suspect Elite Primary Care would rather keep the money than pay it. OCR recently issued its 13th enforcement action against a covered entity for failing to provide patients with access to their medical records. The provider is located in southeast Georgia. The Resolution Agreement is here.
OK, new HIPAA regs dropped tonight. Unfortunately, I'll be having surgery tomorrow, so it'll be sometime next week before you get my analysis. I'm sure it'll be worth the wait.
Of course, feel free to peruse the 357 pages of regs yourself; I'd be happy to hear your interpretation as well.
It looks like Kalispell Regional is trying to settle a class-action lawsuit against it related to a 2019 breach involving 130,000 patients. Hackers got in via phishing emails, and were in the system for months before the hospital noticed. 250 Social Security Numbers were stolen. The incident resulted in a suit by a victim, alleging that KRH failed to take reasonable steps to prevent the hack; the proposed settlement has a dollar amount of $2.4 million.
What makes this interesting is that class action lawsuits as the result of data breaches usually crash in flames. It's hard to prove damages, each victim is victimized in a somewhat different way and has different damages, and other factors make these tough for plaintiff's lawyers to cash in on.
But don't be fooled by the headline: This is just the establishment of a fund to potentially pay out up to that amount. The only things to be paid are actual provable damages (which are hard to find and prove), plus up to 5 hours of your own time (at $15/hour, so a max of $75) dealing with the mess. Ultimately, KRH will spend a lot less than $2.4 million.
Ransomware and Cybersecurity Risks are High During the Pandemic: Despite all the news and warnings, I'm not seeing a huge increase in ransomware attacks; maybe it's happening but I'm not seeing or hearing about it, maybe it's a fair amount of "crying wolf," or maybe we've just been lucky overall, so far. Either way, whether it's a rampant threat or just a common one doesn't matter that much. If you get hit by ransomware, it's gonna hurt your business, it's gonna hurt your patients, and it's gonna hurt you financially, both in dealing with the event and dealing with the legal aftermath, including potential fines, lawsuits, and reputational damage.
A German hospital was subject to a ransomware attack, and had to divert an incoming patient, who died en route to another hospital.
Hospital workers at Hennepin Healthcare got caught snooping on George Floyd's medical records, and were fired. That's the appropriate response.
Hat tip: Ron Holtsford
PS: sorry for the light posting of late -- having trouble even getting into Blogger.
Recent HIPAA news and notes: I should've posted this 2 months ago; left myself a note but lost it. Well, now I've rediscovered it.
Recently the US Cybersecurity and Infrastructure Security Agency joined with its counterparts from Australia, New Zealand, Canada, and the UK to issue a Joint Cybersecurity Advisory. The advisory highlights some technical approaches entities with sensitive data might take to prevent hackers and others from targeting them with malicious activity. You might find it useful.
Ransomware update: exfiltration is becoming common: I just read a very interesting article from the Crypsis Group on recent ransomware activity. I'm no techie, so the discussion of TTPs (Tactics, Techniques, and Procedures) was a little much for me, but the underlying takeaway was pretty disturbing: about a quarter of all ransomware attacks now also include data exfiltration. That dramatically increases the potential reputational harm; being unable to serve your customers because your data is locked up is embarrassing, but having your customers' data distributed "in the wild" is much worse. It also virtually ensures that reporting would be required if PHI is part of the data.
I've long held, despite OCR's original guidance (later softened), that a ransomware event that does not involve exfiltration of data is very unlikely to require reporting: the analogy would be that a person who changes the locks on your doors without disturbing the contents of your house isn't a thief, so a person who encrypts your data but doesn't look at or take it hasn't committed a "breach" under HIPAA. Most early ransomware variants did not exfiltrate data; the threat actors just wanted to hold your data hostage, not actually see or acquire it. Sure, that does result in a loss of "availability," which is a Security Rule issue, but it's not an unauthorized "acquisition, access, or use" for purposes of HIPAA's definition of breach (nor should it be). Your confidence in the lack of exfiltration must be virtually absolute, though; a tie goes to the runner, so if there was a reasonable possibility of exfiltration, you'd need to treat it as a breach. (And I don't need to remind you, this is not legal advice -- if you have questions about an incident you suffered, hire good HIPAA counsel.)
However, if exfiltration is going to be common, it's going to be hard not to report a ransomware attack.
Defino and Dissent: If you follow me on Twitter these days, you know I'm not too happy with the sorry performance of the vast majority of American media. I'm not alone: the American public actually trusts Donald Trump to provide honest information about Covid more than they trust the media to honestly report on it. That is shocking, and ultimately very troubling for our democracy.
But even though the media at large is full of "idiots," that doesn't mean there aren't some exceptions that prove the rule. Two of my favorite people in the "media" side of data privacy reporting are Theresa Defino, reporter for Report on Patient Privacy, and the mysterious "Dissent Doe," who can be found moderating and populating the website www.databreaches.net (subtitle: "The Office of Inadequate Security") and posting on Twitter under the handle @PogoWasRight.
Dissent achieved true HIPAA fame when she was recently named in an OCR settlement agreement -- of course, it was one involving the notorious hacker group The Dark Overlord, so it was a pretty interesting situation all around.
Anyway, Theresa had the opportunity to write up a profile of Dissent, which you can see here. Although I don't always agree with them, these are two good people in the HIPAA space.
More OCR fines for failure to provide access: As I noted earlier, OCR has been on a tear lately, fining covered entities for failing to grant patients access to their PHI. Last week, they announced their eighth access-related settlement this year, with Phoenix-based Dignity Health's St. Joseph Hospital paying a $160,000 fine.
In addition to laying down rules on when PHI can be used or disclosed, and rules on how PHI must be secured, HIPAA also grants individuals 6 specific rights with respect to covered entities. While the right to receive a Notice of Privacy Practices is really the most important (it's the disclosure of the rules of the road that the covered entity will abide by), the second most important is probably access. With a few carefully carved exceptions, patients have the right to access their PHI if it's held by a covered entity. The covered entity may have the right to own and control its own business records, but the information contained in those records also belongs to the patient. Covered entities who jealously guard the information and "block" it from being obtained when needed might also have issues under the recent data-blocking rules. More to come on that front. . . .
UPDATE: Number 9: NY Spine Medicine pays $100,000 fine for failure to provide a patient with timely access to her medical records. These are substantial fines for doing what are pretty stupid things.
UPDATED again to fix the link
Why are people asking if HIPAA protects President Trump? Because the country is overrun by idiots and members of the press (but I repeat myself) who think laws should give protection to people they don't like.
There was much talk about this earlier in the pandemic, but it's clear that HIPAA allows covered entities to notify 911 operators and other first responder entities of the identity of Covid patients, so that the responders can protect themselves and others. OCR even issued guidance regarding the matter. States responded differently, some with greater disclosures and others (Tennessee, for example) with more restrictions. Louisiana was one of the more freely-disclosing states, but apparently they have recently stopped the flow of information.
Of course, there's a legitimate question about whether sharing that information is really necessary: it could help protect both the first responder and anyone else the infected individual came into contact with (imagine an infected patient being put in a crowded jail instead of isolation, because the police didn't know they had Covid), but it comes at the potential cost to individual liberty of an invasion of privacy.
Of course, that's the same argument about masks. In both cases, it should be a balancing act, but certain people are guns-out in favor of protecting liberty in one instance and equally guns-out in favor of government overrunning individual liberty in the other.
If you're adamantly pro-mask and adamantly anti-sharing-data-with-first-responders, you should at least recognize the inherent inconsistency.
Apparently all of UHS's 250 hospitals were affected by the big malware attack.
CHSPSC is Community Health System's management service organization, which provides business management, IT, and HIM services to hospitals and physician practices. That makes them a Business Associate. They got hacked via an advanced persistent threat (APT) attack by a hacker group in 2014, and the hackers got access to and absconded with PHI on over 6 million patients. The FBI reported it to CHSPSC in April, but they didn't get the hack fully shut down until August. Guess what? No risk analysis, no info systems activity review, insufficient access controls (the hackers got admin access, so this one isn't necessarily fair, but the lack of activity auditing woulda cured this), and insufficient security incident procedures. Fine: $2,300,000.
UPDATE: As is usually the case these days, reportable data breaches under HIPAA are also state-law data breaches, subject to fines from state attorneys general. Such is the fate of Community Health System and its management company, CHSPSC. Fine to the state AGs: $5 million.
Premera, the biggest insurer in Washington and Alaska, suffered a phishing attack that managed to install advanced persistent threat malware, resulting in the breach of PHI of over 10 million people, including social security numbers, bank account numbers, and health information. Being a victim isn't a HIPAA problem, unless you become a victim through your own fault. Here, Premera had not conducted an enterprise-wide risk analysis, and had no risk management plan. Those are the facts that account for the size of the fine, not the fact that hackers got in (although, if they had a risk analysis and risk management plan, they might've limited the damage from the hack, or even prevented it).
I should've noted this Monday when I found out, but news came out this week of a big fine for a HIPAA breach. Athens Orthopedic first heard from a journalist from www.databreaches.net (that journalist would be my friend, the inestimable Dissent Doe, also known as @PogoWasRight on Twitter) that a notorious hacker group, which goes by the handle TheDarkOverlord or TDO, had access to their patient records and was pulling out data and selling it. TDO promptly followed up with a ransom demand.
So why the big fine? Athens Orthopedic had not done a risk analysis and had no HIPAA policies and procedures in place. Would a risk analysis and cybersecurity plan have kept TDO out? We'll never know for sure, but it might have, and that's enough.
How's your cybersecurity? Go grab a copy of your last risk analysis. Is it over a year old? Might want to consider an update. What do you mean you can't find it? You're sure you did one but just can't locate it? That won't fly with OCR. Got an extra million bucks for a fine?
A couple of news items from earlier this week point out how cybersecurity and ransomware are particularly problematic for the healthcare industry:
Blackbaud is (was?) one of the nation's largest service vendors to charitable institutions, helping them manage their donor lists and fundraising efforts. They were subject to a ransomware attack that might've hit the mother lode of data, mainly on donors to these charities, but also on some of the beneficiaries and/or customers of the charities. Obviously, some non-profit healthcare institutions were likely to get caught up in the mess, and the dominoes are starting to fall: Minnesota Children's (160,000 donors/patients) and Allina Health (200,000) have reported that they are victims.
ONC has announced updates to the Security Risk Assessment framework that OCR encourages HIPAA covered entities to use in conducting their risk assessments. Remember, conducting a risk assessment is a required Security Rule safeguard; since you gotta do it, you might as well do it right. I highly recommend poking around in the tool, even if you aren't actually doing an assessment, because it makes you think about your own data security. Very useful help, especially from a bureaucracy.
Yesterday OCR announced 5 new settlements involving covered entities that failed or refused to provide patients with access to their PHI, as required by HIPAA.
In addition to restrictions on uses and disclosures of PHI, HIPAA also grants individuals 6 rights with regard to their PHI. While the capstone is the right to receive a Notice of Privacy Practices (an explicit recitation of the "rules of the road" that the covered entity must comply with), the second-most important is the right of individuals to access their own PHI.
In my opinion, the OCR statement is good in a number of ways. First, it gives some specifics of how the various entities failed; several had multiple opportunities to fix the problem without paying a fine but failed to take effective action. Second, the fines are reasonable, given the crimes. Too often, OCR hits only a few offenders and levies astronomical fines, in the apparent hope that others will learn by example; I think they would do better with more, but lower dollar, fines.
I don't particularly agree with either the premise or the conclusion of this WSJ article (probably paywall protected). HIPAA works very well for what it does. It's not an all-health-information-gets-privacy law, because that's unworkable and unreasonable. People exchange health information all the time. A common greeting is "How are you?", which is a question about your health. If I see you walk with a limp or with an arm in a sling, or even just looking pale, your appearance has conveyed health information to me. Some information about you that's not directly related to healthcare can contain bits of health information (what you buy at the grocery store or order at a restaurant says something about your health).
Sensibly, and consistent with American jurisprudential practice, HIPAA only tried to govern the specific area where privacy of health information is and should be protected -- within the healthcare system. Is this new regulatory scheme going to try to govern every exchange of health information?
Do you have questions about the HIPAA impact of the use of mobile health apps? Can you/should you use one? Which ones to choose? Is the app provider your business associate? How does the use of an app implicate your obligations to provide access, amendment, and an accounting of disclosures?
Well, OCR is actually going to help you out (a little) with a page dedicated to healthcare apps. They can't answer all your questions (some are just "it depends" or "you need to investigate and decide for yourself"), but there is a lot of good information that will help guide you as you consider new technologies and solutions.
Interestingly, as this article by Sidley points out, through the first 3 quarters of 2020, it appears that OCR has only issued 3 major settlements involving HIPAA, all of which involve Security Rule issues. All involved breaches: one stolen laptop, one hacked email (phishing, I'm sure), and one settlement that could've been avoided if the provider had simply accepted the help OCR offered (see the Children's Medical Center of Dallas settlement of a few years ago for a similar example of failing to grab the proffered lifeline).
Why so few? You'd have to ask OCR, but I think the pandemic is the primary cause. First, the pandemic and the response to it have required creative solutions, and OCR is likely trying to tread lightly and grant lots of leeway to those who are trying to do good but fall short. Also, due to the pandemic and preparations such as ventilator rationing strategies and other potential overflow triaging, OCR's current focus has been on the "civil rights" side of its mission -- making sure those rationing and triaging strategies don't violate the civil rights of certain vulnerable populations. Regardless, barring egregious circumstances, I think OCR will continue to eschew the whip hand, and offer a helping hand instead.
Interesting article highlighting recent actions by several large radiology organizations. Recent technological advances have made optical character recognition (OCR) (*uh, not that OCR) more pervasive. This is the technology that allows you to search a PDF for a particular word. OCR wasn't originally smart enough to use on images (the program would spend so much time trying to find words in someone's face that it would bog down before it got to "Hello, my name is Bob" on the name tag), but apparently it's gotten better.
Because it was so unlikely that anyone would try to scan images for text, radiologists and others haven't been as careful with where they store and transmit medical images as they are with medical documents. Now that OCR is available at scale, and can be operated through a search engine, the tiny patient name or other identifier in the corner of an x-ray might be much more easily discoverable.
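One mitigation (beyond locking down where images are stored) is masking the burned-in identifier before an image ever leaves a secure system. Here's a minimal, purely illustrative sketch of that idea in Python, using a plain grid of numbers to stand in for image pixels; real de-identification tools locate the text with OCR rather than assuming a fixed corner, and the function name and region here are my own invention, not any vendor's API:

```python
def redact_region(pixels, rows, cols):
    """Zero out a rectangular region of a pixel grid -- e.g., the
    corner of an x-ray where a burned-in patient name typically
    appears. Illustrative only; assumes the identifier's location
    is known in advance."""
    redacted = [row[:] for row in pixels]  # copy so the original is untouched
    for r in range(*rows):
        for c in range(*cols):
            redacted[r][c] = 0  # black out this pixel
    return redacted

# A tiny stand-in "image": an 8x8 grid of mid-gray pixels.
image = [[128] * 8 for _ in range(8)]

# Black out the top-left 2x4 block where a name tag might sit.
clean = redact_region(image, rows=(0, 2), cols=(0, 4))
print(clean[0][0])   # identifier region is now 0 (black)
print(clean[5][5])   # diagnostic content is still 128 (untouched)
```

The point of the copy-then-overwrite approach is that the original image is preserved inside the secure system while only the redacted copy travels; whether that satisfies HIPAA de-identification in any given case is, of course, a question for counsel.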