More enforcement discretion: HHS announces that OCR will exercise enforcement discretion with respect to providers who use on-line or web-based scheduling applications in good faith to schedule individual patients for the Covid vaccine. This is in line with earlier Covid-related regulatory relief. On-line and web-based scheduling platforms have been the source of some HIPAA breaches, and if they aren't set up right, they can be problematic. Just like Zoom, FaceTime, and the like. But for the same reason, the benefits of speed, low contact, and easy accessibility in the time of Covid are worth the risks.
Interestingly, the same day that the Fifth Circuit kicked out a $4.3 million fine against MD Anderson, Excellus BC/BS in upstate NY agreed to a $5.1 million settlement with OCR. Granted, the Excellus breach was much, much bigger and lasted much longer, but it's still a little curious; I wonder if Excellus has "settler's remorse" now?
MD Anderson actually won. At least at the 5th Circuit. I'll want to read the opinion before I can predict whether OCR will appeal to the Supreme Court, but I think it's likely they will. So, keep in mind that I'm operating a little in the dark here, but would you like my initial take?
Here's the chronology:
Between 2011 and 2015, MD Anderson lost one laptop and two flash drives (actually, the laptop was stolen in a home burglary, and the flash drives were lost by an intern and a visiting research physician). The media contained research-related ePHI of 35,000 patients involved in Anderson's research projects. In 2006, Anderson had adopted policies requiring encryption of ePHI, but neither the laptop nor the flash drives were encrypted.
Anderson reported the incidents in 2012 and 2013, triggering an investigation by OCR. OCR stated that they tried to reach an informal resolution with Anderson over the course of their investigation, but were unable to do so. I don't have any inside detail, but it sounds like Anderson might've ignored or rebuffed OCR's outreach efforts, just as Children's Medical Center in Dallas did.
Since Anderson and OCR did not reach a settlement agreement, in March 2017, OCR issued a "Notice of Proposed Determination" in which it imposed a $4,348,000 fine for multiple HIPAA violations, including failure to encrypt (encryption itself is an addressable implementation specification, not a required one, but given Anderson's 2006 policies, they internally addressed it and determined that it was necessary). Anderson challenged the proposed determination, which sent the matter to an Administrative Law Judge. Anderson's defenses included that encryption was not required (cf. their own policies), that the information was for research and so not covered by HIPAA (it's still PHI, and Anderson is still a covered entity), that no known harm was determined to have come to any of the affected individuals (you still get a ticket even if your reckless driving doesn't cause any accidents), and that OCR lacked the authority to levy fines against state agencies (HIPAA specifically applies to Medicare and Medicaid, and OCR has fined plenty of governmental entities). They also argued that the fines were unreasonable (now, that's an argument I can buy). They later specifically argued that the fines violated the 8th Amendment to the Constitution, which prohibits "excessive fines."
The ALJ upheld the penalty, in relatively harsh words, in June 2018. Anderson appealed inside the administrative law system, to the Departmental Appeals Board, which upheld the ALJ's award. Anderson also appealed to the federal court system, seeking a determination that OCR's fine was unreasonable and beyond the authority of OCR to impose. In April 2019, OCR issued guidance, and a Notification of Enforcement Discretion, indicating that it now believed that lower fine limits were applicable; Anderson appealed the DAB ruling to the Fifth Circuit, adding the fine limits to its arguments against the penalty.
In the HITECH Act, Congress authorized OCR to levy higher penalties; however, as with much of the language in the shoddily-drafted and hastily-passed ARRA (also known as the Stimulus Bill [or "porkulus" if you're a deficit hawk], of which HITECH is a part), the penalty language is poorly drafted. While the Omnibus Rule (passed by Obama's HHS) included adoption of the apparent new higher limits, the Notice of Enforcement Discretion (passed by Trump's HHS) finally recognized this, and instituted a tiered system of penalties based on culpability. While the Notice of Enforcement Discretion could be read as forward-looking only, its underlying rationale gave Anderson a good toe-hold to fight the fines against it (in my opinion, the only really good argument they had).
Ultimately, the Fifth Circuit determined that OCR's fine was "arbitrary, capricious, and contrary to law;" even OCR has acknowledged that it can no longer defend the portion of the fine in excess of $450,000, under the rationale in the Notice of Enforcement Discretion. The court did not rule on Anderson's argument that it is not a "person" under HIPAA because it is a state agency (if the court had sided with Anderson, that would've made an appeal to the Supreme Court by OCR much more likely).
Obviously, I'll chime back in once I read the actual ruling, if that changes any of the above.
The HIPAA Security Rule requires covered entities to adopt safeguards to protect PHI. To be specific, the Security Rule (mirroring general data privacy principles) requires covered entities to adopt three types of safeguards (administrative, physical, and technical) to protect three PHI qualities (confidentiality, integrity, and availability). Thus, a HIPAA covered entity must structure its operations so that PHI remains confidential, is not distorted, and is available when needed. If the covered entity uses a cloud provider to host its PHI and data operations, the covered entity must be sure the PHI (and its business operations with respect to the PHI) will be confidential, not manipulated, and available.
One of the biggest data security risks these days is ransomware. The primary problem with ransomware is that it impacts the availability of PHI (ransomware that exfiltrates data also hurts confidentiality, and the scrambling effect of ransomware technically is an integrity problem, but that's the least of your worries). When a basic ransomware attack occurs, it's a security risk because it prevents the covered entity from having access to its data, which prevents it from using the data in a way that helps the patient. It's an availability issue. That should seem obvious.
You are probably aware that last week there was trouble in DC blamed on Trump (I'm not going to get into a fight with you about whether Trump was to blame, nor whether this was a riot, an insurrection or a "mostly peaceful" protest that got out of hand), and Trump was kicked off Twitter. You may or may not be aware that Twitter is considered to be hostile to conservatives and solicitous of liberals (Ayatollah Khamenei's Twitter account remains active). You may or may not be aware that there are a couple of alternatives to Twitter, namely Parler and Gab, with Parler being the most preferred by conservatives angry with Twitter (full disclosure, I have both a Twitter and a Parler account, and may have a Gab account as well, I can't remember).
Importantly, Parler touts itself as a free speech site. It claims to make efforts to remove accounts that actually incite violence, but it does not regulate speech as heavily as Twitter or Facebook, much less clearly target conservative voices as Twitter and Facebook do. Thus, Parler has come into favor with conservatives seeking an alternative to other outlets that certainly appear (if they are not actually) much more hostile to conservative viewpoints. As far as I know, nobody has accused Parler of actually promoting or espousing "bad" opinions (nor is there the least bit of evidence of that), just of failing to police "bad" actors. Again, which may or may not be true.
What you may not be aware of is that Twitter, Amazon, and Apple appear to have initiated a concerted effort to knock Parler off the air. Specifically, Apple has kicked Parler out of the Apple app store (on the day that Parler was the most-downloaded app). But more relevant here, Amazon Web Services (AWS), the cloud hosting site run by Amazon where Parler's data was stored, kicked Parler off the system and locked the company out of its data. Amazon defended its decision to freeze out Parler, blaming Parler for abetting the trouble in DC. Parler is currently dark and non-operational, its business entirely halted while it tries to find another cloud provider to host it.
AWS has made a subjective value-based judgment that Parler is dangerous and should be shut down, because Parler is used by people that AWS deems to be dangerous. AWS has shut a large customer out of its operations because AWS does not approve of the customer's customers.
AWS could make the same determination regarding Parler's law firm. AWS could make the same determination regarding a law firm representing the people who post on Parler, who AWS has determined are so dangerous that Parler must be shut down for hosting them. AWS could make the same determination regarding a healthcare provider who provided care to those people. AWS could make the same determination regarding an insurer that offered health plans to those people.
It's not a far stretch to think that a healthcare system in a red state would be at risk of being shut out of AWS, because its patients are the types of people AWS associates with Parler. It's certainly not a stretch to think that AWS could shut down cloud access to a health plan for a gun manufacturer. Oil companies, Catholic charities, beef farmers, anyone not liberal is at risk.
"Oh, come on," you say, "these are odious people on Parler, all good people would agree they are terrible folks and deserve to be shunned." Well, wait until it happens to you. Once your vendors start making value judgments (and "picking sides," which is what they're doing), all bets are off.
There's no avoiding the obvious conclusion here: if you use AWS cloud services, you run the risk of AWS shutting you out of operations if AWS decides it does not like the patients or beneficiaries you serve.
Thus, as a HIPAA covered entity, you fail to ensure "availability" of PHI if you use AWS. HIPAA requires you to have reasonable safeguards to protect availability of PHI; if you are hosted by AWS and get shut out, your PHI is no longer available; it's not reasonable to not protect against that possibility.
Final result: using AWS may be a violation of HIPAA, because it poses an unreasonable risk to availability.
First OCR settlement of 2021 continues a trend started in 2020: fining a covered entity for failing to provide an individual with access to his/her medical records. HIPAA provides 6 primary rights to individuals, one of which is the right to access their PHI. This has been a focus for OCR, and we start 2021 with Banner Health paying $200,000 for failing to provide a patient with timely access to her PHI on 2 different occasions.
Keep in mind that the HHS and ONC data blocking rules provide a parallel obligation to provide patients with access to their PHI (at least for providers, and likely for plans that use electronic medical records as well). So, failing to provide access can get you in trouble a couple of different ways. Many of these access issues are process problems and not conscious efforts to keep data away from patients (although some might be punitive), but that won't matter when OCR decides to fine you.
Now's a good time to start looking at your medical records office and how seamlessly they get requested records out. Don't send out what you shouldn't, but don't hold back what you should be sending.
As I noted in early December, I had back surgery, and was out for a couple of weeks. Follow that up with the Christmas and New Years holidays, making for a couple of 3-day weeks, plus the fact that I've been less than optimal physically due to the back pain that precipitated the surgery, and I've fallen behind in my blogging duties. I'll try to catch up this month, and here's my first installment of "what I shoulda told you a couple of months ago." And don't worry, I'll report on the NPRM soon enough; suffice it to say, it's small potatoes, but if it becomes final, you'll have some paperwork to do.
Aetna Settles 3 breaches from 2017 for $1,000,000: These included (i) PHI-containing web services that were internet accessible without passwords or credentials; (ii) a mailing to HIV patients using window envelopes that allowed the words "HIV medication" to show through the window, and (iii) another mailing to research participants with the name and logo of the research study on the envelope. The resolution agreement and action plan, which includes an implementation report and 2 annual reports, are here.
Ransomware and other security incidents are on the rise, and disproportionately affecting healthcare entities: The FBI specifically warned of a wave of cyber-attacks directed at the healthcare industry. It's Russians using Ryuk. And the potential for such an attack being lethal was made clear in September, when a German hospital suffered a hacking incident and a patient died as a result. All of this came out about the same time as news of the particular vulnerability of IoT-connected medical devices. As far as we know, other than in movies and spy novels, nobody's hacked the pacemaker or insulin pump of a corporate executive or politician and demanded ransom, but it's clearly possible.
PACS server vulnerabilities: I already discussed the PACS system issue, but if you want to read some inside baseball on this, here's a researcher discussing how he was able to access petabytes of x-ray, MRI, and CT images, without hacking, over the internet. It's not for the faint of heart -- he showed that not only could you view images and steal data, you could upload fake images to these PACS systems. Hat tip: Joel Lytle.
City of New Haven (CT) fined for failure to terminate former employee's access to PHI: The city's health department, which operates a clinic, didn't remove a former employee's credentials to access its medical records, and the employee snooped into about 500 files before being discovered. The PHI included STD test results (lovely). The city had not done a risk analysis (of course), and under the resolution agreement paid a fine of over $200,000.
That's enough for now, more later.
It's a relatively small fine ($36,000), but I suspect Elite Primary Care would rather keep the money than pay it. OCR recently issued its 13th enforcement action against a covered entity for failing to provide patients with access to their medical records. The provider is located in southeast Georgia. The Resolution Agreement is here.
OK, new HIPAA regs dropped tonight. Unfortunately, I'll be having surgery tomorrow, so it'll be sometime next week before you get my analysis. I'm sure it'll be worth the wait.
Of course, feel free to peruse the 357 pages of regs yourself; I'd be happy to hear your interpretation as well.
It looks like Kalispell Regional is trying to settle a class-action lawsuit against it related to a 2019 breach involving 130,000 patients. Hackers got in via phishing emails, and were in the system for months before the hospital noticed. 250 Social Security Numbers were stolen. The incident resulted in a suit by a victim alleging that KRH failed to take reasonable steps to prevent the hack; the proposed settlement has a dollar amount of $2.4 million.
What makes this interesting is that class action lawsuits as the result of data breaches usually crash in flames. It's hard to prove damages, each victim is victimized in a somewhat different way and has different damages, and other factors make these tough for plaintiff's lawyers to cash in on.
But don't be fooled by the headline: This is just the establishment of a fund to potentially pay out up to that amount. The only things to be paid are actual provable damages (which are hard to find and prove), plus up to 5 hours of your own time (at $15/hour, so a max of $75) spent dealing with the mess. Ultimately, KRH will spend a lot less than $2.4 million.
Ransomware and Cybersecurity Risks are High During the Pandemic: Despite all the news and warnings, I'm not seeing a huge increase in ransomware attacks; maybe it's happening but I'm not seeing or hearing about it, maybe it's a fair amount of "crying wolf," or maybe we've just been lucky overall, so far. Whatever the case, whether it's a rampant threat or just a common one doesn't matter that much. If you get hit by ransomware, it's gonna hurt your business, it's gonna hurt your patients, and it's gonna hurt you financially, both in dealing with the event and in dealing with the legal aftermath, including potential fines, lawsuits, and reputational damage.
A German hospital was subject to a ransomware attack, and had to divert an incoming patient, who died en route to another hospital.
Hospital workers at Hennepin Healthcare got caught snooping on George Floyd's medical records, and were fired. That's the appropriate response.
Hat tip: Ron Holtsford
PS: sorry for the light posting of late -- having trouble even getting into Blogger.
Recent HIPAA news and notes: I should've posted this 2 months ago; left myself a note but lost it. Well, now I've rediscovered it.
Recently the US Cybersecurity and Infrastructure Security Agency joined with its counterparts from Australia, New Zealand, Canada and the UK to issue a Joint Cybersecurity Advisory. The advisory highlights some technical approaches entities with sensitive data might take to prevent hackers and others from targeting them with malicious activity. You might find it useful.
Ransomware update: exfiltration is becoming common: I just read a very interesting article from the Crypsis Group on recent ransomware activity. I'm no techie, so the discussion of TTPs (tactics, techniques and procedures) was a little much for me, but the underlying takeaway was pretty disturbing: about a quarter of all ransomware attacks now also include data exfiltration. That dramatically increases the reputational harm that's possible; being unable to serve your customers because your data is locked up is embarrassing, but having your customers' data distributed "in the wild" is much worse. But it also virtually ensures that reporting would be required if PHI is part of the data.
I've long held, despite OCR's original guidance (later softened), that a ransomware event that does not involve exfiltration of data is very unlikely to require reporting: by analogy, a person who changes the locks on your doors without disturbing the contents of your house isn't a thief, so a person who encrypts your data but doesn't look at or take it hasn't committed a "breach" under HIPAA. Most early ransomware variants did not exfiltrate data; the threat actors just wanted to hold your data hostage, not actually see or acquire it. Sure, that does result in a loss of "availability," which is a Security Rule issue, but it's not an "unauthorized acquisition, access, or use" for purposes of HIPAA's definition of breach (nor should it be). Your confidence level of lack of exfiltration must be virtually absolute, though; a tie goes to the runner, so if there was a reasonable possibility of exfiltration, you'd need to treat it as a breach. (And I don't need to remind you, this is not legal advice -- if you have questions about an incident you suffered, hire good HIPAA counsel.)
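The reasoning above can be sketched as a toy decision rule. This is purely illustrative (the function name and the numeric confidence threshold standing in for "virtually absolute" are my own inventions), and it is emphatically not legal advice:

```python
# Toy sketch of the ransomware breach-reporting reasoning above.
# NOT legal advice; the names and threshold are illustrative only.

def breach_report_likely(exfiltration_possible: bool,
                         no_exfiltration_confidence: float) -> bool:
    """Return True if a ransomware incident should presumptively be
    treated as a reportable breach under the reasoning above.

    exfiltration_possible: any indication that data left the network.
    no_exfiltration_confidence: 0.0-1.0, how certain the forensic review
        is that NO data was viewed or taken ("virtually absolute").
    """
    # A tie goes to the runner: any reasonable possibility of
    # exfiltration means you treat the incident as a breach.
    if exfiltration_possible:
        return True
    # Even absent direct evidence, confidence must be near-absolute.
    return no_exfiltration_confidence < 0.99

# Pure lock-up, forensics virtually certain nothing left: likely no report.
print(breach_report_likely(False, 0.995))  # False
# Any sign of exfiltration: treat it as a breach.
print(breach_report_likely(True, 0.995))   # True
```

The point of the sketch is the asymmetry: a single "maybe" on exfiltration flips the answer, no matter how good the rest of your forensics look.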
However, if exfiltration is going to be common, it's going to be hard not to report a ransomware attack.
Defino and Dissent: If you follow me on Twitter these days, you know I'm not too happy with the sorry performance of the vast majority of American media. I'm not alone: the American public actually trusts Donald Trump to provide honest information about Covid more than they trust the media to honestly report on it. That is shocking, and ultimately very troubling for our democracy.
But even though the media at large is full of "idiots," that doesn't mean there aren't some exceptions that prove the rule. Two of my favorite people on the "media" side of data privacy reporting are Theresa Defino, reporter for Report on Patient Privacy, and the mysterious "Dissent Doe," who can be found moderating and populating the website www.databreaches.net (subtitle: "The Office of Inadequate Security") and posting on Twitter under the handle @PogoWasRight.
Dissent achieved true HIPAA fame when she was recently named in an OCR settlement agreement -- of course, it was one involving the notorious hacker group The Dark Overlord, so it was a pretty interesting situation all around.
Anyway, Theresa had the opportunity to write up a profile of Dissent, which you can see here. Although I don't always agree with them, these are two good people in the HIPAA space.
More OCR fines for failure to provide access: As I noted earlier, OCR has been on a tear lately, fining covered entities for failing to grant patients access to their PHI. Last week, they announced their eighth access-related settlement this year, with Phoenix-based Dignity Health's St. Joseph Hospital paying a $160,000 fine.
In addition to laying down rules on when PHI can be used or disclosed, and rules on how PHI must be secured, HIPAA also grants individuals 6 specific rights with respect to covered entities. While the right to receive a Notice of Privacy Practices is really the most important (it's the disclosure of the rules of the road that the covered entity will abide by), the second most important is probably access. With a few carefully-carved exceptions, patients have the right to access their PHI if it's held by a covered entity. The covered entity may have the right to own and control its own business records, but the information contained in those records also belongs to the patient. Covered entities who jealously guard the information and "block" it from being obtained when needed might also have issues under the recent data-blocking rules. More to come on that front. . . .
UPDATE: Number 9: NY Spine Medicine pays $100,000 fine for failure to provide a patient with timely access to her medical records. These are substantial fines for doing what are pretty stupid things.
UPDATED again to fix the link
Why are people asking if HIPAA protects President Trump? Because the country is overrun by idiots and members of the press (but I repeat myself) who think laws should give protection to people they don't like.
There was much talk about this earlier in the pandemic, but it's clear that HIPAA allows covered entities to notify 911 operators and other first responder entities of the identity of Covid patients, so that the responders can protect themselves and others. OCR even issued guidance regarding the matter. States responded differently, some with greater disclosures and others (Tennessee for example) with more restrictions. Louisiana was one of the more freely-disclosing states, but apparently they have recently stopped the flow of information.
Of course, there's a legitimate question about whether sharing that information is really necessary: it could help protect both the first responder and anyone else the infected individual came into contact with (imagine an infected patient being put in a crowded jail instead of isolation because the police didn't know they had Covid), but it comes at a potential cost to individual liberty: an invasion of privacy.
Of course, that's the same argument about masks. In both cases, it should be a balancing act, but certain people are guns-out in favor of protecting liberty in one instance and equally guns-out in favor of government overrunning individual liberty in the other.
If you're adamantly pro-mask and adamantly anti-sharing-data-with-first-responders, you should at least recognize the inherent inconsistency.
Apparently all of UHS's 250 hospitals were affected by the big malware attack.
CHSPSC is Community Health System's management service organization, which provides business management, IT, and HIM services to hospitals and physician practices. That makes them a Business Associate. They got hacked by an APT from a hacker group in 2014, and the hackers got access to and absconded with PHI on over 6 million patients. The FBI reported it to CHSPSC in April, but they didn't get the hack fully shut down until August. Guess what? No risk analysis, no info systems activity review, insufficient access controls (the hackers got admin access, so this one isn't necessarily fair, but the lack of activity auditing woulda cured this), and insufficient security incident procedures. Fine: $2,300,000.
UPDATE: As is usually the case these days, reportable data breaches under HIPAA are also state-law data breaches, subject to fines from state attorneys general. Such is the fate of Community Health System and its management company, CHSPSC. Fine to the state AGs: $5 million.
Premera, the largest health insurer in the Pacific Northwest (serving Washington and Alaska), suffered a phishing attack that managed to install advanced persistent threat malware, resulting in the breach of PHI of over 10 million people, including social security numbers, bank account numbers, and health information. Being a victim isn't a HIPAA problem, unless you become a victim by your own fault. Here, Premera had not conducted an enterprise-wide risk analysis, and had no risk management plan. Those are the facts that account for the size of the fine, not the fact that hackers got in (although, if they had a risk analysis and risk management plan, they might've limited the damage from the hack, or even prevented it).
I should've noted this Monday when I found out, but news came out this week of a big fine for a HIPAA breach. Athens Orthopedic first heard from a journalist from www.databreaches.net (that journalist would be my friend, the inestimable Dissent Doe, also known as @PogoWasRight on Twitter) that a notorious hacker group, that goes by the handle TheDarkOverlord or TDO, had access to their patient records and was pulling out data and selling it. TDO promptly followed up with a ransomware demand.
So why the big fine? Athens Orthopedic had not done a risk analysis and had no HIPAA policies and procedures in place. Would a risk analysis and cybersecurity plan have kept TDO out? We'll never know for sure, but it might have, and that's enough.
How's your cybersecurity? Go grab a copy of your last risk analysis. Is it over a year old? Might want to consider an update. What do you mean you can't find it? You're sure you did one but just can't locate it? That won't fly with OCR. Got an extra million bucks for a fine?
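Tongue-in-cheek, that first check is simple enough to automate. A minimal sketch, assuming your risk analysis lives in a file somewhere (the path, function name, and one-year threshold are my own illustration, not an OCR standard):

```python
import os
import time

def risk_analysis_stale(path: str, max_age_days: int = 365) -> bool:
    """True if the risk-analysis document at `path` is missing entirely
    (which is its own problem) or older than max_age_days."""
    if not os.path.exists(path):
        return True  # "what do you mean you can't find it?"
    age_days = (time.time() - os.path.getmtime(path)) / 86400
    return age_days > max_age_days

# Hypothetical usage:
#   if risk_analysis_stale("compliance/risk_analysis.docx"):
#       print("time to schedule an update -- before OCR asks")
```

Obviously a file's modification date is a crude proxy; the real question is whether the analysis still reflects your current environment. But if the script returns True, you already know the answer.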
A couple of news items from earlier this week point out how cybersecurity and ransomware are particularly problematic for the healthcare industry:
Blackbaud is (was?) one of the nation's largest service vendors to charitable institutions, helping them manage their donor lists and fundraising efforts. They were subject to a ransomware attack that might've hit the mother lode of data, mainly on donors to these charities, but also on some of the beneficiaries and/or customers of the charities. Obviously, some non-profit healthcare institutions were likely to get caught up in the mess, and the dominoes are starting to fall: Minnesota Children's (160,000 donors/patients) and Allina Health (200,000) have reported that they are victims.
ONC has announced updates to the Security Risk Assessment framework that OCR encourages HIPAA covered entities to use in conducting their risk assessments. Remember, conducting a risk assessment is a required Security Rule safeguard; since you gotta do it, you might as well do it right. I highly recommend poking around in the tool, even if you aren't actually doing an assessment, because it makes you think about your own data security. Very useful help, especially from a bureaucracy.
Yesterday OCR announced 5 new settlements involving covered entities that failed or refused to provide patients with access to their PHI, as required by HIPAA.
In addition to restrictions on uses and disclosures of PHI, HIPAA also grants individuals 6 rights with regard to their PHI. While the capstone is the right to receive a Notice of Privacy Practices (an explicit recitation of the "rules of the road" that the covered entity must comply with), the second-most important is the right of individuals to access their own PHI.
In my opinion, the OCR statement is good in a number of ways: first, it gives some specifics of how the various entities failed (several had multiple opportunities to fix the problem without paying a fine but failed to take effective action); second, the fines are reasonable, given the crimes. Too often, OCR hits only a few offenders and levies astronomical fines, in the apparent hope that others will learn by example; I think they would do better with more, but lower dollar, fines.
I don't particularly agree either with the premise or conclusion of this WSJ article (probably paywall protected). HIPAA works very well for what it does. It's not an all-health-information-gets-privacy law, because that's unworkable and unreasonable. People exchange health information all the time. A common greeting is, "how are you," which is a question about your health. If I see you walk with a limp or with an arm in a sling, or even just looking pale, your appearance has conveyed health information to me. Some information about you that's not directly related to healthcare can contain bits of health information (what you buy at the grocery store or order at a restaurant says something about your health).
Sensibly, and consistent with American jurisprudential practice, HIPAA only tried to govern the specific area where privacy of health information is and should be protected -- within the healthcare system. Is this new regulatory scheme going to try to govern every exchange of health information?
Do you have questions about the HIPAA impact of the use of mobile health apps? Can you/should you use one? Which ones to choose? Is the app provider your business associate? How does the use of an app implicate your obligations to provide access, amendment, an accounting of disclosures?
Well, OCR is actually going to help you out (a little) with a page dedicated to healthcare apps. They can't answer all your questions (some are just "it depends" or "you need to investigate and decide for yourself"), but there is a lot of good information that will help guide you as you consider new technologies and solutions.
Interestingly, as this article by Sidley points out, through the first 3 quarters of 2020, it appears that OCR has only issued 3 major settlements involving HIPAA, all of which involve Security Rule issues. All involved breaches: one stolen laptop, one hacked email (phishing, I'm sure), and one settlement that could've been avoided if the provider had simply accepted the help OCR offered (see the Children's Medical Center of Dallas settlement of a few years ago for a similar example of failing to grab the proffered lifeline).
Why so few? You'd have to ask OCR, but I think the pandemic is the primary cause. First, the pandemic and the response to it have required creative solutions, and OCR is likely trying to tread lightly and grant lots of leeway to those who are trying to do good but instead fail. Also, due to the pandemic and preparations such as ventilator rationing strategies and other potential overflow triaging, OCR's current focus has been on the "civil rights" side of its mission -- making sure those rationing and triaging strategies don't violate the civil rights of certain vulnerable populations. Regardless, barring egregious circumstances, I think OCR will continue to eschew the whip hand, and offer a helping hand instead.
Interesting article highlighting recent actions by several large radiology organizations. Recent technological advances have made optical character recognition (OCR) (*uh, not that OCR) more pervasive. This is the technology that allows you to search a PDF for a particular word. OCR wasn't originally smart enough to use on images (the program would spend so much time trying to find words in someone's face that it would bog down before it got to "Hello, my name is Bob" on the name tag), but apparently it's gotten better.
Because it was so unlikely that anyone would try to scan images for text, radiologists and others haven't been as careful with where they store and transmit medical images as they are with medical documents. Now that OCR is available at scale, and can be operated through a search engine, the tiny patient name or other identifier in the corner of an x-ray might be much more easily discoverable.
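To see how little effort this takes, here's a minimal sketch of scanning OCR-extracted text for burned-in identifiers. The patterns are my own crude illustrations (real PHI detection is far more involved), and the pytesseract call shown in the comment assumes that library and the Tesseract engine are installed:

```python
import re

# Illustrative patterns only -- real PHI detection needs far more than this.
PHI_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # SSN-shaped number
    re.compile(r"\bMRN[:# ]?\d{6,10}\b", re.I),  # medical record number
    re.compile(r"\b(name|patient)[: ]", re.I),   # burned-in name labels
]

def flags_phi(text: str) -> bool:
    """Return True if OCR-extracted text looks like it contains an identifier."""
    return any(p.search(text) for p in PHI_PATTERNS)

# With an OCR library such as pytesseract (assumed installed, along with
# the Tesseract engine), scanning an exported image might look like:
#
#   from PIL import Image
#   import pytesseract
#   text = pytesseract.image_to_string(Image.open("chest_xray_export.png"))
#   if flags_phi(text):
#       print("burned-in identifier found -- don't post this image")

print(flags_phi("PATIENT: DOE, JOHN  MRN 00123456"))  # True
print(flags_phi("AP VIEW  KVP 120  MAS 2.5"))         # False
```

The regex part is trivial; the point is that the OCR step, which used to be the hard part, is now a one-liner anyone can run against a pile of scraped images.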
- Patient gives consent. At the time of service, health care providers can obtain written consent from the patient authorizing the release of COVID-19 testing results directly to his or her employer. Unlike other treatment situations, a health care provider may even condition the performance of an employee test on the employee’s provision of an authorization (i.e., the provider may refuse to perform the exam unless the patient executes a valid authorization). See 45 CFR § 164.508(b)(4)(iii).
- Testing falls under HIPAA’s workplace medical surveillance exception. Health care providers may disclose health screening results directly to an individual’s employer when the service was provided at the employer’s request, and the employer needs the information to comply with legal obligations related to workplace health monitoring. The health care provider must provide the individual with written notice that the information will be disclosed to his or her employer at the time of the service and must limit the disclosure to the findings regarding the medical surveillance at issue. See 45 CFR 164.512(b)(1)(v).
- Testing paid for by employer. If the employer subsidizes COVID-19 testing for its employees, the employer may be entitled to information regarding the specific employees the provider tested and when the testing was conducted. However, this would not entitle the employer to the results of the testing.