[ Wednesday, December 13, 2017 ]
Jeff [7:22 AM]
The city had some sort of program providing services to citizens with HIV, and after the program terminated, the city shared information on 200 HIV patients with the University of Southern Maine to help determine if there were gaps in the way it provided the services, or if it could have operated the program better.
The city claims the data sharing did not violate HIPAA because it was for research purposes,
and it may be right, but probably only if USM had an institutional review board determine that the university program had enough protections in place that patient authorization was not required.
Nevertheless, the city has apologized. Perhaps not illegal, but perhaps not a good idea either.
[ Monday, December 11, 2017 ]
Jeff [5:43 PM]
[ Thursday, December 07, 2017 ]
Henry Ford Hospital Breach:
Jeff [12:58 PM]
Someone apparently phished the email credentials
of multiple employees. No word yet on what was accessed or if any of it was used inappropriately.
An Unintended Consequence of Data Breach Reporting? Patients are increasingly reluctant to share PHI with their own providers
Jeff [10:37 AM]
I've said many times that privacy exists on a continuum, particularly in regards to health information. On one end, you have perfect privacy, but that means no one (not your doctor, not your spouse, not your friends) has access to your health information. Obviously, the privacy is perfect, but you won't get healthcare unless you can do it yourself. At the other end is zero privacy: everyone knows every medical fact about everyone else. Here, you'd get great healthcare, since you could compare everyone's treatment experience to determine what would be best for you. And think of how far medical science could go with all that data.
At one end, great privacy and lousy healthcare; at the other, great healthcare but lousy privacy. I don't know about you, but I don't want to be at either end; I want to find the happy medium.
That's something healthcare regulators need to think about. Forcing publication of inconsequential breaches instills a false sense of risk and danger that is often more dangerous than the actual harm from the breach itself.
[ Tuesday, December 05, 2017 ]
New from OCR: Five steps
Jeff [3:33 PM]
to prevent insider data breaches.
[ Tuesday, November 28, 2017 ]
Jeff [11:16 AM]
I'm not technologically knowledgeable enough to know if this is a big deal or not, but if you use OpenEMR, you should definitely have your IT staff take a look at whether this alleged vulnerability
might affect you.
[ Sunday, November 26, 2017 ]
Are Changes Coming to the Wall of Shame?
Jeff [11:24 AM]
HHS is considering
shortening the listing period, and might make other changes. The website is a required element of the HITECH Act, so they can't delete it entirely. But they could (and probably will) make some changes. In addition to shorter listings, only including listings where the reporting entity was at fault, or at least allowing the entity to defend itself, would be useful improvements.
[ Wednesday, November 22, 2017 ]
Jeff [11:15 AM]
Thanksgiving is a good time to think about cybersecurity. Some great tips here
[ Thursday, November 02, 2017 ]
CyberThreat Information Sharing
Jeff [2:32 PM]
: HHS is publicly urging
healthcare industry participants to actively share cybersecurity threat information. Basically, they're urging healthcare players to utilize the benefits provided by CISA (the Cybersecurity Information Sharing Act of 2015) to allow threat information to be publicized across the industry, so players can respond and protect themselves and others. Not a bad idea at all.
[ Thursday, October 26, 2017 ]
Medical Device Cybersecurity:
Jeff [1:30 PM]
I tend to prefer an industry-driven approach
, like the House bill, over a top-down approach like the Senate bill.
[ Thursday, October 12, 2017 ]
Cloud-Based Blood Testing Information Breached: An Amazon cloud data repository
Jeff [12:04 PM]
for blood testing data managed by Patient Home Monitoring was not configured correctly, and a tech security company came across it. 300,000 PDFs accounting for about 150,000 people. Oops.
Using the cloud is OK, but only if you do it right. Be careful . . . .
[ Wednesday, September 27, 2017 ]
Jeff [12:45 PM]
Don't forget to vote for me for best "niche" legal blog. You can go vote here
Jeff [12:36 PM]
I'm not surprised, actually. This is a frightening headline: 73 Percent of Medical Professionals Share Passwords for EHR Access. If you're a medical resident, you've probably used the attending's login information, with the attending's consent.
So, it happens. A lot. But not a lot of bad comes out of it, since most (maybe virtually all) medical professionals do the right thing: access only what you need, access only for legitimate purposes, etc.
Still, even residents should have their own login information. You can't audit access if you have password sharing. And if something does go wrong, it could go very, very wrong, and it would be awfully difficult to fix post-facto.
Maybe it's really time for two-factor authentication in many more places.
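To make the two-factor idea concrete, here is a minimal sketch of the one-time-password math that authenticator apps use, per RFC 4226 (HOTP) and RFC 6238 (TOTP). This is illustrative only, using just the Python standard library; it is not a production authentication system.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    # HMAC-SHA1 over the counter packed as an 8-byte big-endian integer
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation: low nibble of last byte
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a 30-second window."""
    return hotp(secret, int(time.time() // step), digits)

# RFC 4226 test vectors for the shared secret "12345678901234567890"
secret = b"12345678901234567890"
print(hotp(secret, 0))  # → 755224
print(hotp(secret, 1))  # → 287082
```

The point of the design: the server and the user's device share a secret, but the code itself changes every 30 seconds, so a phished password alone (the Henry Ford scenario above) is not enough to log in.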
[ Tuesday, September 26, 2017 ]
Nichey? Or Special?
Jeff [2:26 PM]
Some of my blog readers nominated me for the Best Legal Blog Contest in the "Niche and Specialty" Category. If you feel so inclined, you can go vote here
[ Monday, September 18, 2017 ]
PeaceHealth Data Breach
Jeff [11:47 AM]
: another "employees behaving badly
" breach. Over about 5-6 years, the employee accessed about 2000 records he/she had no need to access. No apparent social security skimming, so not likely to be ID theft. Reading between the lines, that probably means your garden variety snooping. Bad but not horrible. However, the big question is how it took almost 6 years to notice it.
[ Wednesday, September 06, 2017 ]
Nurses behaving badly.
Jeff [11:38 AM]
I guess "Mr. Big" died. This is mildly humorous, but somehow I think the reaction would be outrage if the victim were female instead of male.
H/T Ron Holtsford.
[ Thursday, August 31, 2017 ]
More Window Envelope issues: now it's CVS
Jeff [12:19 PM]
with a problem letting PHI leak out envelope windows.
[ Tuesday, August 29, 2017 ]
Aetna HIV data breach:
Jeff [2:57 PM]
Well, that was fast
. Those class action lawyers can outrun an ambulance.
[ Friday, August 25, 2017 ]
The Trouble with Window Envelopes:
Jeff [2:05 PM]
It's nice to use envelopes where the address of the recipient is only printed on the page inserted into the envelope, but is visible through a window in the outer envelope. It saves costs, as well as reduces the possibility of a mismatch between the information in the insert and the information on the envelope (i.e., the wrong letter gets inserted into the wrong envelope).
However, if you're going to do so, make sure ONLY THE NAME AND ADDRESS show through the window. I think Aetna's gonna be in trouble for this.
. . .
[ Wednesday, August 23, 2017 ]
Cybersecurity Class Action Update:
Jeff [6:17 PM]
One interesting aspect of data breaches (whether HIPAA-related or not) is the potential for lawsuits from affected parties. Most times, injured individuals can't show monetary damages from a HIPAA breach, and that's particularly true in non-HIPAA breaches such as the Target or Home Depot data breaches, where any credit card fraud was covered by the credit card companies. (There are exceptions, of course, such as where a HIPAA breach causes harm that can be proven
). But the quest to show that the fear of future ID theft or other harm constitutes actionable damages is the holy grail of class action lawyers, looking to turn the millions of victims (each suffering only minor damages) into a single class so that they can collect on multiplied damages.
So far, it's been tough sledding: most courts deny that there are damages just because you're afraid someone might use your information in the future. That was recently upheld in this Scottrade case
. Some day, a court will allow these damages to constitute sufficient grounds for a class action lawsuit, but not yet.
[ Monday, August 21, 2017 ]
Jeff [8:35 AM]
[ Monday, August 14, 2017 ]
Women's Health Care (PA):
Jeff [10:24 AM]
A large Philadelphia-area ob/gyn practice has notified 300,000 patients of a potential data breach
. Not much news on what happened, but it was apparently a hack that penetrated the group's computer system; they don't know for sure if information was actually viewed or extracted, but the information subject to potential breach did include social security numbers (but apparently not much medical information). The report mentions backups, which makes me think this was probably a ransomware incident. The breach started in January 2017 but wasn't discovered until May 2017, and notifications didn't go out until July 2017. (Interestingly, in March the group merged with a NJ group to become the largest ob/gyn group in the country, now known as Axia Women's Health.)
[ Wednesday, July 26, 2017 ]
Wall of Shame:
Jeff [2:07 PM]
OCR is updating
its large data breach reporting website.
[ Thursday, July 20, 2017 ]
Peachtree Neurological (Atlanta):
Jeff [10:36 AM]
Peachtree Neurological was hit with ransomware
recently. Fortunately, (i) they were able to restore their systems without paying the ransom, and (ii) there was no evidence that the ransomware exfiltrated any data, thus likely giving them a good reason to determine that the ransomware incident did not constitute a reportable breach (yes, OCR, I'm talking to you).
However, in the course of investigating and responding to the ransomware attack, Peachtree uncovered a more unfortunate fact: some hacker had been camped out in their data for over a year. It does not look like they are able to tell what was accessed or if anything untoward was done, or if the hacker just had access and never did anything. But while the ransomware might not be reportable, this one pretty much definitely is.
Jeff [10:25 AM]
More on the ransomware virus
that disproportionately hit healthcare entities.
[ Thursday, July 13, 2017 ]
University of Iowa:
Jeff [12:29 PM]
Seems like a pretty minor breach
, but some names, admission dates, and medical records were available online.
[ Wednesday, July 12, 2017 ]
Employee Snooping Draws Criminal Charges (St. Charles Health System, Oregon):
Jeff [6:16 PM]
A nursing assistant looked at about 2,500 patients records
; no identity theft or fraud, apparently just idle curiosity. However, she's being charged with misdemeanor computer crimes. Sounds about right -- it makes a point of how such snooping is dealt with, without punishing her unnecessarily harshly.
[ Friday, June 30, 2017 ]
Jeff [10:03 AM]
A rural West Virginia hospital
is one of the headline victims of the most recent ransomware iteration, known as Petya (which follows closely on the heels of WannaCry, which had a built-in escape hatch that prevented it from causing too much damage). How do you protect yourself?
Don't pick up the virus. Easier said than done, but you can go a long way just through education of your staff. Almost all of these ransomware attacks come via phishing emails. Don't click, and teach your staff not to click.
Be prepared in case you get hit. If you do pick up the virus (and even the best-protected businesses could be a victim), there's still hope, as long as you're prepared in advance. That means you should do the following ASAP:
- Have good, constant, regular and redundant backups. If you're hit by ransomware and all your data is encrypted, but you can pull an exact second copy of the same data off the shelf, all the cyberattack will cost you is time and a little frustration. But make sure your backups are structured so that you don't end up deleting a good backup and making a backup of your already-encrypted data.
- Practice patch management. Some viruses are "zero-day" viruses, and you might be unlucky to get hit through a vulnerability that hasn't been patched yet. That is extremely, extremely unlikely, but if it happens, you should still be OK if you've done good backups. Most likely, there is a patch available for whatever vector the next ransomware wave exploits, and if you install patches regularly and aggressively, you'll likely avoid being a victim.
- Map your network. If you get hit, you'll need to find out where it came in so you know where to start the cleanup. But before you get hit, mapping might uncover some breaches in your defenses that you can fix now, and that, in and of itself, might prevent you from being victimized.
Be careful out there, and be prepared.
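The backup advice above can be made concrete: a backup you've never verified isn't really a backup. Below is a minimal sketch (Python standard library only; the paths at the bottom are hypothetical) that checksums a source tree and compares it against a backup copy, flagging files that are missing or whose contents differ.

```python
import hashlib
from pathlib import Path

def checksums(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        p.relative_to(root): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def verify_backup(source: Path, backup: Path) -> list:
    """Return a list of problems: files missing from the backup or changed.

    If ransomware has encrypted the live data, every file shows up as
    'differs' -- which is exactly the signal that the backup predates the
    attack and should NOT be overwritten by the next backup run.
    """
    src, bak = checksums(source), checksums(backup)
    problems = [f"missing from backup: {rel}" for rel in src if rel not in bak]
    problems += [f"differs: {rel}" for rel in src
                 if rel in bak and src[rel] != bak[rel]]
    return problems

# Hypothetical paths -- point these at your real data and backup volumes:
# issues = verify_backup(Path("/data/records"), Path("/mnt/backup/records"))
# if issues: alert an administrator rather than rotating out the good backup
```

This is also why the post warns against blindly rotating backups: an automated job that deletes the oldest copy and re-copies the (now encrypted) live data would destroy the only clean version.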
[ Monday, June 26, 2017 ]
Jeff [1:22 PM]
Remember the 2015 Anthem breach
? The one with
up to 80 million individuals'
information compromised? The one
where we think the Chinese were involved, and they got the IT folks to give up their credentials and got sysadmin privileges, so encryption wouldn't have even mattered? Yeah, that one.
Well, Anthem has agreed to settle the lawsuit for $115 million
. Of course, that's a private lawsuit, rather than regulatory action, so there could be some additional payments by Anthem, but this is likely the biggest part.
[ Wednesday, June 14, 2017 ]
Wall of Shame:
Jeff [2:24 PM]
Apparently OCR is considering some changes
to the website listing of all large breaches, based on concerns expressed by a congressman (who also happens to be a doctor) that the listing is too punitive to entities that did no wrong but had to report anyway.
St. Luke's-Roosevelt's Faxing Problem:
Jeff [11:36 AM]
An NYC hospital has been fined $387,000
for two misdirected faxes. That's a big fine. Why?
Three reasons: One, all fines are big these days. OCR still feels it needs to make an impression, and if you've done wrong and get caught, you're going to pay in a big way. Two, the PHI that was disclosed, and whom it was disclosed to, were pretty egregious: it was HIV and STD information (and mental health status), and it was faxed to the patients' employer in one case, and to the organization the patient volunteered for in the other. Three, it happened twice. The case that generated the complaint was the second time a fax had been misdirected, and St. Luke's didn't fix the issue the first time around.
A risk analysis is the thing everyone must do. If you never have a problem, good; just keep re-analyzing on a regular basis, and maybe you'll continue to be so fortunate. But if you do have a problem, treat it seriously and fix it. Give it the attention it needs. Deal with it. Not even OCR expects you to be perfect, and they know mistakes will happen even to the most prepared entity. But you don't get more than one bite at the apple.
[ Monday, June 12, 2017 ]
Hospital Cybersecurity in Critical Condition:
Jeff [7:32 AM]
So says a report
by HHS' Health Care Industry Cybersecurity Task Force. Not particularly surprising.
[ Tuesday, May 30, 2017 ]
Molina, AZ Health Dept Breaches: Molina Healthcare,
Jeff [11:57 AM]
a big player on the insurance exchanges established by the ACA, has reacted to word from Brian Krebs
, cybersecurity expert, that their patient portal has some problems.
Additionally, the Arizona Department of Health Services
has reported a possible breach due to some lost mail.
[ Monday, May 15, 2017 ]
Memorial Hermann: Memorial Hermann
Jeff [12:04 PM]
in Houston had a patient who used a fake ID to get services; the staff called the cops, who arrested the patient. Apparently, the patient was an illegal immigrant (undocumented alien, if you wish, but being an undocumented alien is against the law, hence the word "illegal"). If I recall correctly, Memorial Hermann got hammered in the press for "reporting" this illegal alien who was only trying to get healthcare (actually, steal healthcare by using someone else's ID, but let's not quibble). Memorial Hermann responded to the bad press by issuing its own press release, which (again, if I'm remembering correctly) actually was pretty apologetic about calling the cops on someone who was actually committing a crime.
However, Memorial Hermann put the patient's name in the press release. In fact, they put it in the title of the press release. Sure, they were responding to news reports that had already identified the patient, so disclosing the patient's name didn't increase the stakes any. But, that's still a HIPAA no-no. And they have been fined, big-time: $2,400,000. As the HHS release notes, providing the name to the police was A-OK.
Lesson here: don't name patients if you don't have to. Be extremely careful in responding to bad news or bad reviews -- you can make general pronouncements, but you can't identify individuals.
[ Monday, May 01, 2017 ]
Connecticut Case on Patient-Physician Confidentiality: Interesting case
Jeff [12:14 PM]
, but probably not specifically HIPAA-relevant. HIPAA allows disclosure of PHI under non-judicial subpoenas, as long as "reasonable assurances" are received. It's unclear whether they were in this case, but it's also unclear if there's any HIPAA component to the case at all at this point, given that this is the second trip to the Supreme Court for these litigants.
I do, however, take exception to the comment that "HIPAA is irrelevant." HIPAA may be many things, but it never is irrelevant.
[ Wednesday, April 26, 2017 ]
Maine Psychiatric Center:
Jeff [10:00 AM]
Sorry, I've been busy recently and haven't had the chance to blog about this; still don't, really, but need to get something out there. Thanks to @DissentDoe
for taking the lead on this (if you're on Twitter and you read me but don't read her, you're missing out).
When it comes to HIPAA data breaches and the "what's the worst thing that can happen" standard, this is probably it:
hackers attacked and sold on the dark web the personal information of 4,000 patients at Behavioral Health Center in Maine.
If you deal with PHI, you're legally and morally obligated to protect that data, no matter how trivial. Particularly sensitive data doesn't get stricter treatment under the law, but it should under any moral decision-making process.
Please do a risk analysis. That's the lesson from the last few weeks of breaches and settlements. Do it.
[ Tuesday, April 25, 2017 ]
"First Ever HIPAA Settlement with a Wireless Health Service!"
Jeff [2:24 PM]
Feh. This is just
an unencrypted laptop theft by someone without a good Risk Analysis story to tell.
CardioNet provides remote monitoring of patients with severe arrhythmia. An employee had her laptop stolen from her car. It had PHI of about 1400 patients on it, and was not encrypted. Fail.
CardioNet had done some form of risk analysis, and had some risk management policies and procedures drafted up, but never finalized them. Also, they couldn't produce final policies and procedures for any safeguards. Fail again.
Net result: $2.5 million. That's real money, folks.
That being said, "wireless" is a red herring. They could've been a brick and mortar business and still lost an unencrypted laptop. Being a wireless company is just coincidence.
[ Friday, April 21, 2017 ]
It's Hard to Violate HIPAA When You're Not Covered By It:
Jeff [1:32 PM]
A New York trial court has ruled
that the New York Organ Donor Network can't refuse to hand over records to a whistleblower because of HIPAA. A disgruntled ex-employee, who claims he was fired for whistleblowing, is seeking records from the Donor Network, which sought to avoid discovery of the records due to HIPAA. The trial judge denied their motion for failing to identify a federal or state regulation that would prohibit disclosure. The Donor Network is not a HIPAA covered entity, nor is it a business associate; therefore, structurally, it is not subject to HIPAA, and can't use HIPAA to refuse to disclose data that is discoverable in litigation. Nor did the court accept the Donor Network's argument that even though it's not a HIPAA-covered entity, the information is sensitive and should not be revealed.
A Small Fine:
Jeff [1:25 PM]
OCR announced one of its smallest HIPAA fines yesterday. Center for Children's Digestive Health, in suburban Chicago, agreed to pay a $31,000 fine for failing to have a BAA in place with its document management and destruction company, FileFax. The press release indicated that the investigation started with an "investigation of a business associate," which is presumably FileFax.
Given the timing (the CCDH investigation started August 2015), it's likely that the entire matter started in February 2015, when someone went dumpster-diving
to collect paper to sell to a recycler. The paper included a lot of medical records from Suburban Lung Associates, another Chicagoland healthcare provider. The recycler let the Illinois AG know, who started an investigation of Suburban Lung, which led to the provider's document management vendor, FileFax. Presumably, OCR was notified and commenced an investigation of FileFax, which led them to discover CCDH as another FileFax customer with no BAA, despite the fact that CCDH had used FileFax since the beginning of the HIPAA era.
I suspect that no PHI from CCDH was known to be improperly disclosed by FileFax, so there's a "no harm" element here that kept the fine down. I also suspect that CCDH has good HIPAA policies and procedures, cooperated fully with OCR, and quickly resolved any outstanding HIPAA violations. This could also be an indication that OCR is interested in some "commodity" style enforcement actions: instead of rare but huge fines for egregious breaches, OCR may be looking to increase the number of settlements while reducing the dollar amounts, to encourage resolution of existing cases and increase compliance by making the possibility of a fine more likely, even though the dollar amount would be lower. $30,000 still stings for a small business.
[ Thursday, April 13, 2017 ]
Metro Community (Colorado): A federally-qualified health center
Jeff [2:42 PM]
falls victim to a phishing attack. The attack is not their fault, and they respond appropriately. All good, right?
Wrong. Even though they did nothing wrong here, they had never done an initial risk analysis. They did a risk analysis after the phishing attack; apparently, even if they had done it before the attack, they still likely wouldn't have been able to prevent the attack. But . . .
HIPAA required them to do a risk analysis. That requirement has been in place since 2005. Even though the lack of a risk analysis wasn't the cause of the breach, the breach revealed the lack of a risk analysis.
And that's a $400,000 fine. OCR even mentions that the fine takes into account the financial situation of Metro Community, which primarily provides care to the poor and underserved in Denver; that means the fine would likely have been 7 figures otherwise.
Moral of the story: DO A RISK ANALYSIS. Seriously. It's highly likely that I would not know the name of Metro Community today if they had done a risk analysis a year or two ago.
[ Monday, April 10, 2017 ]
Doctors and Bad Yelp Reviews:
Jeff [4:33 PM]
Well, Yelp isn't the only one. There are quite a few social media sites that allow customers to post reviews of businesses. What happens when a reviewer posts a bad review? What can the business do?
In some cases, the business can sue the reviewer, particularly if the business can prove that the review is false. In fact, that just happened
in respect to a couple of jewelers in Massachusetts, where a jewelry store worker wrote a bogus bad review of a rival jeweler.
But it's a lot more difficult for a business owner to fight a bad review if the business is a HIPAA covered entity. While a patient is free to discuss his PHI whenever, wherever, and however he wants, the doctor can't use or disclose any PHI in response; the fact that the patient put the information out there first doesn't change that.
So what can a provider do? Here's a good article
with a few good tips.
I'd also add that you can respond directly on the rating site, but need to do so in a way that does not disclose PHI. For example, if a patient complained (falsely) that she was not allowed to sit in on her 12-year-old's exam, the practice could respond as follows: "While HIPAA prohibits me from discussing any patient specifically, I can say that it is the policy of this practice that we do not provide medical exams to patients under the age of 16 without the parent being in the room. I have reviewed the medical records from all visits to the practice by patients under 16 during the past six months and have not identified any patients under 16 who were seen without a parent in the exam room." This does not disclose any PHI, but does allow the practice to make a general defense of itself.
[ Friday, April 07, 2017 ]
Has Health IT's Rapid Growth Rendered HIPAA Obsolete?
Jeff [12:52 PM]
Of course not. HIPAA is, at its root, conceptual; no new healthcare delivery systems, and certainly no change in technology, can supplant the basic concepts of HIPAA: health data is only worthwhile if it is used, but it is also private and deserves privacy and security; health data should not be used or disclosed except for proper purposes; even though proper uses and disclosures are permitted, individuals retain all other rights in their own health data; and parties that rightfully have access to or possession of health data have certain responsibilities to establish structural safeguards to prevent improper uses and disclosures.
Specific uses, specific rights, and specific safeguards may change, but those fundamentals remain, and the beauty of HIPAA is that its current structure, with scalability and technological and operational neutrality baked in, need not change to accommodate those changes.
A question from the audience:
Jeff [11:24 AM]
Q: At our group therapy counseling sessions, we have the clients sign in on a sign-in sheet that is passed around once group therapy starts. No one but the clients in group, the therapist, and the billing department sees the sign-in sheet. We are required by the state agency we serve to have a sign-in sheet, and since we bill insurance, we need to be able to provide documentation for insurance purposes (proving the patient attended the group therapy session, in case we get audited). The sign-in sheet asks for the client's initials, DOB, and time in and out of group, and has to be signed by the person so it is authentic and can't be claimed to be forged. A client in group, who is a lawyer, stated this was a breach of HIPAA. Is it?
A: It's group therapy; doesn't person A know the name (or initials) of person B and person C, without seeing it on the sign-in sheet? Don't they know when the person came into the room and left the room? I guess person A now knows the age of person B, and what their signature looks like, but the real PHI here is the fact that persons B and C are getting therapy, and person A already knew that, since it's group therapy!
Sign-in sheets and waiting rooms are always places where PHI can be inadvertently disclosed. A person's presence in a waiting room gives you some implicit information about their health condition, which means that every waiting room in the world is a potential HIPAA violation. So what's the answer? No waiting rooms? Make the waiting room so dark nobody can see who else is in there? Hand out Halloween masks to everyone when they come in so nobody can recognize anyone else? Obviously, that's silly. And it's even sillier when the patients in the waiting room then go into a group healthcare session together, where they get to know even more PHI about each other.
Instead, a covered entity medical provider should do what it can to minimize disclosures in the waiting room, while recognizing that some amount of disclosure is naturally going to occur. Sign-in sheets should not have any information that's not necessary, like addresses, social security numbers, or diagnosis/medical complaint information. When calling patients from the waiting room, staff should use the minimum information (say "Mr. Prescott?" when calling the patient in, not "Dak Prescott, quarterback for the Dallas Cowboys, we're ready to give you your treatment for your embarrassing STD"). But none of that would make much of a difference when a group of folks in the waiting room all come in together to get their healthcare services as a group, where all the same information (and much more) is going to be shared anyway.
Given that, it sounds like you are keeping the sign-in sheets to the minimum information. However, if you want to be overly sensitive, you could have each group therapy member sign a separate sign-in sheet with the same information (initials, DOB, in/out time, signature), so that nobody sees anyone else's PHI. But I don't think that's really necessary, if the information is going to be shared in person anyway.
[ Monday, March 20, 2017 ]
Jeff [3:04 PM]
[ Thursday, February 23, 2017 ]
Jeff [1:46 PM]
HIPAA lawyer Adam Greene was interviewed
at HIMSS, and noted that HHS is close to publishing the regulations implementing the HITECH revisions that allow affected individuals to get a share of the fines levied by OCR. As you should know, there's no private cause of action for a HIPAA violation, so unless a victim of a data breach can prove damages in a regular tort claim lawsuit (which is usually hard to do in a data breach case), there's no financial recovery for them. Only OCR can get money for a HIPAA breach, by fining the breaching entity.
HITECH included a provision, ostensibly to tweak up enforcement actions, that would allow affected individuals to share in the fines levied by OCR.
Will the fact that an individual can get part of a HIPAA fine mean that data breach class actions will be easier to bring? Adam asks, "if [a person] is considered a harmed individual under HIPAA, should we consider them harmed for other purposes, too?" Many lawyers have tried bringing class action lawsuits for data breaches, but generally they fail because it's too hard to prove that the victims are actually damaged: someone might use your data, or they might not; if they do, the credit card company might not hold you liable, so you have no damages; and until you can show actual damages, you don't have "standing" to pursue your own legal action, much less a class action on behalf of all of the victims of the same breach. This inability to prove harm is what keeps these class actions from proceeding.
I don't think Adam's point will come to fruition. Getting to share in the fine doesn't mean you are harmed, necessarily, or at least not in the way of actual monetary damages. Whistleblowers get a piece of the recovery in a Qui Tam case for Medicare fraud, for example, even though they couldn't be plaintiffs directly since they aren't directly harmed by Medicare fraud. I think HIPAA breach victims who get a share of the fine will be more like Qui Tam whistleblowers, and less like "harmed" individuals with standing to bring a class action. But we will see. . . .
. . . . whenever the regulation is actually published. THAT will get a blog post out of me.
[ Wednesday, February 22, 2017 ]
Healthcare Data Breaches up 40%, Affect 25% of Consumers:
Jeff [12:20 PM]
According to the Identity Theft Resource Center, healthcare represents one-third of all data breaches
, and the number of reported breaches has risen from 780 in 2015 to 1093 last year. Hacking, physical theft of data, and employee error have been leading causes, but expect phishing to be the next big winner.
Meanwhile, an Accenture survey shows that healthcare consumers have a one in four chance of having their health information stolen
and becoming a victim of identity theft. Only a third of victims were notified by the healthcare entity that suffered the breach (hospitals lead the list, followed by urgent care centers, pharmacies, physician offices and insurers); half of victims found out themselves by looking at their credit reports, and the remainder were notified by a governmental agency.
[ Friday, February 17, 2017 ]
Another Day, Another Monster Fine: This time it's Memorial Healthcare System (Florida)
Jeff [12:46 PM]
, with a $5.5 million fine for not following access controls and allowing terminated employees to continue accessing medical records after being terminated. They had policies and procedures to terminate access, but dropped the ball with one former employee, who kept accessing records for a year (I suspect the former employee was stealing identities, too). To compound matters, they didn't audit access; if they had, they might've caught the former employee before too many records were accessed.
This is a big fine. These days, they all are. Time to get serious.
[ Tuesday, February 14, 2017 ]
On the News: Some dude
Jeff [2:47 PM]
talking about HIPAA and misdirected faxes.
[ Thursday, February 09, 2017 ]
Interesting case, wrong conclusion:
Jeff [4:33 PM]
University of Pittsburgh Medical Center suffered a data breach in which 62,000 employees' SSNs and tax data were breached, but a Pennsylvania court has determined that, as an employer, it has no duty to its employees to protect data
. The article compares it to the Children's Medical Center of Dallas breach, but that's a different kettle of fish: the Children's breach involved patient data, not employees.
[ Wednesday, February 01, 2017 ]
Children's Medical Center of Dallas fined $3.2 Million:
Jeff [4:31 PM]
Well, this is the first I've heard of this
, which is awfully close to home.
Apparently, a lost unencrypted Blackberry in 2009 and a stolen unencrypted laptop in 2013 exposed a failure to implement and follow risk management plans, including the failure to secure and encrypt mobile devices. Big entities with somewhat obvious problems will result in big fines.