[ Monday, August 12, 2019 ]


Interesting article this morning out of Pennsylvania.  A patient has sued Lehigh Valley Memorial Hospital Network (LVHN, which is not LVMH, the luxury brand aggregator), alleging that a doctor on the staff who was not treating him, but with whom he had a business dispute, improperly accessed his medical records.  He's suing the hospital for failing to prevent the doctor from accessing his records.

This raises a number of issues and possible teaching points.

Access Restriction is Required.  Hospitals do have an obligation to restrict access to PHI to only those persons with a need to access it.  Sometimes this is easy -- an orderly or a maintenance worker shouldn't have access to PHI.  But sometimes it's tricky: a nurse should only have access to the PHI of patients he/she sees and treats, but if the hospital prohibits access to the PHI of any patients other than those assigned to the nurse, and there's an emergency in another department and the nurse must fill in there, the nurse might not be able to access necessary PHI and the patient's health might suffer.  Likewise, doctors on staff should only access the PHI of their patients, but sometimes an emergency consult might be necessary.  A pediatrician would probably never provide care to a geriatric patient, but in many cases the lines aren't easy to draw.

Thus, providers must consider whether they can restrict access up front via hard-wired solutions like permitting access only to a set list of patients (or classes of patients).  Oftentimes they can't, so they then need to set up some other sort of solution.  Usually, this involves a two-part solution: first, the parties seeking access (workforce members like nurses and schedulers, as well as non-employees such as staff physicians at a hospital) must be instructed and trained to access only the PHI of their own patients and never access PHI for which they don't have a permitted need (usually treatment, but possibly payment for accounts receivable or finance employees, and healthcare operations for QA/UR staff).  Second, the hospital or clinic needs some mechanism to make sure people are doing what they are supposed to be doing, and not improperly accessing PHI.  This may involve random checks, regular checks, or the use of artificial intelligence or machine learning algorithms to identify potentially problematic access.  The hospital or clinic should then follow up with those whose access seems excessive, and determine if there is a legitimate need.  If not, it needs to take follow-up action with the access abusers -- more training, restricted access, or some sanction, up to and including termination for abusive snoopers.
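The audit side of that two-part solution can be sketched in a few lines of code.  This is a hypothetical illustration (the log format, the `flag_suspect_access` function, and the assignment data are all invented for the example), not any real EHR's audit API:

```python
# Hypothetical post-access audit check: flag accesses of patients the
# workforce member was never assigned to.  Log format, function name, and
# assignment data are invented for illustration -- not a real EHR API.

access_log = [
    {"user": "nurse_a", "patient": "p1"},
    {"user": "nurse_a", "patient": "p7"},   # no assignment -- follow up needed
    {"user": "dr_b", "patient": "p2"},
]

assignments = {                             # who is assigned to whom
    "nurse_a": {"p1", "p3"},
    "dr_b": {"p2"},
}

def flag_suspect_access(log, assigned):
    """Return (user, patient) pairs with no documented assignment."""
    return [(e["user"], e["patient"]) for e in log
            if e["patient"] not in assigned.get(e["user"], set())]

print(flag_suspect_access(access_log, assignments))  # [('nurse_a', 'p7')]
```

A real system would layer on schedules, emergency-access overrides, and anomaly scoring, but the core check is the same: compare who accessed what against who was supposed to.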

In this case, the hospital may have been doing the right thing; many hospitals need to allow open access to all physician staff members, and if the hospital had proper training up front and post-access audit controls, it's possible that this improper access simply slipped through the cracks.  On the other hand, if the hospital did not train its employees, did not have policies in place regarding access by staff physicians, and did not reasonably audit to look for abusers and fix improper access problems, it may have violated the HIPAA Privacy Rule's requirement to restrict access.  If the access was to an electronic medical record, the hospital might also have violated the HIPAA Security Rule.

Improper Access May Be a Breach.  Once the hospital knew that the access was improper, it then knew there was a "breach of unsecured PHI," and then had an obligation to notify the patient.  If it did not do so without unreasonable delay (and in all cases within 60 days of knowing of the breach), it violated the HIPAA Breach Notification Rule.
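The 60-day outer limit is simple arithmetic; here's a trivial sketch (the `notification_deadline` helper is invented for illustration, and remember the actual standard is "without unreasonable delay," with 60 days only as the outer bound):

```python
# Illustrative only: compute the Breach Notification Rule's outer deadline.
# The actual standard is "without unreasonable delay," and in no case later
# than 60 calendar days after discovery; this helper just adds the 60 days.
from datetime import date, timedelta

def notification_deadline(discovery_date: date) -> date:
    return discovery_date + timedelta(days=60)

print(notification_deadline(date(2019, 8, 12)))  # 2019-10-11
```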

The doctor accused of improper access might also be liable here.  He apparently claims that he had a patient-provider relationship with the patient, in which case his access to the PHI might have been proper.  Even if he had a patient-provider relationship, that does not give him carte blanche to access the patient's PHI -- the access must still be for a permitted purpose such as treatment or payment (and if it's for payment, it must be limited to the reasonably necessary amount).

Don't Disclose PHI to the Press, Even if it is Already Disclosed.  I'd also note that both the hospital and the physician have (appropriately) not commented to the press on the matter; the statements quoted in the article (acknowledging that the patient was a patient is, in itself, a disclosure of PHI) were taken from court filings.  Generally, disclosing PHI in a court record, where the disclosure is relevant to the litigation, is a permitted disclosure; it appears that the reporter pieced the case together from the court records.  The fact that the PHI is already out in the public record is irrelevant -- just ask Memorial Hermann in Houston.

Even De-identified Information Can Sometimes Be Used to Identify Someone.  It's not central to this particular story, but another interesting point here is that this case shows how de-identifying information is sometimes ineffective, if there are other sources of information that might be leveraged to cross-check and add in identifiers.  The Health Department didn't say who the patient was, but included the date of discharge, which the reporter was able to connect to the court filings.  It's not absolutely certain that the specific patient mentioned in the Health Department report is the plaintiff patient in the lawsuit, but it's pretty likely.

In Litigation, a QPO is Always an Option.  Often, when PHI is used in litigation, the individual who is the subject of the PHI will seek to prevent his/her PHI from being in a public record, in order to keep his personal medical issues private.  This can be done with a Qualified Protective Order or QPO, as specifically mentioned in the HIPAA regulations relating to information disclosed subject to a subpoena.  Here, the information in the legal proceeding actually ended up being used by the press to the detriment of the hospital and physician.  I'm guessing that LVHN, and possibly Dr. Chung, are wishing they had used a QPO to protect some of that PHI.

Jeff [1:06 PM]

[ Friday, July 19, 2019 ]


As you should know, while HIPAA has pretty strict rules for most covered entities, those that provide services in the substance abuse arena are often subject to even stricter rules. Called the "Part 2 Rules" since they come from 42 CFR Part 2, they basically prohibit the disclosure of patient information by federally-supported substance abuse centers unless the patient gives specific consent for the particular disclosure.

As with any privacy rules, the stricter the rules, the lower the utility of the data.  And in the substance abuse arena, patient privacy serves a great good (patients won't be afraid to seek care due to the fear their addiction will be disclosed), but that same privacy can prevent programs from providing the help that patients need.  This can be particularly troubling in the face of the opioid epidemic.

HHS is proposing to revise those rules somewhat to allow better sharing of data between providers.  It will be interesting to see how it plays out; parties from both sides will be likely to weigh in.

Jeff [3:39 PM]

[ Wednesday, July 03, 2019 ]


Here are 6 things small providers can do to improve their cybersecurity compliance.

Jeff [6:20 PM]

[ Monday, July 01, 2019 ]


Those "obsessed" with privacy (hey, obsession is in the eye of the beholder; one person's reasonable caution is another's obsession) know that if a digital service is "free," the service isn't the product; you are the product to the service's developer.  Google Maps seems to be a free product; actually, though, you are the product: when you use Google Maps, Google gathers data on you that it uses to sell other products to its customers.  If you don't care about privacy, it's a great deal: you give up privacy you don't want for a free map program.

But if you do care about privacy, what should you do?  Find less intrusive programs.  Some are free and some cost money, but that's what you have to do if you want to protect yourself.

Anyway, if you're looking for alternatives to the non-private Google products, here's a list.  Think about it. . . . 

Jeff [2:42 PM]

[ Thursday, June 27, 2019 ]


OCR has published 2 new FAQs relating to when and how health plans may share PHI for care coordination with other plans serving the same individuals.  The first question actually alludes to one of the tricky elements of uses/disclosures that are for "health care operations" of a different covered entity: not all "operations" elements are acceptable in those situations.  Care coordination is one of the acceptable elements, though, so that's good.  The second question delves into when an entity can use PHI that it received for a different purpose to tell the individual about other products and services, without the communication being "marketing" (in which case the individual must authorize that use/disclosure).  

Jeff [10:14 AM]

[ Monday, June 24, 2019 ]


Two New York (Southern Tier, Allegany area) health care providers were hit by Ransomware last week.  No word on how the attacks occurred, but I'd guess both started with email phishing schemes.

Jeff [11:36 AM]

[ Wednesday, June 05, 2019 ]


Now, it's LabCorp.  Just days after Quest announces a breach of 12 million patients, LabCorp announces a 7 million patient breach of its own.  Well, not really its own: like the Quest breach, LabCorp is announcing a breach at its billing vendor -- the same billing vendor that Quest uses.

Jeff [2:14 PM]

[ Monday, June 03, 2019 ]


Quest Diagnostics announces a big breach: it looks like a billing vendor, AMCA, suffered the breach, which appears to be a phishing-based email access hack.  It does not look like lab test results were accessed, but billing and financial information was (which is still PHI, and would also include some indicia of what medical issues the data subject might have, due to an indication of what tests were ordered and conducted).

Jeff [3:03 PM]

[ Thursday, May 30, 2019 ]


MIE breach brings state fines as well: Yesterday my favorite HIPAA/Privacy reporter tipped me off to the fact that MIE also got fined by state regulators.  MIE is an Indiana-based medical records company, and its clients are spread across the Midwest and elsewhere.  In addition to the $100,000 fine to OCR, MIE also paid $900,000 to a total of 16 states (Arizona; Arkansas; Connecticut; Florida; Indiana; Iowa; Kansas; Kentucky; Louisiana; Michigan; Minnesota; Nebraska; North Carolina; Tennessee; West Virginia; and Wisconsin) to settle HIPAA and state law breaches.

This is a good reminder: you can't only look at HIPAA to determine your obligations to protect data and report breaches; you also must look at state laws. Specifically, all states have data breach reporting laws, and most have either personal data protection/security laws or general "deceptive trade practices" laws that contain a privacy component.  Thus, your data security activities must be HIPAA compliant and state-law compliant, and if you suffer a breach, you must look at both the applicable state laws as well as HIPAA to determine your reporting obligations (some breaches require reporting under HIPAA only, some under state law only, and some under both).

Additionally, since the HITECH Act, OCR isn't the only show in town as far as HIPAA enforcement specifically.  Even if OCR does not fine an entity, a state can do so specifically for a HIPAA violation, not merely for a state law violation.

In MIE's state law case, MIE paid OCR for violating HIPAA but also paid the 16 states for violations of HIPAA and state laws (i.e., not just state laws).  But it was an agreed order, so it's hard to tell what would've happened if MIE had objected, arguing that since OCR had already fined it, it should not have state law liability under HIPAA.  I assume the states would've dropped the HIPAA part and relied on state law exclusively.

The final lesson: there are multiple regulators.  Don't forget that.

Jeff [11:29 AM]


Recent OCR activity: Touchstone and Medical Informatics Engineering: If you've been watching the news, you'd have seen a couple of recent HIPAA enforcement actions, with some striking differences.  

First, as I mentioned below, Touchstone Imaging got tagged for $3,000,000 for a server issue that left FTP files exposed to anyone searching the internet.  Then, shortly thereafter, business associate MIE got tagged with a $100,000 fine because a hacker got access to their patient files. Why the big difference?  I'll discuss in a later post. . . . 

Jeff [10:58 AM]

[ Friday, May 17, 2019 ]


The plaintiff's complaint in a lawsuit is only one side of the story, but given that the hospital fired the tech, it's plausible.  Covered entities' greatest risk for a HIPAA violation these days comes from rogue employees.  Whether it's employees stealing credit card information to pay their rent or selling the data to personal injury lawyers, or just not securing data (or losing phones and laptops), a bad employee can cause serious HIPAA damage.

Jeff [11:28 AM]

[ Tuesday, May 14, 2019 ]


Nor should it be. That's not how this works.  

Jeff [5:43 AM]

[ Monday, May 13, 2019 ]


We've heard it over and over: the healthcare industry is the biggest target for data breaches, given the overall value of the data, plus the large number of targets with, shall we say, less than stellar defenses.  Here's proof that those indications are right: healthcare leads in total data breaches and total data breached.

Jeff [11:52 AM]


Anthem Update: As many security folks noted, the big Anthem breach seemed to have a "state actor" flavor to it, and most thought the fingers pointed to China.  Well, 2 Chinese nationals have been charged with involvement, which seems like the likely next step . . . 

Jeff [11:49 AM]

[ Monday, May 06, 2019 ]


Unprotected FTP servers can cause problems, since whoever finds them on the internet can access the data on them.  They aren't easy to find, but they can be found.  Of course, when your initial response is that there was no PHI disclosed, when in fact 300,000 people had their PHI exposed, you should expect a fine.

Jeff [8:49 PM]

[ Monday, April 29, 2019 ]


I haven't seen the actual proposed regulatory text yet, but Modern Healthcare is reporting that OCR will lower the maximum fine level for organizations that violate HIPAA, depending on the organization's level of culpability.  Obviously, OCR could have exercised prosecutorial discretion in levying fines, but it can't hurt to encourage organizations to lower their culpability level.

Jeff [8:37 AM]

[ Wednesday, April 24, 2019 ]


Brookside ENT and Hearing Center in Battle Creek, Michigan got hit by a ransomware attack.  They didn't pay the hackers, their medical records were lost, and they have gone out of business.  The two partners have gone into early retirement.

So, Ransomware can kill you.

Jeff [5:47 PM]

[ Monday, April 08, 2019 ]


in healthcare hacks, at least.

Jeff [1:20 PM]

[ Friday, March 22, 2019 ]


This is a big one.  It is also not an OCR settlement, but rather a settlement of a class action lawsuit by affected individuals.  Class action cases are hard to bring, but they got a big settlement here.

Jeff [1:59 PM]

[ Thursday, March 21, 2019 ]


Bad server migration exposed the files.  One of those ftp server issues @JShafer817 is always talking about?

Jeff [7:06 AM]

[ Monday, March 18, 2019 ]


According to reports from OCR.  Email hijacking and ransomware are the leading trouble-makers.  

Jeff [12:46 PM]


Cyber Risk Assessments or Security Risk Assessments ("SRAs") are pretty common in the privacy universe.  In fact, doing some form of an SRA (and regularly repeating/updating) is a required activity for any HIPAA covered entity or business associate.  How do you know what types of safeguards are reasonable and appropriate for your business if you don't understand what your risks are?  However, before you go off and do one, here are 5 questions you should ask.  (One note: I'd add HITRUST to the "frameworks" listed in question 2.)

Jeff [12:43 PM]

[ Monday, March 04, 2019 ]


More information here.

Jeff [7:47 AM]

[ Saturday, March 02, 2019 ]


About the Wolverine breach: my take is in the comments here.

Jeff [1:06 PM]

[ Wednesday, February 27, 2019 ]


Information here.  Looks like your garden-variety email-access phishing attack.

Jeff [10:36 AM]

[ Wednesday, February 20, 2019 ]


Looks like an email-access phishing attack.  A good reminder not to keep PHI in emails, either in the emails themselves or in attachments.  Or encrypt everything at rest.  
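For illustration, here is a toy sketch of what "encrypt at rest" means: the stored bytes are unreadable without the key.  This uses a stdlib one-time-pad XOR purely to show the idea; a real deployment should use a vetted cipher and key management (e.g., AES-GCM via a maintained crypto library), not this:

```python
# Toy "encryption at rest" demo using a one-time-pad XOR.  This only
# illustrates the concept (stored bytes are gibberish without the key);
# real systems should use a vetted cipher like AES-GCM, not this sketch.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

attachment = b"Patient: John Doe -- lipid panel results"   # invented PHI sample
key = secrets.token_bytes(len(attachment))  # keep the key separate from the data
stored = xor_bytes(attachment, key)         # what sits "at rest" on the server

assert xor_bytes(stored, key) == attachment # XOR is its own inverse
print("roundtrip ok")
```

The point for the email scenario: if an attacker phishes their way into the mailbox, encrypted-at-rest attachments are just noise to them without the key.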

Jeff [10:14 AM]

[ Monday, February 04, 2019 ]


Interesting article.

Jeff [10:51 AM]

[ Wednesday, January 30, 2019 ]


Discover noted something funny that indicated that some of its cardholders' information was out on the web, indicating that there had been a breach somewhere.  Discover's notice doesn't contain much information (more on that in a bit), but does indicate that it wasn't their fault.  However, they did replace cards for affected individuals and agreed that they wouldn't be responsible for fraudulent charges (both of which would be true regardless of whether the breach was Discover's or someone else's).

Two things to note.  First, many state data breach notification laws, but most importantly and particularly HIPAA, require covered entities to report breaches; the requirement isn't to report your own breach, but to report any breach you discover.  That's the duty of data holders -- if you know someone's data is breached, let them know.  Data breach reporting is not an admission of fault, and most data breaches don't result in fines or lawsuits.  The point of breach notification is not (or at least shouldn't be) to tattle on yourself, it's to help out the public whose data is leaked and who might not know about it or how to protect themselves.

Secondly, it's not surprising that Discover's notice didn't say too much, like what they found or how they found it.  Why is that?  Because you don't want to give up your data security secrets.  If the black hats learn how you found out something, they might learn how to hide it better.  Especially if you discovered it via some clever means.

Regardless, it's an interesting notice to get in the millions of data breach notifications.

Update: Jon Drummond is no relation (as far as I know), in case you thought so.

Jeff [12:38 PM]

[ Wednesday, January 23, 2019 ]


Oregon wants to pass a law to prohibit the sale of de-identified data without the data subject's consent.  That is dumb -- de-identified data does not have a data subject.  And if it's truly de-identified, there is no downside to its being shared, at least no downside to the data subject (because, again, there is no data subject if it's de-identified).

I understand the "property rights" concept, but it really doesn't work with data.  Data isn't a thing like that; data is a fact, and you can't own a fact.  The exact same data can be possessed by multiple people at the same time, without diminution of the value to any other holder.  Plus the data may only connect to a particular subject in a particular situation.

For example, let's say my birthday is January 1, 1960.  1/1/60 is in my medical record at my doctor's office, which means that data ("1/1/60") is PHI.  Let's also say I went to my doctor today, January 23, 2019 (1/23/19), for my annual physical.  That data ("1/23/19") is also PHI.  Do I own 1/1/60 or 1/23/19?  If those data are my property, can I keep other people from using them?  How about other people who were born on the first day of 1960?  Do they own the data and I don't?  Tenants in common?

Now, I do have some interest in the connection between those two dates, me, and my doctor's office, but do I own all that data as long as it's connected?

More importantly, what if you de-identified it by HIPAA standards?  All you'd know is that some 59-year-old person went to that doctor's office in 2019.  Under the Oregon proposal, I would still own that data, even though you don't know it's me.  There will be other people aged 59 who come to that doctor's office in 2019, and that data will belong to them; how can you tell which data is theirs and which is mine once it's de-identified?
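To make the 59-year-old example concrete, here is a sketch of Safe Harbor-style date handling: dates more specific than the year are dropped, birth dates become ages, and ages over 89 are lumped into a single category.  The `deidentify_record` function and the record layout are invented for illustration:

```python
# Hypothetical sketch: reduce a record's dates per the HIPAA Safe Harbor
# method -- keep only the year of service, convert birth date to age,
# and aggregate ages over 89 into a single "90+" bucket.
from datetime import date

def deidentify_record(record, as_of=date(2019, 1, 23)):
    out = dict(record)
    dob = out.pop("birth_date")
    age = as_of.year - dob.year - ((as_of.month, as_of.day) < (dob.month, dob.day))
    out["age"] = "90+" if age > 89 else age         # Safe Harbor: 90+ aggregated
    out["visit_year"] = out.pop("visit_date").year  # nothing finer than the year
    return out

# My 1/1/60 birthday and 1/23/19 visit become: a 59-year-old seen in 2019.
print(deidentify_record({"birth_date": date(1960, 1, 1),
                         "visit_date": date(2019, 1, 23)}))
# {'age': 59, 'visit_year': 2019}
```

Once every 59-year-old seen in 2019 produces the same output, there is no way to say which de-identified row is "mine," which is the point of the paragraph above.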

Even if it's not de-identified, the doctor's office should have some right to the data in its own records.  It should not have unfettered rights to do with it whatever it wants (and it doesn't, because of HIPAA and other privacy laws), but it surely has the right to use the data to run its business. 

I shouldn't complain -- like the Illinois Biometric Privacy Law, this is good for lawyers.  But it's unnecessary and dumb.

Jeff [4:23 PM]

[ Friday, January 11, 2019 ]


A Michigan HIV/AIDS and substance abuse provider has suffered a data breach after a phishing attack.  I suspect this is more of an ID theft issue, but bad news anyway.  Interestingly, (i) no word on how many were affected, and (ii) the breach occurred in April 2018 but notification only went out recently; that could be because the breach was only discovered in the last month or two, but one wonders if the 60-day time limit in HIPAA was met.

Jeff [8:34 AM]

[ Tuesday, January 08, 2019 ]


Mintz has a good wrap-up of some of the bigger HIPAA goings-on from 2018 here.  

Jeff [8:24 AM]

[ Thursday, January 03, 2019 ]


As a bit of an analog to yesterday's post about the impact of a breach on stock price, recently breached companies tend to improve their performance against the market, which might indicate that the breach serves as a "wake-up call" for the company's leadership.  Going hand in hand with that thought, Health IT Security notes that recently breached hospitals tend to increase their advertising spend by 64% after a breach.  

Jeff [1:16 PM]

[ Wednesday, January 02, 2019 ]


It's not as big or as consistent as you might think, but it's not negligible either.  Paul Bischoff and Matthew Dolan have done some research and posted the results here.

Interestingly, companies that suffer breaches tend to be underperforming companies anyway.  However, their performance improves after the breach, at least compared to market averages.  Low point tends to be about 2 weeks post-breach, but for the following 6 months, the companies tend to outperform the market.

Maybe suffering a breach serves as a wake-up call?

It's a relatively small data set, and doesn't relate much to small and non-public businesses, but it's interesting to ponder.

Jeff [3:51 PM]


First, from Kirk Nahra.

Then, from Rebecca Herold.

And from HHS itself.

Jeff [12:56 PM]

[ Friday, December 21, 2018 ]


As Baylor Scott & White-Frisco (a joint venture between BSWH and USPI) is finding out, a credit card breach is also a HIPAA breach if it's connected to a HIPAA covered entity.  The incident is similar to one that happened at Banner Health in Arizona a few years ago (reported here and here): a credit card processor vendor suffered a breach, but it involved BSW-Frisco's patients' data. 

Hat tip: Taylor Weems, CIO at Midland Health.

Jeff [1:13 PM]

[ Thursday, December 13, 2018 ]


OCR has asked for public comment on how HIPAA should be changed.  Personally, I'm a "Chesterton's Fence" kinda guy, and I actually think it works pretty darned well as is.  But I'll be interested in seeing the public commentary.

Jeff [3:55 PM]


When a hospital fails to cut off PHI access to a former employee, it can be a HIPAA violation.  In this case, a relatively inexpensive one (relative being the key word, it's still a lot of money). 

Jeff [3:40 PM]

[ Friday, December 07, 2018 ]


This continues to be the experience of many clients of mine, directly or indirectly in the healthcare field.   Of course, my advice from over 2 years ago is still applicable: patch, isolate, backup, and train (although today I think I'd change the batting order to backup, patch, train and isolate).

Jeff [12:43 PM]

[ Thursday, December 06, 2018 ]


This may or may not be a HIPAA breach, but NY's data breach notification law is likely implicated.  It's unclear whether the agency would be a HIPAA covered entity; it's described as a health provider, but if it doesn't conduct HIPAA-regulated transactions in electronic format, technically it might not be a HIPAA "covered entity." 

Jeff [10:44 AM]

[ Wednesday, December 05, 2018 ]


Here's a case similar to the Raleigh Orthopaedic case: Advanced Care Hospitalists hired a guy who they thought worked for Doctor's First Choice Billing to help them with their billing and coding.  Apparently, the guy was a fraud.  But that's not important: what's important is that ACH didn't get a BAA with First Choice, and PHI ended up exposed on the First Choice website.  ACH notified OCR that at least 400 and as many as 9,000 patients potentially had their data exposed.

The breach notification led to an OCR investigation, which revealed the lack of a BAA (and, in fact, the lack of any policy to get BAAs).  Upon further review, OCR also found that ACH had never done a risk assessment either.

Net result: a $500,000 fine.  And a big black eye. 

If ACH had policies and procedures, a decent HIPAA compliance program, and had entered into a BAA with the guy in the first place, but still got snookered because the guy was a fake, they would've still had a reportable breach, but I'm pretty certain they'd be half a million bucks richer (not to mention what they probably spent on lawyers dealing with this, plus the PR hit).  

Jeff [12:59 PM]

[ Friday, November 30, 2018 ]


This is, in my (personal, non-legal) opinion, an important piece of news relative to one of the biggest issues affecting HIPAA covered entities.

The FBI has gotten specific about one of the current strains of ransomware that is plaguing the healthcare industry.  Of specific importance in the HIPAA arena is the fact that this variant apparently simply encrypts the data it finds, and does not extract, view, or send out the data.  That's very important to a ransomware victim, since despite what OCR's guidance has been to date, if there's no viewing or outside transmission of the data, there is not a "breach" as defined in the Breach Notification Rule (45 CFR Part 164, Subpart D).

To be a "breach," there must be acquisition, access, use, or disclosure.  In this type of ransomware, the bad actor inserts virus software onto the victim's computer system, but the bad actor does not access the data.  Any access only happens within the victim's computer system, by the software that is now part of that computer system.  If the virus then sends out some of that data that includes PHI to a third party, THEN you'd have acquisition by the third party, access by the third party, and disclosure to the third party, all of which WOULD be a breach.  Likewise, if the virus opens up a door that allows outside third parties to enter the system, and third parties do enter the system, you'd have access and disclosure, which would likely lead to acquisition and use.  However, if the virus does not exfiltrate data or allow outside access, then you do not have acquisition, access, use, or disclosure.

This is an important distinction.

This is also not legal advice.

Jeff [12:20 PM]

[ Tuesday, November 27, 2018 ]


A patient had a complaint about Allergy Associates of Hartford (CT); he took his complaint to the local TV news station.  The reporter called the practice to ask for a response, and the doctor in question spoke with the reporter (despite the fact that his privacy officer told him to say "no comment" or not respond at all).  That conversation with the reporter disclosed patient PHI in a manner not permitted by HIPAA.  And now, OCR has fined the practice $125,000. 

It's not fair: the patient told the reporter all of his information already, it's in the public domain, he put it in the public record, he publicized it, he started it.  Yes, all that's true.

But it doesn't matter.  The covered entity has the obligation not to use or disclose PHI unless the use or disclosure is permitted by HIPAA.  The fact that the information is already public knowledge doesn't matter, even if the patient himself put it out there.

That doesn't mean the provider can't respond to the reporter at all.  At the least, the practice should let the reporter know that it can't respond with respect to any specific patient due to the prohibitions of HIPAA (and can't even acknowledge that the patient is a patient), unless the patient specifically authorizes the disclosure.  Additionally, the practice can give general information about the practice that doesn't disclose anything about any individual patient.  For example, if the patient falsely complains that it took 20 office visits in 2 months to fix the issue, the practice can state that it researched its records for the last 5 years and did not locate any patient with 20 visits scheduled in a 2-month period (since that doesn't provide any information on any particular patient, it's not PHI).  But you can't say "this patient didn't have 20 visits" because that is PHI.

The playing field is tilted against providers when it comes to patient complaints.  But don't make it worse by responding in a way that violates HIPAA.

UPDATES (other law firms picking up the thread):
Holland & Knight: Eddie Williams III
Drinker Biddle: Sumaya Noush

Jeff [12:46 PM]


Mercy Medical Center-North Iowa in Mason City has notified about 2000 patients of a potential data breach.  Looks like an employee behaving badly. . . . 

Jeff [11:16 AM]

[ Tuesday, November 20, 2018 ]


Ohio has decided to issue a standardized form to authorize the release of PHI.  The Texas AG did the same thing a few years ago (as a result of what was then called HB 300).  The Ohio regulation is specifically intended to comply both with HIPAA and with the more restrictive "Part 2" rules applicable to federally-supported substance abuse treatment facilities.  The form can be found here; hat tip to Dinsmore & Shohl for the article.

Jeff [9:08 AM]

[ Monday, November 19, 2018 ]


Which is worse, theft and improper disclosures of PHI, or hackers?  Most HIPAA data breaches are the result of either theft (often by employees) or simple improper disclosures, such as sending data to the wrong location.  While we should all be vigilant against hackers, in terms of the number of breaches, hacks are far fewer.

On the other hand, when a hacker hits, he (or she) usually gets a lot more records than your average thief or other recipient of an improper disclosure.

So, quantity of breaches, or quantity of files? 

Jeff [3:57 PM]

[ Wednesday, November 14, 2018 ]


Jeff [3:32 PM]

[ Monday, October 22, 2018 ]


I'm not sure whether this is a HIPAA issue: is Healthcare.gov, the website that facilitates the federally-run state insurance exchanges, a covered entity or business associate?  It's not a plan or provider, and I don't think it's a clearinghouse because it's not involved in transmitting data in connection with transactions.  As far as I can tell, it assists the plans (which are CEs) that sell insurance on the exchanges, so in theory, if it creates, receives, maintains, or transmits PHI in connection with that service, it's a BA.  But does it enter into BAAs with those insurers, or is it somehow exempt because it's a governmental entity?  HIPAA doesn't include any sort of governmental exemption (Medicare and Medicaid are clearly CEs), but did the ACA or its implementing regulations include any exemption? 


Jeff [11:17 AM]

[ Monday, October 15, 2018 ]


It was the biggest HIPAA breach ever, and one of the biggest breaches of any sort involving personally-identifiable information: hackers got access to the medical records of almost 80 million people.  While it's still unclear what damage was done, OCR has finally weighed in with how much it'll cost Anthem: $16 million.  That's almost 3 times the previous record of $5.5 million.

Update: AP story is here.

Jeff [9:44 PM]

[ Sunday, October 14, 2018 ]


Latest development: Aetna pays the NJ Attorney General $365,000 as a fine for the data breach involving the use of window envelopes to send notices to beneficiaries receiving HIV medications.  As noted earlier, the window envelopes allowed the potential disclosure of PHI to unintended recipients.

Update: Aetna also has settled with the AGs of Connecticut, Washington State, and DC.

Jeff [5:06 PM]

[ Monday, October 01, 2018 ]


The SEC has announced an action against a broker-dealer for a data breach that exposed customer financial data.  Not a HIPAA breach, but it's similar in effect and enforcement activities.  The $1 million fine is considered "small."

Jeff [2:06 PM]
