HIPAA Blog

[ Thursday, February 11, 2016 ]

 

Rochester Update: I previously reported on the NY AG fining University of Rochester Medical Center a pittance ($15,000) for a pretty minor patient privacy violation involving a departing employed provider taking medical records of patients she had served.  Of course, that's a state law action, and OCR could still act.

News out today is that OCR is not going to take any action against University of Rochester.  Seems like a good call and a good lesson: honest mistakes + quick and honest responses = easy absolution.  Cooperation is a good idea. . . .

Jeff [10:39 AM]

[ Friday, February 05, 2016 ]

 

Lincare (home health agency) loses: Rarely do entities fined by OCR challenge their fines, but when they do, it's an Administrative Law Judge (ALJ) who decides the case.  Lincare was fined by OCR after an employee took patient records home and, when she moved out while leaving her husband, left the records behind.  Lincare claimed the husband stole the documents in an attempt to get his wife to come back.

Key element: "Lincare took a blasé approach to HIPAA compliance," and apparently to their response to OCR.

Lincare appealed, but the fine was upheld.  Unsurprisingly.

UPDATE: Here's a cute headline c/o Mintz Levin

Jeff [7:02 PM]

[ Friday, January 29, 2016 ]

 

This is sad.  6-8 people took pictures of this woman in the ER.  You'd think it would be common sense not to do that.

Jeff [4:34 PM]

[ Thursday, January 28, 2016 ]

 

You don't have to be a healthcare company to have a health data breach.  Lots of companies have health data, and if your company has an ERISA employee health benefit plan, the plan is a covered entity, even if the company has nothing to do with healthcare.

The tips at the end are particularly apropos for any company holding data.  Know what you have and why; get rid of data you don't need; encrypt or de-identify data you keep; use smart, layered security; and monitor access.  I'd also add that you should monitor system use and data flows, looking for any unusual activity or communications.  That can indicate that even though your fences may be good, someone's gotten in and might be doing something they shouldn't.  
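None of this requires exotic tooling.  As a minimal sketch (in Python, with hypothetical log fields, names, and thresholds that are mine, not from any particular product), the "monitor system use" advice can be as simple as comparing each user's daily record pulls against that user's own baseline:

```python
from statistics import mean, stdev

# Hypothetical log format: {user_id: records_accessed_today}, plus a
# per-user history of daily counts.  All names here are illustrative.
def flag_unusual_access(today, history, threshold=3.0):
    """Flag users whose access volume today exceeds their historical
    mean by more than `threshold` standard deviations."""
    flagged = []
    for user, count in today.items():
        past = history.get(user, [])
        if len(past) < 2:
            continue  # not enough history to judge
        mu, sigma = mean(past), stdev(past)
        if sigma == 0:
            sigma = 1.0  # flat history; avoid division by zero
        if (count - mu) / sigma > threshold:
            flagged.append(user)
    return flagged

# A clerk who normally pulls 5-7 records suddenly pulls 400.
history = {"alice": [10, 12, 11, 9, 10], "bob": [5, 6, 5, 7, 6]}
print(flag_unusual_access({"alice": 11, "bob": 400}, history))  # ['bob']
```

Real monitoring products are far more sophisticated, but the point stands: even a crude baseline comparison will surface the kind of unusual activity that suggests someone's gotten past your fences.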

Jeff [12:38 PM]

[ Wednesday, January 27, 2016 ]

 

Maybe, just maybe: the Phase 2 Audits may start in early 2016.  Then again, they may not.

Jeff [6:15 PM]

[ Friday, January 22, 2016 ]

 

Cybersecurity: Interesting report recently from the Association of Corporate Counsel.  Obviously this shows a great deal of attention to, scrutiny of, and awareness of the risks of cybersecurity incidents and the need for aggressive, attentive deployment of cybersecurity resources.  But I found the page 1 chart particularly interesting: 24% of all incidents are the result of employee error; in second place is the "inside job."  Add in the 5th-place finisher, the old lost (or stolen) laptop or other device, and roughly 50% of system breaches are purely internal (#3, phishing, could also be blamed on internal actors, at least those who fell for the trick).

What's the best way to address internal risks?  Training.  Food for thought.

Jeff [8:40 AM]

[ Sunday, January 10, 2016 ]

 

Indiana University Health: A lost flash drive from the ER results in 29,000 patient notices.  Good to know there were no social security numbers on there.  But if it's portable, maybe it should be encrypted.  

Jeff [10:35 PM]

[ Friday, January 08, 2016 ]

 

CMS announces new guidelines on patients' right to access their own PHI.  This new year is kicking my butt and I'm getting behind on little updates like this one.  And I haven't even read the proposed final new regs on disclosing mental health information as part of the gun background check system, which came out yesterday (I think).  But I promise that, at some point, I'll look over these FAQs and report back on anything interesting.  Maybe.

Update: not my own analysis, but here's another article.

Jeff [2:42 PM]

[ Thursday, January 07, 2016 ]

 

Massachusetts Court Finds Standing to Sue for Breach Without Showing Actual Damages: Boston Medical Center used a record transcription vendor that posted BMC's patient data on a website for physicians to access; however, access to the website was not password-protected.  Even though there is no evidence that any unauthorized person looked at the data, much less any allegation of actual harm, BMC notified 15,000 patients of the possible breach (likely because they couldn't reasonably determine that there was a "low risk of compromise," since they couldn't prove a negative).

A patient sued BMC, not even alleging that anyone actually viewed the data, just that the fact that it was exposed is sufficient to allow the patient to sue.  BMC moved to dismiss, saying that the plaintiff should have to show damages to be able to have standing to sue.  The judge disagreed, and rejected the motion to dismiss.  It appears that the trial court infers that the notice from BMC is somehow proof that there is a real risk of harm, justifying standing.  That puts a covered entity in a tough position: if they really think there's virtually no risk but want to give notice "just to be on the safe side," they run the risk of opening the door to unharmed plaintiffs (i.e., those without actual damages) to drag the entity into court at considerable expense.

It will be interesting to see if this holds up; it seems to be contrary to the Clapper decision, which requires that some actual harm be alleged.



Jeff [3:50 PM]

[ Tuesday, January 05, 2016 ]

 

California HIPAA enforcement is inconsistent: ProPublica is all over the HIPAA beat these days.  In yesterday's installment, they note that investigations and penalties for HIPAA breaches can vary widely, with some hospitals seeming to attract much harsher penalties while others receive more leniency.

Enforcing HIPAA isn't science, and I would actually expect there to be disparate penalties, because each case is different.  The troubled entity might be the victim of bad luck or outside forces, the entity might be underfunded and thus unable to do as much to prevent problems as its rivals, or one entity might be working hard to fix problems while another seems not to care about correcting deficiencies.  However, it is good to look at these disparities and see if they tell a different story, or if the reasons for the disparity aren't so innocent but are indicative of official favoritism.

Jeff [4:26 PM]

 

Cybersecurity Act of 2015: Congress has passed and the President has signed legislation that attempts to establish a structure whereby the federal government and private sector parties such as large healthcare providers and payors can share information on cyberthreats, and build a common framework for addressing cybersecurity problems.  We'll see how well it works. . . .

Jeff [11:43 AM]

[ Wednesday, December 30, 2015 ]

 

Off topic: Question: What's the nutritional content of craft beer?  This is why people hate the government; it's a perfect example of over-regulation.

Answer: Who cares?

Jeff [1:25 PM]

 

I found this in my "drafts" in blogger, and should've posted this way back in May 2014 (note how this risk analysis thing just keeps coming up):

BIG HIPAA fine: NY Presbyterian Hospital and Columbia University are paying OCR $4.8 million ($3.3M from NY Pres, $1.5M from Columbia) to settle potential HIPAA violations.  Columbia Medical School physicians serve as the medical staff of NY Pres, and they share a computer network and hospital information system.  A Columbia physician attempted to remove a privately-owned server from the network, and it somehow made patient data available to internet searches.  Neither entity had done a risk analysis to identify all systems containing ePHI, and thus didn't have sufficient risk management processes.  Add to that failure to manage access authorizations and failure to comply with their own policies, and you get a big, big fine.

The lynchpin here is the failure to do a good risk analysis.  That's where it all starts.

Jeff [1:04 PM]

 

HIPAA's Repeat Offenders Often Avoid Punitive Action, say ProPublica and NPR (in a co-produced article).  The article admits that the repeat violators (CVS and the VA get some heavy discussion, although the article notes but then ignores the fact that CVS did pay one huge penalty) tend to be large organizations with widespread operations.  That's true, but what's also true is that their workforces tend to be either low-pay/high-turnover or hard to fire, and a lot of the problems they suffer are not from intentional data thievery or "being evil" but from employees acting out of stupidity, curiosity, or greed (all of which are likely in direct violation of the employers' well-publicized policies).

Still, more work needs to be done.  And as has been evident over the last few months with so many big HIPAA settlements being announced, big fines and public announcements do have a ripple effect in the industry and have a tendency to "focus the attention" on fixing issues before they cause damage.

And hidden in the middle of the article is a nice little database tool from ProPublica: HIPAA Helper, which helps you figure out who the repeat offenders are.  You can search the HIPAA "wall of shame" (go to "advanced options") by name of entity, but sometimes the common name of the entity isn't its official name, either of which could attach to the "big breach" filing.

Two points about CVS: I've actually had issues getting CVS to appropriately deal with the consequences of what they acknowledged was a serious breach of my client's PHI, although I'd say the problem was more with their counsel trying to act tough.  I do know that CVS got tagged for $2.25 million for the Indianapolis drug store dumpster-diving case that also netted Walgreens and Rite Aid $1 million fines each.  I've never been able to figure out why CVS had to pay more than twice as much as the other two drug stores, but my suspicion is that "strategic legal decision-making" might explain part of it (IYKWIMAITYD).

Second, I also know that, in connection with the $2.25 million HIPAA fine, CVS also reached a settlement agreement with the FTC over its lax security of personal information.  In connection with the HIPAA settlement, CVS had to bring in an outside agency to review their privacy and security procedures for 3 years; in connection with the FTC settlement, CVS has to report to the FTC every 2 years, for 20 years, on its privacy and security activities. 20 years is a long time. . . .

Jeff [11:53 AM]

[ Tuesday, December 29, 2015 ]

 

Recent Breaches Highlight Risk of Failing to Conduct Risk Analysis: The American Health Lawyers Association email alert today discusses three recent HIPAA enforcement actions (all of which I've briefly blogged about, below): Lahey Hospital and Medical Center (the hospital affiliated with Tufts Medical School), Triple S Management (a Puerto Rico insurance provider), and University of Washington Medicine.  Fines for all 3 totaled $5.1 million.

Lahey involved a stolen laptop; in a twist, it was not stolen from an employee's car, but was actually connected to a piece of medical equipment in the hospital.  Lahey didn't do enough to secure the hardware, partly because it didn't do a good job of tracking the hardware it had.  Triple S had some problems with too much PHI being sent out in mailings, but the real trouble came to light in the subsequent investigation, when OCR discovered a failure to conduct a risk analysis and institute appropriate safeguards.  UW suffered a breach when an employee of a care division downloaded a computer virus; UW had conducted risk analyses (at least in connection with its "meaningful use" attestation), but didn't make sure all operations were covered and apparently didn't make sure all appropriate divisions and operating units were instituting appropriate safeguards.

As the AHLA email alert correctly notes, the unifying factor in these cases is a failure to conduct and/or implement a good risk assessment.  Triple S did no risk assessment; Lahey didn't pick up all of its hardware and ePHI uses; and UW did not ensure that its risk assessment and safeguards reached all of its operating units.  So:

  1. Do a solid risk assessment;
  2. Make sure you cover all of the places you use and transmit PHI; and
  3. Make sure you cover all of your business units, facilities, and operating divisions.

This should not be news to you.

Jeff [3:16 PM]

[ Monday, December 28, 2015 ]

 

To Tweet, or not to Tweet: or blog, or Facebook, or Instagram, etc.  Social media can be great; keep individually-identifiable information out of it (and remember, if someone knows enough data points - who the speaker is, where they work, dates or time frames - seemingly de-identified data is actually identifiable).  General information is OK, but specific patient communication can easily fall on the wrong side of the line.  Even emailing or texting patients is problematic, unless you're using some encrypted format, and even then you have the "authentication" issue of someone picking up someone else's phone.

Jeff [12:56 PM]

 

100 Million Health Records Hacked.  While the greatest number of breach incidents are still carelessness and stupidity (lost or stolen laptops, phones, flash drives, etc., and employee greed or curiosity), the rise of the medical data hack is what's pushed the number of affected individuals so high.

Hackers gonna hack, and you don't need to be a particularly big player to become a target, so you better have (i) protections in place to keep hackers out in the first place (perimeter security) and (ii) a means to determine if they are in already (usage and activity monitoring).  Nobody expects you to be perfect, and if you can prove that you took reasonable precautions (and are definitely able to "show your work"), you're much more likely to avoid a fine.  

Jeff [12:43 PM]

[ Wednesday, December 23, 2015 ]

 

Can a Business Associate be Liable for a HIPAA Breach When Its Client Isn't a Covered Entity?

That may be the hidden question in an otherwise unsatisfying medical record breach problem that appears immune from official action by OCR because the medical provider who originally generated the PHI is not an actual HIPAA covered entity.

Here's the case.  Basically, a New Jersey psychology office has filed a lot of collection actions against patients for past-due bills.  The legal filings, which are public records and can be obtained by anyone who asks the court and pays copying costs, include patient bills and other documentation.  The bills include the patient name (of course, which presumably is in the style of the case as well), but also include CPT codes (which define the type of services provided) and diagnosis codes.  These codes are just numbers, but it's easy to look them up on the internet and see what they stand for.  In other words, when the practice sued the patients, it filed with the court, in public records, the psychological evaluation of the patient.  Frightening, no?

The psychology practice needs to file documentation to prove the debt, so the bills generally are appropriate filings.  But the diagnosis information is not needed to prove the debt; therefore, including it is probably beyond the "minimum necessary" restriction of HIPAA's Privacy Rule, which says that even though a use or disclosure is allowed, it must be limited to the minimum necessary (unless it's a use or disclosure for treatment, in which case there's no minimum necessary restriction).

Sounds like a HIPAA violation, right?  Not so fast.

HIPAA only applies to "covered entities" (the whole enchilada) and "business associates" (most all of the Security Rule and the parts of the Privacy Rule that derive from the HITECH Act).  "Covered entities" include healthcare clearinghouses, health plans, and healthcare providers who conduct electronic transactions for which HIPAA establishes standards.  Almost every healthcare provider in the country is a HIPAA covered entity, but not all -- if a healthcare provider never conducts an electronic transaction, or only conducts electronic transactions that are not HIPAA transactions (most payment, enrollment, and eligibility transactions), it isn't covered by HIPAA, so it can't breach HIPAA.

Most HIPAA experts believe that if an entity conducts a single HIPAA transaction electronically, it's a covered entity and subject to HIPAA, not only with regard to the patient for which it did the one electronic transaction, but for all patients.  In other words, once a CE, always a CE.  And if you are a covered entity, HIPAA says you shall not use or disclose PHI unless it is an allowed use or disclosure; any PHI, not just the PHI of your patients.  If you are a doctor and hear about a celebrity's health problem, and you then discuss the celebrity's health issue with your friends, you are technically violating HIPAA.  The celebrity isn't your patient?  The health data is public knowledge?  That doesn't matter.  HIPAA says thou shall not.

Apparently, the Short Hills psychology practice is not a HIPAA covered entity, as determined by OCR when a patient complained about the legal filings.  End of story, right?

Not necessarily.  First, the practice may have other privacy obligations, under state law or other regulations like Gramm-Leach-Bliley.  And even though the psychology practice isn't a covered entity, there may be other parties involved in the litigation on the practice's side that could be covered by HIPAA, not as covered entities but as business associates.  I'm thinking specifically of the collection agency and the law firm, but there could be others.

A vendor that provides a service for a covered entity that involves the creation, receipt, maintenance or transmission of PHI is by definition a "business associate."  HITECH made most of the HIPAA Security Rule directly applicable to business associates, and parts of the Privacy Rule as well.  Just providing a service to a healthcare provider usually makes you a business associate, but not always: if the provider is one of those rare providers that isn't a HIPAA covered entity, then the vendor providing services to the provider isn't a business associate.

At least with respect to that particular provider.  The vendor could provide services to another provider that IS a covered entity, in which case the vendor is a business associate, and must comply with the Security Rule and parts of the Privacy Rule.  Must a business associate comply with the Security Rule and applicable parts of the Privacy Rule with respect to the non-covered entity client's PHI as well as the covered entity clients'?  I can't say absolutely, but I don't see how you can avoid it.

If I, as a lawyer, provide services to a covered entity involving PHI, I'm a business associate.  At that point I need policies and procedures, and all the safeguards required by the Security Rule.  Those safeguards address how I must protect PHI; they don't by definition limit that to PHI I receive from a covered entity, but seem to apply to all PHI.  Might some health data be PHI and other data not?  I don't think so.  That doesn't mean all health data must be equally protected, and perhaps similar data from different clients can be treated differently, but the policies and procedures (including any differences) must be rational and reasonable.

So, the question now is this: is there a collection agency involved here?  Does the collection agency also serve covered entities?  If so, the collection agency is a business associate, and therefore subject to parts of HIPAA: most of the Security Rule, some of the Privacy Rule.  I don't think a business associate is subject to the minimum necessary rule per se (that's in the Privacy Rule, and predates HITECH), but should it be addressed in the business associate's policies and procedures (that are required by the Security Rule)?  If it is addressed there, did the business associate collection agency violate its HIPAA policies?

Same with the law firm.  I suspect the law firm and the collection agency both have some clients who are HIPAA covered entities, thus making each of them a business associate.  Which could be problematic.

As I noted on Twitter earlier today, this is a bit of a gray area, and you'd really have to tease out the facts and run these theories to their logical conclusions.  And, as always, #TINLA ("this is not legal advice").  But, it does raise some interesting angles:

  1. If you can't un-become a covered entity, you probably can't un-become a business associate either (in other words, you only get to lose your HIPAA virginity once).
  2. If you're covered for this but not for that, you may actually be covered for that too.
  3. The fact that you might be able to treat PHI you got from one source differently than PHI you got from another source doesn't mean you should (especially since it's probably not true anyway).

And who said HIPAA was dull?


Jeff [11:46 PM]

[ Tuesday, December 22, 2015 ]

 

3 Tips for HIPAA-Social Media compliance: from Fierce Health.

  1. Don't use PHI in social media
  2. Have a Social Media Policy (and make it known)
  3. Have a strategy for addressing negative reviews

Jeff [3:40 PM]

 

Attack of the Health Hackers: Hacking has overtaken theft/loss/carelessness as the health industry's primary HIPAA breach concern.  

Jeff [9:58 AM]

[ Monday, December 14, 2015 ]

 

University of Washington Medicine: An employee downloads an email attachment that contains malware, and the PHI of 90,000 patients is exposed (including Social Security Numbers of 15,000 people).  The covered entity has policies and procedures requiring the business units to have up-to-date risk assessments and safeguards, but doesn't check to make sure the business units are taking appropriate precautions.  If you're the University of Washington Medicine, that failure gets you a $750,000 fine.  Wow.

Key take-away: You must do a risk analysis, and the risk analysis must be system-wide if you're more than a single entity.  The more complicated your corporate structure, the more complex your risk analysis should be (or at least make sure you cover all your relevant risk areas/entities).

Jeff [7:57 PM]

[ Friday, December 11, 2015 ]

 

Identity Theft: This is probably still the greatest threat to PHI at healthcare entities: simple identity theft by employees.  Considering that a third of healthcare patients may be hacked next year, that's a lot of potential trouble.

Jeff [9:36 AM]

 

Snooping: The urge to snoop is strong.  Covered entities must put stronger restrictions in place, and vigorously punish those who can't resist the temptation.  

Jeff [9:28 AM]

[ Wednesday, December 09, 2015 ]

 

In Hacking News: MaineGeneral Health has been hacked, patients being notified.

Jeff [1:43 PM]

 

"It's Skyrocketing." A report on the current state of medical identity theft.

Jeff [1:39 PM]

[ Thursday, December 03, 2015 ]

 

Off Topic: this may also explain why I run.  

Jeff [10:00 PM]

 

Rochester, NY: Small fine for a small breach.  In an action brought by the NY Attorney General, a Rochester, NY hospital was fined $15,000 for a breach that occurred when a nurse practitioner left the hospital, joined a private practice neurology group, and brought the records of some 3,000 patients with her to her new employer.

UPDATE: Adam Greene weighs in, as does Cooley LLP.  Texas is different; Texas Medical Board Rule 165 requires the departing physician to notify patients of the physician's departure and tell the patients where their medical records will be.  The rule states that the obligation falls on the departing physician, although obviously it can be fulfilled by the practice the physician is leaving.  While most physician employment agreements state that the medical records belong to the practice, there's no prohibition on allowing the physician employee to retain ownership of records relating to patients he cares for, and taking those records when he leaves, and therefore no prohibition on the departing physician notifying "his" patients of "his" new address, any more than the practice would be prohibited from notifying the patients if it moved offices.  I suspect there was more going on in Rochester, although the fine is small enough it could just be a nuisance settlement.

Jeff [2:51 PM]

[ Monday, November 30, 2015 ]

 

Lahey: Additionally, I failed to note on Wednesday that Lahey Hospital (connected to Tufts Medical School) settled a HIPAA case last week for $850,000.  

Jeff [7:07 PM]

 

Puerto Rico Insurer Triple-S has just settled a HIPAA violation case with OCR for $3.5 million. No link yet.

Update: here's the link to the press release.

Jeff [7:05 PM]

[ Wednesday, November 25, 2015 ]

 

This seems like a stretch: doctors' offices entered a wrong fax number when sending data to a lab company (Quest Diagnostics), and the lab company gets sued.  

Jeff [12:08 PM]

 

Yet Another Reason to Boost Your CyberSecurity: it can now impact your credit rating, at least if you're a non-profit hospital.  

Jeff [11:59 AM]

[ Tuesday, November 24, 2015 ]

 

LabMD Update: LabMD is suing the FTC lawyers, in their individual capacity, for bringing the case and prosecuting it so vigorously.  Don't think that will work, but you sure can't say LabMD is lying down and taking it.  This will be fun to watch.

Jeff [10:38 AM]

 

Connecticut AG Takes HIPAA Action: As you know, the HITECH Act gave state attorneys general the ability to pursue legal actions for HIPAA violations in their states.  I was just having a conversation yesterday with one of my favorite reporters about the fact that so few state AGs have jumped into this role.  One that has is the Connecticut AG's office, which recently fined Hartford Hospital and its business associate EMC $90,000 because an unencrypted laptop containing the PHI of almost 9,000 patients was stolen from an EMC employee's house.

Jeff [8:42 AM]

[ Friday, November 20, 2015 ]

 

Looking for a good cybersecurity seminar and training session?  You might want to check this out.  

Jeff [10:07 AM]

 

Surprise: Only "HIPAA Covered Entities" are covered by HIPAA.  I think that's why they call them "covered entities."

A couple of points that should be cleared up: HIPAA doesn't apply, but other privacy laws might; if the data is financial, Gramm-Leach-Bliley would apply; state data laws might also apply, depending on what is in the data and the specific state laws.  And the FTC is certainly likely to be interested; just ask LabMD or Wyndham Hotels.  Also, as the story indicates, in each case when the data insecurity was brought to the company's attention, they fixed it.  Second, if you think genetic information is essentially the same thing as what's in your medical record, you don't know much about the practice of medicine (I guess lots of law professors don't know much about medicine).

Cranks.

PS: yes, I know HIPAA also covers business associates in certain matters.  But not in all matters, so I stand by my locution.

Jeff [9:52 AM]

[ Monday, November 16, 2015 ]

 

HIPAA Insurance: Do you have it?  Do you like your carrier?  Let me know (at jdrummond - at - jw - dot - com).  People occasionally ask me for recommendations, and my knowledge can be somewhat limited.

Jeff [11:13 AM]

[ Friday, November 13, 2015 ]

 

FTC Loses Big Data Breach Case: Of course, LabMD is dead from the weight of having to fight the FTC, but you gotta break some eggs to make an omelet, amirite?

LabMD had policies and procedures that were likely sufficient for HIPAA compliance, but an employee violated the policies and installed some P2P software on his company computer that allowed some data to be downloaded by others.  As far as can be proven, only one incident of downloading occurred - by a cybersecurity firm working in the P2P space.  Possibility of harm?  Yes.  Probability of harm? Er, no way.

Big H/T: Dissent Doe

UPDATE: I didn't notice until today that the decision was by an Administrative Law Judge, employed by the FTC itself.  That makes this even bigger news. 

Jeff [10:04 PM]

 

Top 10 Health Tech Hazards in Hospitals: Actually, most of these aren't hardware or software problems, but really human error (what we used to call "meatware"): failure to train, failure to have appropriate policies, failure to operate things correctly.  But it does indicate how technology can exacerbate problems or multiply the damage they can cause.

Jeff [11:12 AM]

 

Two Headlines:

6 Ways Big Data is Driving Personalized Medicine Revolution.

Healthcare Way Behind on Data Security, Cyber Firm Says.

Big Data, along with mHealth and Health Tech, are changing medicine dramatically, and mostly for the better.  But the benefits of those advances dramatically raise the risks (and potential costs, in dollars, health, safety, and life) of bad security.  

Jeff [9:51 AM]

 

State Agencies' Ability to Access Patient Records Without a Warrant: Interesting (if "insider baseball-y") case in California asking whether it violates the California constitution for the California Medical Board to access controlled substance records relating to specific individuals while investigating the individuals' physician.  A patient complained to the CMB about his physician; the CMB obtained that patient's records, as well as records of other patients of the same physician, from the California  Controlled Substance Utilization Review and Evaluation System (CURES), which is used to track down pill mills.  The CMB put the doctor on probation for failing to maintain sufficient records for the patient who complained, but also put him on probation for 2 other patients whose CURES records indicated they had been overprescribed.  The doctor sued the CMB, saying they have the right to access the records of the complaining patient, but accessing the records of the other patients violates those patients' right to privacy.  The AMA has joined the suit on behalf of the doctor.

Off the top of my head, I would say that the underlying answer is a state law question: does California law allow the CMB to access individual patient records without authorization from the specific patient while conducting a proper investigation of the physician?  If so, HIPAA would allow it.  HIPAA allows HHS to look at an individual patient's medical records while investigating a hospital or physician for Medicare or Medicaid fraud, and I would suspect most state medical practice acts would allow the state medical board to have the same level of access while conducting legitimate board purposes, such as investigating a physician.  I suspect the California legislative and regulatory language must be more mushy.

The general rule, of course, is that PHI may be used or disclosed where legally required.  A similar case is playing out in Oregon, where the DEA attempted to access records of the Prescription Drug Monitoring Program using administrative subpoenas, and the PDMP refused, demanding that either a search warrant or court order be presented to clear the HIPAA hurdle.  There's a federalism slant to that one, and it's police power (the DEA is like the cops) versus administrative power (CMB and CMS have power over California licensed doctors and Medicare/Medicaid providers, respectively), but the underlying question is the same: whether and how the information in those types of databases can be accessed.  It certainly makes sense that they could be accessed when looking to take action against the physicians, but not the patients; however, is the patient's right to privacy big enough to prevent that different use?  Interesting question (although it really is insider baseball for the casual HIPAA observer).

Jeff [9:16 AM]

[ Wednesday, November 11, 2015 ]

 

Employer Not Liable for Employee's Bad Act: An Ohio court is dismissing a hospital from a lawsuit by a patient whose medical records (including an STD diagnosis) were posted on Facebook by a hospital employee.  The hospital, University of Cincinnati Medical Center, argued that the employee's acts were outside of her employment, so the hospital is not liable.

This case stands in contrast to the Hinchy v. Walgreens case, where a Walgreens pharmacist looked at the medical records of her boyfriend's ex-girlfriend (looking for STDs, of course).  In that case, Walgreens was held liable.  Different states, different laws, different courts.  And it goes without saying that these cases are only arguably about HIPAA; they really are about the state law requirements in the two states, and about whether the deep-pocket employer has to pay the cost of the damage caused by the rogue employee.

Jeff [1:20 PM]

[ Tuesday, November 10, 2015 ]

 

The Cyber Risks of "Networked Medical Devices":  Most medical devices now capture, manipulate, and store data, but many also transmit it to other devices, EMRs, or directly to physicians, labs, clinics, or other providers.  These are great advances in science and medicine, but they also bring sometimes unanticipated risks.  One of these risks is the vulnerability of the data to hacking, which could include not only theft of the data, but revisions to it as well.

The OIG has included cybersecurity of networked medical devices on its 2016 Work Plan, which shows how important this issue is.



Jeff [2:13 PM]

 

Are Attorneys Entitled to the "HIPAA Rate"?  Interesting question.  I would say no (and at least Region III of OCR agrees), unless they are "smart shoppers" and have their clients request the record.  If the attorney is representing a party other than the patient, though, they'll just have to pay the higher rate. 

Jeff [1:59 PM]

[ Monday, November 02, 2015 ]

 

Medical Privacy Rights of Minors: Interested in learning more about how HIPAA impacts patients who are minors?  What can you tell the parents, and what can you keep from them?  I've got a seminar coming up on the topic if you're interested.  Click on the link for a discount.

Jeff [9:50 AM]

[ Tuesday, October 27, 2015 ]

 

Off Topic: This made me cry, @NaomiMartin.

I run.  I'm not a nut, but I run a lot, or at least I think most people would think it's a lot.  I ran the Dallas Marathon last year, this summer I ran a half marathon on the Isle of Skye with my daughter, another half marathon in the spring, and I'll run the Dallas marathon half this year.

But I hate running.  I'm not just saying that, I really hate running.  It's boring, and most of the year here in Dallas it's ridiculously hot.  But I like not being a fat blob (I'm just a little chunky instead), and running helps with that.  But it also helps when I'm depressed, which happens a lot more than I'd like to admit.  I've got a good life, but depression just happens sometimes.  When I'm down, I'll sometimes note it on my running app after a run, and I can go back later and look at how running gets me out of depression.

When I'm not training for something (in other words, when it's just my daily run), I like to knock out a 5K every morning.  3.1 miles.  And I've got a great route: out my front door, west out of my neighborhood to the running trail (about .75 miles), south down the White Rock Trail running/biking trail for about 1.25 miles, then about 1.1 miles northeast to get back home.  Takes me about 30 minutes to make the run, another 10 or so to cool down, depending on how hot it is.  I can schedule around it.  In fact, here's a picture of my regular 5K run:

Maybe you can see where the trail goes under Walnut Hill.

I would've run that run on Monday, October 12.  I would've left my house a little before 8 am, passed under Walnut Hill between 8:10 and 8:15, been back home about 8:30.  But it was hot (73 degrees, 73% humidity), and I knew I'd need at least 10 minutes to cool off.  I didn't have to get to the office until 10, and that timing would have been perfect: I'd have just enough time to get ready and would roll into the office right in time for my 10:00 am call.  But on October 12, Columbus Day, my youngest daughter Mary had the day off school, and wanted to go to the State Fair with friends.  The State Fair opens at 10 am, but I couldn't drop her off then since I needed to be at the office by then.  So I told her I could take her, but would have to drop her and her friend off at Fair Park at 9:45.  That did not leave me enough time for the whole 5K.  I only had 20 minutes, realistically, or about 2 miles.  So, this is what I ran:


At about 8:10 or 8:15 am on October 12, 2015, former Texas A&M wide receiver Thomas Johnson attacked David Stevens on the White Rock Trail, below the Walnut Hill bridge, with a machete, hacking him to death.  David Stevens, like me, was 53 years old that day.  On Sunday, Stevens' wife, Patti, unable to cope with the grief of losing the most important thing in her life, committed suicide.

Was Thomas' machete meant for me?  Was I supposed to be the 53-year-old victim?  Was I supposed to be on the scene not to be a victim, but to save Stevens?  Was it Mary's unreasonable demand to go to the State Fair, was it Big Tex that saved me?  Was it Ursuline Academy scheduling a day off for Columbus Day, was it Christopher Columbus himself that saved me?

I don't know, but hearing the heartbreak in Patti Stevens' voice in the SoundCloud clip at the bottom of the Naomi Martin piece . . . makes me cry.

Jeff [11:44 PM]

 

Data De-Identification Carries Risk Under HIPAA: Interesting article on the risks of re-identification of de-identified data.  Two key points: first, as Deven McGraw points out, de-identification isn't intended to be a zero-risk proposition.  In fact, nothing in HIPAA is zero-risk.  Even permitted disclosures for treatment purposes can over-expose data.  The question is how low the risk is, what the benefits are, and whether the benefits outweigh the risks.

Second point: spot the red herring.  87% of all Americans are uniquely identified if you know their date of birth, sex, and zip code.  Guess what?  If you add one more data point (social security number), 100% of all Americans are uniquely identified.  However, THAT AIN'T DE-IDENTIFIED DATA!  Under the HIPAA safe harbor for de-identification, you must remove the date of birth and replace it with the year alone (and, if the person is 90 or older, you can't even use the year, just "90 or older").  And you must remove the last 2 digits of the zip code (and if the remaining 3-digit zip code area contains 20,000 or fewer people, you have to remove those digits too and replace them with 000).  How many Americans are uniquely identified by YEAR of birth, sex, and the FIRST 3 DIGITS of zip code, Professor Sweeney?
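For readers who deal with data rather than contracts, the safe harbor generalizations described above can be sketched in a few lines of Python.  This is an illustration only, not a compliant de-identification tool: the set of low-population zip prefixes below is a made-up placeholder (the real determination requires current Census data), and the function names are my own.

```python
from datetime import date

# Zip prefixes whose combined population is 20,000 or fewer must become
# "000" under the safe harbor. This set is a HYPOTHETICAL placeholder;
# the real list must come from Census population data.
LOW_POPULATION_PREFIXES = {"036", "059", "102"}

def generalize_birth_date(dob: date, as_of: date) -> str:
    """Reduce a date of birth to its year; ages 90+ collapse to one bucket."""
    age = as_of.year - dob.year - ((as_of.month, as_of.day) < (dob.month, dob.day))
    return "90 or older" if age >= 90 else str(dob.year)

def generalize_zip(zip5: str) -> str:
    """Keep the first three digits unless the prefix area is too small."""
    prefix = zip5[:3]
    return "000" if prefix in LOW_POPULATION_PREFIXES else prefix

print(generalize_birth_date(date(1962, 10, 12), as_of=date(2015, 10, 12)))  # 1962
print(generalize_zip("75230"))  # 752
print(generalize_zip("03601"))  # 000 (placeholder low-population prefix)
```

The point of the sketch is the same as the point above: what survives safe harbor de-identification (birth year, sex, 3-digit zip) is far coarser than the full date of birth and 5-digit zip used in the 87% statistic.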

Jeff [9:28 AM]

[ Thursday, October 22, 2015 ]

 

Must a BAA require the Business Associate to report unsuccessful Security Incidents?  Yes.

I bring this up because it's a recurring issue for me.  When negotiating BAAs, the BA often says, "We don't need to report unsuccessful Security Incidents; 'pings' happen all the time and never cause any problem because they never get anywhere.  Asking us to report every ping is a burden we can't possibly take on."  You know what?  I agree.  HOWEVER, the rules don't.  Look at 45 CFR § 164.314(a)(2)(i)(C): "The [business associate agreement] must provide that the business associate will . . . report to the covered entity any security incident of which it becomes aware, including breaches . . . ." (Emphasis mine.)  Security incident is defined in 45 CFR § 164.304 as follows: "Security incident means the attempted or successful unauthorized access, use, disclosure, modification, or destruction of information or interference with system operations in an information system."  (Emphasis mine.)  A "ping" is clearly an attempted unauthorized access, which means it is a "security incident"; and the BAA provisions say that the BAA must provide that the BA will report all "security incidents."  The language clearly states that the BAA (or subcontractor BAA, which must meet the same requirements) must require the business associate (or subcontractor) to report "pings."  In fact, stating that you need NOT report pings is directly contrary to the clear language of the regulations.

This is, obviously, a ridiculous requirement: pings are far too numerous and innocuous for their reporting to be anything but a nuisance.  However, reporting them is explicitly called for in the HIPAA regulations.  Since reporting pings is required, I now include it in my BAAs, but pare the reporting down to the barest minimum that still complies with the regulations: a minimal number of reports (no more often than quarterly), with minimal information (a summary statement that "our network system regularly experiences 'pings,' port scans, and similar exploratory contacts, none of which result in a successful access to our system" would be sufficient), and only when requested (which likely will be never).  This complies with the requirements of the regulations but does not unnecessarily burden anyone.

You can also look at the OCR Frequently Asked Questions page.  Go here and search "Security Incident Procedures," and you'll get the answer to this question:

What does the Security Rule require a covered entity to do to comply with the Security Incidents Procedures standard?

The answer mainly deals with what a covered entity must do to respond or react to pings, but the final sentence is telling: "However, § 164.314(a)(2)(i)(C) and (b)(2)(iv) require contracts between a covered entity and a business associate, and plan documents of a group health plan, respectively, to include provisions that require business associates and plan sponsors to report to the covered entity any security incidents of which they become aware." There's that word "any" again. . . .

Jeff [11:26 AM]

[ Tuesday, October 20, 2015 ]

 

You Have No Privacy: A few years ago, I would've said this was fever swamp stuff, but after the actions of the current administration, particularly the weaponization of the IRS, I wouldn't put it past the government to dredge through your medical records for political purposes.  Sadly.  

Jeff [8:54 PM]

[ Tuesday, October 13, 2015 ]

 

Nine Cybersecurity Tips: This isn't a bad list, and it's easy to see how there's a lot of overlap between cybersecurity concerns/activities/foci and those needed for HIPAA risk analysis and safeguards.  Know where stuff is, control access and train users, add in protections, prepare for breaches, and cover the entire data lifecycle.  Hard to argue with those concepts.

Jeff [10:10 AM]

[ Friday, September 18, 2015 ]

 

HIPAA and Lawyers: Listen to me tell you what you need to know at this seminar.

Jeff [4:25 PM]
