HIPAA Blog

[ Tuesday, January 14, 2020 ]

 

Buck, an HR/benefits consultancy, has just completed a survey of HIPAA compliance among company health plans, and the results are not surprising to those of us in the space.  Big problems with conducting risk assessments, ensuring business associate agreements are in place, providing regular employee training, and adopting and reviewing policies and procedures keep popping up.  A solid one-half to two-thirds of plans show good, consistent compliance; and these are employee health plans, not entities that are HIPAA covered entities by virtue of being in the healthcare business, so some slippage is to be expected (at least I hope the healthcare industry participants are better than this).  But given that compliance really isn't that hard, it's still distressing. 

Jeff [12:05 PM]

[ Thursday, January 02, 2020 ]

 

Sinai Health System in Chicago apparently suffered an email system compromise that exposed PHI of about 13,000 people.  Probably a phishing exercise that got through.  

Jeff [12:54 PM]

 

OCR has fined West Georgia Ambulance $65,000 for a breach involving a lost unencrypted laptop.  Of course, the real reason for the fine is that the company had failed to do a risk analysis and take other basic HIPAA hygiene steps (which, had they done so, might've led them to encrypt the laptop, which would have mooted this entire episode).

Of particular interest here is the relatively small size of the fine; I suspect that West Georgia couldn't afford more, so this probably stings pretty badly.  But that's the point, and I applaud OCR for the apparent reasonableness of the fine.  In my opinion, they should issue more smaller fines, rather than just a few big ones.  That's more likely to get people into compliance.  

Jeff [12:51 PM]

[ Friday, December 27, 2019 ]

 

This includes both healthcare and non-healthcare breaches, but it's . . . extensive.  More than just the wall of shame.

Jeff [2:42 PM]

[ Monday, December 23, 2019 ]

 

I'm cleaning out some old emails this morning, and don't think I posted these things previously:

Elite Dental: this Dallas dental practice responded to Yelp reviews in a way that exposed PHI.  The fact that the patient already posted, or that the PHI is already public knowledge, does not relieve the provider of his/her/its HIPAA obligations, and posting on Yelp, even truthfully and even if the original poster was lying, is still a HIPAA violation.  Thus, be very careful with your social media activities.

Korunda Medical: OCR's second fine for failure to respect the patient's right to access PHI.  The big problem for Korunda is that, when first contacted, OCR provided them with assistance to fix the problem, but Korunda kept failing to transfer the patient's records.  This follows the Bayfront case back in September.  Like Bayfront, the fine is small; access failures aren't an endemic problem, but egregious cases do deserve to be made an example of. 

Sentara: Failure to notify of a breach.

Jeff [8:48 AM]

[ Thursday, December 19, 2019 ]

 

Obviously, HIPAA and FERPA intersect: they are both privacy laws, but one applies to educational entities and the other to healthcare entities, so there's an overlap.  Well, both OCR and the Department of Education occasionally release joint guidance about how to deal with that intersection, and they did so today.  You can view it here.  Nothing earth-shattering, but if you regularly deal with the intersection of medical records and educational records, you'll find this of interest.  

Jeff [9:38 PM]

[ Thursday, December 12, 2019 ]

 

November saw 29 HIPAA breaches affecting a little over half a million individuals, the lowest monthly total this year.  Are we getting better, or is it just happenstance?  Are outside threats focusing more on ransomware, due to its higher profit potential, or are health industry participants actually improving their defenses?

Jeff [10:16 AM]

[ Monday, November 25, 2019 ]

 

HIPAA Fine for Lack of Prompt Access.  While cleaning out my email inbox I realized that I never blogged about this case.  Bayfront Health failed to grant a mother timely access to PHI about her unborn child (hmm, this is interesting -- I would've thought prenatal records would be the mother's records, not the child's . . . ), and in fact only provided the records 9 months after the request, rather than within the 30 days required by HIPAA.  

One of the rights HIPAA explicitly grants to individuals is the right to access their own data; in this case, the mother, as personal representative of her child, could exercise that right.  

The press release doesn't state why Bayfront was slow to provide access, but it couldn't have been too bad a reason, since the fine was only $85,000, which is pretty small by OCR standards.

Jeff [11:32 AM]

[ Thursday, November 21, 2019 ]

 

Ransomware.  It's not just the US health system under attack -- everyone can be hit with ransomware.  This time it was a French hospital.  

Jeff [8:40 AM]

[ Sunday, November 10, 2019 ]

 

Governmental Entities Aren't Immune from HIPAA Violations and Fines: OCR has just fined the Texas Health and Human Services Commission $1,600,000 because the Department of Aging and Disability Services failed to conduct an enterprise-wide risk analysis, which OCR believes would have prevented DADS from exporting data to a public server that, because of a software flaw, allowed the general public to see the PHI of about 7,000 people receiving services from DADS.  

Jeff [2:24 PM]

[ Wednesday, November 06, 2019 ]

 

URMC: University of Rochester Medical Center fined $3,000,000 for failure to encrypt a laptop that was stolen in 2017 and a flash drive that was lost in 2013.  That seems like an extreme fine, but there's more to the story.  In 2010, URMC also lost an unencrypted flash drive.  OCR did an investigation and, instead of fining them, gave them technical assistance, which undoubtedly included a plan to encrypt all portable devices.  Obviously, URMC didn't take the assistance and the encryption plan to heart.  The settlement agreement is here.

Encryption is an addressable Security Rule standard, not a required one.  However, encryption is close to being an industry standard; if you aren't using it, at least for portable devices, you'd better have a good explanation of why not.  Not just for the regulators, but for your constituents, your principals, and your patients: if URMC had encrypted that flash drive and laptop, it never would have had to report the losses to OCR, there would have been no investigation, and there would have been no fine.  
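
To be clear, encrypting files at rest doesn't take much.  Here's a minimal sketch using Python's cryptography package (Fernet symmetric encryption) on hypothetical file names; for an actual laptop or flash drive, full-disk encryption such as BitLocker or FileVault is the more typical control, but the principle is the same: if the device walks off, the data is unreadable without the key.

```python
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_file(src: Path, dest: Path, key: bytes) -> None:
    """Encrypt src and write the ciphertext to dest (e.g., a file headed for a flash drive)."""
    dest.write_bytes(Fernet(key).encrypt(src.read_bytes()))

def decrypt_file(src: Path, dest: Path, key: bytes) -> None:
    """Reverse the process -- only possible if you hold the key."""
    dest.write_bytes(Fernet(key).decrypt(src.read_bytes()))

if __name__ == "__main__":
    # Hypothetical file names for illustration; keep the key in a key vault, not next to the data.
    key = Fernet.generate_key()
    encrypt_file(Path("patient_list.csv"), Path("patient_list.csv.enc"), key)
    decrypt_file(Path("patient_list.csv.enc"), Path("patient_list_restored.csv"), key)
```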

Jeff [7:36 AM]

[ Saturday, October 26, 2019 ]

 

Jackson Health gets popped for $2.15 million.  OCR report is here.  

Jeff [2:43 PM]

[ Wednesday, October 23, 2019 ]

 

Looks like a phishing scheme attacking the email network at Kalispell Regional system in Montana might've exposed 130,000 patients' data.

Jeff [1:45 PM]

[ Wednesday, October 02, 2019 ]

 

Ransomware Attack on Alabama Hospital: DCH Health System facilities in Tuscaloosa, Northport and Fayette were all diverting new patients while EMR and other systems were down. 

Jeff [8:32 AM]

[ Monday, September 23, 2019 ]

 

The following is a guest post by Liam Johnson, who is Editor-in-Chief of the website ComplianceHome.com.  Feel free to comment*, or discuss among yourselves.

*Comments are moderated and may not appear instantly.

Getting HIPAA Compliant in Google Cloud Platform

Is Google’s Cloud Platform HIPAA compliant? And is it a viable alternative to AWS and Azure for healthcare organizations? In this post, we are going to determine whether Google’s Cloud Platform is HIPAA compliant, and whether healthcare organizations can use it to host infrastructure, build applications and store files that contain protected health information.

The use of cloud platforms by healthcare organizations has increased tremendously: the healthcare cloud computing market was estimated at $4.65 billion in 2016 and is expected to grow to more than $14.76 billion by 2022.

Will Google Sign a Business Associate Agreement that covers its Cloud Platform?
The Omnibus Rule came into effect in September 2013, and since then Google has been signing Business Associate Agreements (BAAs) with HIPAA covered entities for G Suite. Google subsequently expanded its BAA to cover the Google Cloud Platform.

Currently, Google’s BAA covers the majority of its cloud services, including Cloud Storage, Compute Engine, Cloud SQL for PostgreSQL, Cloud SQL for MySQL, Container Registry, Kubernetes Engine, BigQuery, Cloud Dataproc, Cloud Translation API, Cloud Pub/Sub, Cloud Bigtable, Cloud Dataflow, Stackdriver Logging, Cloud Speech API, Genomics, Cloud Machine Learning Engine, Cloud Datalab, Stackdriver Debugger, Stackdriver Trace, Stackdriver Error Reporting, Cloud Data Loss Prevention API, Cloud Natural Language, Cloud Load Balancing, Google App Engine, Cloud Vision API, Cloud Spanner and Cloud VPN.

In 2016, Google partnered with the mobile backend-as-a-service (mBaaS) provider Kinvey, leading to the availability of mBaaS on Google Cloud. The mBaaS includes connectors to electronic health record systems that support healthcare apps.

Is the Google Cloud Platform HIPAA Compliant?
Since Google will sign a BAA with all HIPAA covered entities, does this mean that its Google Cloud Platform is HIPAA compliant?

The key requirement here is the BAA. Google's willingness to sign one means that its data protection and security mechanisms have been assessed and deemed to meet the minimum requirements of the HIPAA Security Rule, that the cloud services Google offers meet the Privacy Rule requirements, and that Google understands its responsibilities as a business associate. By signing, Google agrees to offer secure infrastructure suitable for the processing and storage of protected health information (PHI).
Nevertheless, it remains the healthcare organization's responsibility to ensure that all of the HIPAA rules are followed when using the Google Cloud Platform. Likewise, it should ensure its cloud-based applications and infrastructure are configured and secured correctly.

Covered entities have a duty to disable any Google services that the business associate agreement does not cover, configure the platform to avoid accidental deletion of data, implement access controls carefully, check audit logs regularly and set all audit log export destinations. Moreover, care must be taken when uploading any PHI to the cloud to ensure it is adequately secured and not accidentally shared with unauthorized persons.
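
To make "configured and secured correctly" concrete, here is a minimal sketch, assuming the google-cloud-storage client library and a hypothetical bucket name, that checks a PHI bucket for two common misconfigurations: per-object ACLs instead of uniform bucket-level access, and IAM bindings that grant access to the public. Treat it as an illustration of the kind of automated check worth running, not a complete compliance audit.

```python
from google.cloud import storage  # pip install google-cloud-storage

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}

def check_phi_bucket(bucket_name: str) -> list:
    """Return a list of findings for a bucket expected to hold PHI."""
    client = storage.Client()
    bucket = client.get_bucket(bucket_name)
    findings = []

    # Uniform bucket-level access avoids per-object ACL surprises.
    if not bucket.iam_configuration.uniform_bucket_level_access_enabled:
        findings.append("uniform bucket-level access is disabled")

    # No IAM binding should grant access to the general public.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    for binding in policy.bindings:
        public = PUBLIC_MEMBERS.intersection(binding["members"])
        if public:
            findings.append(f"role {binding['role']} granted to {', '.join(public)}")
    return findings

if __name__ == "__main__":
    for finding in check_phi_bucket("example-phi-bucket"):  # hypothetical bucket name
        print("FINDING:", finding)
```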

Jeff [4:34 PM]

 

A friend at Stroz Friedberg (a part of Aon) let me know a few months ago that they are seeing a particular uptick in ransomware affecting law practices, but really it's a problem across all industries.  As noted from the news out of Wyoming earlier today, ransomware is a particularly big problem in the healthcare industry. 

I thought I'd pass along what my friend sent, with her permission.  But I'd also like to point out the "big 4" words of advice to prevent ransomware or minimize its impact.
1. Patch software regularly.  Most malware exploits a vulnerability that has been reported and for which a patch is available.  You can't always patch immediately, but do so on a regular basis.
2. Practice good backup management.  Having a perfect backup is the golden ticket for defeating ransomware: simply remove the encrypted content and replace it with the backup.  However, modern ransomware variants typically seek out and encrypt backup files as well as data files, so if your backup files are accessible on the same system, you could lose them too.  Multiple serial backup versions, stored offsite, will speed recovery and save you the ransom payment (see the verification sketch after this list).
3. Map your systems and remove unnecessary connectivity. It's better if an isolated portion of your computing environment is encrypted and not the whole thing.  And you need to be able to find how the incident started to clean it up effectively.
4. Train and test your staff to recognize phishing attempts. The phishing attempt that isn't opened is the ransomware event you don't have and don't have to fix.
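
As promised in item 2 above, here's a minimal backup-verification sketch in Python.  The directory layout, retention count, and freshness window are all assumptions for illustration; the point is simply to check, on a schedule, that multiple serial backup versions exist somewhere the ransomware can't reach and that the newest one isn't stale.

```python
from datetime import datetime, timedelta
from pathlib import Path

# Hypothetical layout: one timestamped archive per day, e.g. backup-2019-09-23.tar.gz
BACKUP_DIR = Path("/mnt/offsite-backups")   # assumption: an offsite/offline mount point
MIN_VERSIONS = 7                            # keep at least a week of serial versions
MAX_AGE = timedelta(days=1)                 # newest copy should be less than a day old

def check_backups() -> list:
    """Return a list of warnings about missing, sparse, or stale backups."""
    archives = sorted(BACKUP_DIR.glob("backup-*.tar.gz"))
    problems = []
    if not archives:
        return ["no backup archives found at all"]
    if len(archives) < MIN_VERSIONS:
        problems.append(f"only {len(archives)} backup versions found, want {MIN_VERSIONS}")
    newest = max(datetime.fromtimestamp(p.stat().st_mtime) for p in archives)
    if datetime.now() - newest > MAX_AGE:
        problems.append(f"newest backup is stale (last written {newest:%Y-%m-%d %H:%M})")
    return problems

if __name__ == "__main__":
    for problem in check_backups():
        print("BACKUP WARNING:", problem)
```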

Anyway, here's the Aon report:

Ransomware Everywhere


Over the past two weeks we have seen a significant uptick in ransomware attacks across all industries involving the Ryuk ransomware. The initial foothold is typically flagged as Emotet malware, and is usually delivered through a phishing email. The Emotet attacker then sells its deployment/footholds to a group using the Trickbot banking trojan. The "trick" refers to the various modules the malware can dynamically load to augment its abilities. It uses common vulnerabilities, such as EternalBlue, to spread rapidly throughout the victim’s environment. The Trickbot group then sells its wide access to a ransomware group, currently Ryuk (we have also observed Trickbot working with Bitpaymer). Once the Ryuk group gains access, they interactively move through the environment, spreading ransomware to encrypt files. They typically also go after backups in order to block recovery efforts, forcing the victim to pay the often sizeable ransom in order to restore mission-critical files and systems.

Mitigating Business Interruption

Clients should pay close attention to any anti-virus alerts from their endpoints, with particular sensitivity to alerts for Emotet/Trickbot since Ryuk or a similar ransomware is typically a fast follow to these.  In order to minimize the business impact of a ransomware infection, we recommend the following preventative measures:
- Notify employees to be aware of suspicious emails.
- Secure email platform account access - MFA, continual log review, etc.
- Activate malware detection capabilities within mail gateways.
- Remove the users’ ability to enable document macros.
- Ensure AV is deployed to every machine and all alerts are being collected.
- Follow up on AV alerts.
- Verify that network logs are being aggregated and reviewed for suspicious connections; Trickbot downloads its payload as a ".png" file.
- Limit access and closely monitor admin and domain admin account usage.
- Do not use shared local admin accounts and passwords across machines -- this is an easy way for Trickbot to spread.
- Have a robust backup process for business critical servers and files such that back-ups occur regularly, are tested for efficacy, and are stored offline.

Getting Back to Business: Response and Recovery

- Do not power down or reimage infected systems.  DO disconnect them from the network.
- Preserve machines/logs and contact an IR provider.
- Ensure the AV solution does not delete the accompanying "ransom notes" (usually .txt or .hta files) as these are typically used to store a unique code that is necessary to decrypt the files if payment is made.
- Be on the lookout for other malicious software and persistence mechanisms as the Ryuk group may install their own malicious backdoors into the environment as their approach evolves.
- Make a copy of online backups and store offline.  Alternatively, segregate online backups to prevent them from becoming encrypted or deleted by the attacker.
- Do not discuss the ability or appetite to pay the ransom via email.


Jeff [4:19 PM]

 

Campbell County Health has been hit by a ransomware attack, resulting in diversion of patients from the hospital's ER and cancellation of services.  

Jeff [3:08 PM]

[ Friday, September 20, 2019 ]

 

Privacy Rights of Minors.  Interested in knowing a little more about the privacy rights of minors?  Here's a webinar.  I might even be able to get you a discount. . . .

Jeff [8:22 AM]

[ Wednesday, August 28, 2019 ]

 

A concerted effort: Minnesota healthcare providers band together for cybersecurity protection.

Jeff [5:46 PM]

 

Mass General: apparently, an unauthorized person got access to a Mass General research database containing information on 10,000 patients.  No SSNs or other ID-theft treasure trove, but still a reportable HIPAA breach.  

Jeff [2:18 PM]

 

Presbyterian Healthcare (New Mexico) Data Breach: Looks like another email phishing attack, where an employee clicked a link in a phishing email and let an intruder into the hospital system's data.  It does not look like EMRs were accessed, just email.  However, there can be a lot of PHI in an email system.

Jeff [2:14 PM]

[ Tuesday, August 27, 2019 ]

 

NY Dept of Health Issues Breach Notification Rules:  The Department has issued a letter to all licensed hospitals and other facilities outlining a new protocol that requires the facilities to notify the Department, along with other required notifications, of a potential cybersecurity incident.  So, in addition to OCR reporting (soon after the incident if it involves 500 or more persons, after year end for smaller breaches), reporting to affected individuals, and possibly reporting to credit reporting agencies and attorneys general, add a new recipient of the notice. 

Hmm.  OK, the Department argues that notifying them helps them spread the word and provide assistance to the victimized organization; that makes sense.

However, notification is required for cybersecurity incidents.  The notice says, "A cybersecurity incident is the attempted or successful unauthorized access, use, disclosure, modification, or destruction of data or interference with an information system operations."  Attempted?  That's problematic.  Must every port scan and firewall ping, which are "attempted" access to an information system, be reported?  That looks like the Security Incident definition in HIPAA, which is equally overbroad.

Hat tip: Jackson Lewis.

Jeff [11:23 AM]

[ Friday, August 23, 2019 ]

 

Part 2 Rule Changes Are Coming.  HHS issued a press release yesterday, along with a fact sheet, outlining some textual changes to 42 CFR Part 2.  The proposed rule was issued yesterday, but has not yet been published in the Federal Register; the rule will be open for comment for 60 days.  I was not able to locate a copy of the actual text of the rule; guess I'll have to wait for the Federal Register publication.

The Part 2 rules serve as a sort of "super-HIPAA" for federally-assisted substance abuse treatment centers.  Unlike HIPAA, which allows a HIPAA covered entity to disclose patient information for treatment, payment, or healthcare operations without the consent of the patient, Part 2 requires the patient to specifically consent to the specific disclosure.  The rules currently indicate that the consent must identify the exact recipient; stating that the recipients will be "healthcare providers involved in the patient's care" is insufficient, and the Part 2 provider must say "Dr. Jones" or "Dr. Smith."  The new rules will loosen up this requirement somewhat.

Part 2 also requires that any disclosure contain an instruction to the recipient that the information remains subject to Part 2 and cannot be further disclosed.  HIPAA, on the other hand, operates under a "horse is out of the barn" structure, where once a disclosure is made to a non-HIPAA-covered entity, it may be further disclosed.  This concept in Part 2 isn't really changing, except that it's now clear that if a non-Part 2 provider creates medical records for a Part 2 patient that include information from the Part 2 provider, then as long as the non-Part 2 provider keeps those records separate, the Part 2 records don't subject all of the non-Part 2 provider's records to Part 2 restrictions. 

The new rules also expand the "emergency" exception to disclosure of Part 2 medical records.  As originally drafted, it was a medical emergency involving the patient that triggered the exception; now, if there's a declared emergency (like a hurricane), disclosure restrictions are loosened.  Other changes involve the ability of providers to disclose information to state prescription drug monitoring plans and looser rules for disclosures for research.

The AP article has a pretty good explanation of the impetus for the revisions, including an explanation of why Part 2 exists in the first place.  These changes are really nibbling at the edges and fixing specific issues.

Jeff [12:01 PM]

[ Monday, August 12, 2019 ]

 

Interesting article this morning out of Pennsylvania.  A patient has sued Lehigh Valley Health Network (LVHN, which is not LVMH, the luxury brand aggregator), alleging that a doctor on the staff who was not treating him, but with whom he had a business dispute, improperly accessed his medical records.  He's suing the hospital for failing to prevent the doctor from accessing his records.

This raises a number of issues and possible teaching points.

Access Restriction is Required.  Hospitals do have an obligation to restrict access to PHI to only those persons with a need to access it.  Sometimes this is easy -- an orderly or a maintenance worker shouldn't have access to PHI.  But sometimes it's tricky; a nurse should only have access to PHI of patients he/she sees and treats, but if the hospital prohibits access to patients' PHI other than those assigned to the nurse, and there's an emergency in another department and the nurse must fill in there, the nurse might not be able to access necessary PHI and the patient's health might suffer.  Likewise, doctors on staff should only access the PHI of their patients, but sometimes an emergency consult might be necessary.  A pediatrician would probably never provide care to a geriatric patient, but in many cases lines aren't easy to draw.

Thus, providers must consider whether they can restrict access up front via hard-wired solutions like permitting access only to a set list of patients (or classes of patients).  Oftentimes, they can't, so they then need to set up some other sort of solution.  Usually, this involves a two-part solution: first, the parties seeking access (workforce members like nurses and schedulers, as well as non-employees such as staff physicians at a hospital) must be instructed and trained to only access the PHI of their own patients and never access PHI for which they don't have a permitted need (usually treatment, but possibly payment for accounts receivable or finance employees, and healthcare operations for QA/UR staff).  Second, the hospital or clinic needs to have some mechanism to make sure people are doing what they are supposed to be doing, and not improperly accessing PHI.  This may involve random checks, regular checks, or the use of artificial intelligence or machine learning algorithms to identify potential problem access issues.  The hospital or clinic should then follow up with those whose access seems excessive, and determine if there is a legitimate need.  If not, they need to take follow-up actions with the access abusers -- more training, restricted access, or some sanction, up to and including termination for abusive snoopers.
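
To illustrate that second, "make sure people are doing what they are supposed to be doing" step, here's a minimal sketch of a post-access audit check.  The file names and columns are hypothetical; real EMR audit logs and assignment rosters will look different, but the basic join-and-count logic is the same: flag users whose accesses fall outside their assigned patients and route them for human review.

```python
import csv
from collections import defaultdict

# Hypothetical inputs:
#   access_log.csv:  user_id, patient_id, timestamp, action
#   assignments.csv: user_id, patient_id

def load_assignments(path="assignments.csv"):
    """Map each user to the set of patients they are assigned to."""
    assigned = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            assigned[row["user_id"]].add(row["patient_id"])
    return assigned

def flag_unassigned_access(log_path="access_log.csv", threshold=3):
    """Count accesses to patients outside a user's assignment list and
    flag users who meet or exceed a review threshold."""
    assigned = load_assignments()
    counts = defaultdict(int)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["patient_id"] not in assigned[row["user_id"]]:
                counts[row["user_id"]] += 1
    return {user: n for user, n in counts.items() if n >= threshold}

if __name__ == "__main__":
    for user, n in sorted(flag_unassigned_access().items(), key=lambda kv: -kv[1]):
        print(f"review {user}: {n} accesses to unassigned patients")
```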

In this case, the hospital may have been doing the right thing; many hospitals need to allow open access to all physician staff members, and if the hospital had proper training up front and post-access audit controls, it's not impossible that this improper access might have slipped through the cracks.  On the other hand, if the hospital did not train its employees, did not have policies in place regarding access by staff physicians, and did not reasonably audit to look for abusers and fix improper access problems, it may have violated the HIPAA Privacy Rule's requirement to restrict access.  If the access was to an electronic medical record, the hospital might also have violated the HIPAA Security Rule.

Improper Access May Be a Breach.  Once the hospital knew that the access was improper, it knew there was a "breach of unsecured PHI" and had an obligation to notify the patient.  If it did not do so without unreasonable delay (and in all cases within 60 days of knowing of the breach), it violated the HIPAA Breach Notification Rule.

The doctor accused of improper access might also be liable here.  He apparently claims that he had a patient-provider relationship with the patient, in which case his access to the PHI might have been proper.  Even if he had a patient-provider relationship, that does not give him carte blanche to access the patient's PHI -- the access must still be for a permitted purpose such as treatment or payment (and if it's for payment, it must be limited to the reasonably necessary amount).

Don't Disclose PHI to the Press, Even if it is Already Disclosed.  I'd also note that both the hospital and the physician have (appropriately) not commented to the press on the matter; the statements attributed to them (acknowledging the patient was a patient is, in itself, a disclosure of PHI) were taken from court filings.  Generally, disclosing PHI in a court record, where the disclosure is relevant to the litigation, is a permitted disclosure; it appears that the reporter pieced the case together from the court records.  The fact that the PHI is already out in the public record is irrelevant -- just ask Memorial Hermann in Houston. 

Even De-identified Information Can Sometimes Be Used to Identify Someone.  It's not central to this particular story, but another interesting point here is that this case shows how de-identifying information is sometimes ineffective, if there are other sources of information that might be leveraged to cross-check and add back identifiers.  The Health Department didn't say who the patient was, but included the date of discharge, which the reporter was able to connect to the court filings.  It's not absolutely certain that the specific patient mentioned in the Health Department report is the plaintiff patient in the lawsuit, but it's pretty likely.
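
For the technically inclined, here's a hypothetical sketch of the kind of linkage the reporter effectively performed by hand: joining a "de-identified" report against public court records on shared quasi-identifiers (here, facility and discharge date).  The file names and columns are invented for illustration.

```python
import csv

# Hypothetical inputs:
#   incident_report.csv: facility, discharge_date, incident_type   (no names)
#   court_filings.csv:   plaintiff_name, facility, discharge_date  (public record)

def load(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def reidentify(report_path="incident_report.csv", filings_path="court_filings.csv"):
    """Match 'de-identified' incidents to named court filings on shared quasi-identifiers."""
    filings = load(filings_path)
    matches = []
    for incident in load(report_path):
        for filing in filings:
            if (incident["facility"] == filing["facility"]
                    and incident["discharge_date"] == filing["discharge_date"]):
                matches.append((filing["plaintiff_name"], incident["incident_type"]))
    return matches

if __name__ == "__main__":
    for name, incident_type in reidentify():
        print(f"probable match: {name} <-> {incident_type}")
```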

In Litigation, a QPO is Always an Option.  Often, when PHI is used in litigation, the individual who is the subject of the PHI will seek to keep his/her PHI out of the public record, in order to keep personal medical issues private.  This can be done with a Qualified Protective Order or QPO, as specifically mentioned in the HIPAA regulations relating to information disclosed subject to a subpoena.  Here, the information in the legal proceeding actually ended up being used by the press to the detriment of the hospital and physician.  I'm guessing that LVHN, and possibly Dr. Chung, are wishing they had used a QPO to protect some of that PHI.

Jeff [1:06 PM]

[ Friday, July 19, 2019 ]

 

As you should know, while HIPAA has pretty strict rules for most covered entities, those that provide services in the substance abuse arena are often subject to even more strict rules. Called the "Part 2 Rules" since they come from 42 CFR Part 2, they basically prohibit the disclosure of patient information by federally-supported substance abuse centers unless the patient gives specific consent for the particular disclosure. 

As with any privacy rules, the stricter the rules, the worse the utility of the data.  And in the substance abuse arena, allowing patient privacy serves a great good (patients won't be afraid to seek care due to the fear their addiction will be disclosed), but that same privacy can prevent programs from providing the help that patients need.  This can be particularly troubling in the face of the opioid epidemic.

HHS is proposing to revise those rules somewhat to allow better sharing of data between providers.  It will be interesting to see how it plays out; parties from both sides will be likely to weigh in.

Jeff [3:39 PM]

[ Wednesday, July 03, 2019 ]

 

Here are 6 things small providers could do to get to better cybersecurity compliance.

Jeff [6:20 PM]

[ Monday, July 01, 2019 ]

 

Those "obsessed" with privacy (hey, obsession is in the eye of the beholder; one person's reasonable caution is another's obsession) know that if a digital service is "free," the service isn't really the product; you are the product to the service's developer.  Google Maps seems like a free product; actually, though, you are the product: when you use Google Maps, Google gathers data on you that it uses to sell other products to its customers.  If you don't care about privacy, it's a great deal: you give up privacy you don't want for a free map program.

But if you do care about privacy, what should you do?  Find less intrusive programs.  Some are free and some cost money, but that's what you have to do if you want to protect yourself. 

Anyway, if you're looking for alternatives to the non-private Google products, here's a list.  Think about it. . . . 

Jeff [2:42 PM]

[ Thursday, June 27, 2019 ]

 

OCR has published 2 new FAQs relating to when and how health plans may share PHI for care coordination with other plans serving the same individuals.  The first question actually alludes to one of the tricky elements of uses/disclosures that are for "health care operations" of a different covered entity: not all "operations" elements are acceptable in those situations.  Care coordination is one of the acceptable elements, though, so that's good.  The second question delves into when an entity can use PHI that it received for a different purpose to tell the individual about other products and services, without the communication being "marketing" (in which case the individual must authorize that use/disclosure).  

Jeff [10:14 AM]

[ Monday, June 24, 2019 ]

 

Two New York (Southern Tier, Allegany area) health care providers were hit by Ransomware last week.  No word on how the attacks occurred, but I'd guess both started with email phishing schemes.

Jeff [11:36 AM]

[ Wednesday, June 05, 2019 ]

 

Now, it's LabCorp.  Just days after Quest announces a breach of 12 million patients, LabCorp announces a 7 million patient breach of its own.  Well, not really its own: like the Quest breach, LabCorp is announcing a breach at its billing vendor, which is the same billing vendor that Quest uses.  

Jeff [2:14 PM]

[ Monday, June 03, 2019 ]

 

Quest Diagnostics announces a big breach: it looks like a billing vendor, AMCA, suffered the breach, which appears to be a phishing-based email access hack.  It does not look like lab test results were accessed, but billing and financial information was (which is still PHI, and would also include some indicia of what medical issues the data subject might have, given an indication of which tests were ordered and conducted).

Jeff [3:03 PM]

[ Thursday, May 30, 2019 ]

 


MIE breach brings state fines as well: Yesterday my favorite HIPAA/Privacy reporter tipped me off to the fact that MIE also got fined by state regulators.  MIE is an Indiana-based medical records company, and its clients are spread across the Midwest and elsewhere.  In addition to the $100,000 fine to OCR, MIE also paid $900,000 to a total of 16 states (Arizona; Arkansas; Connecticut; Florida; Indiana; Iowa; Kansas; Kentucky; Louisiana; Michigan; Minnesota; Nebraska; North Carolina; Tennessee; West Virginia; and Wisconsin) to settle HIPAA and state law breaches.

This is a good reminder: you can't only look at HIPAA to determine your obligations to protect data and report breaches; you also must look at state laws. Specifically, all states have data breach reporting laws, and most have either personal data protection/security laws or general "deceptive trade practices" laws that contain a privacy component.  Thus, your data security activities must be HIPAA compliant and state-law compliant, and if you suffer a breach, you must look at both the applicable state laws as well as HIPAA to determine your reporting obligations (some breaches require reporting under HIPAA only, some under state law only, and some under both).

Additionally, since the HITECH Act, OCR isn't the only show in town as far as HIPAA enforcement goes.  Even if OCR does not fine an entity, a state attorney general can do so specifically for the HIPAA violation itself, not just for a state law violation.

In MIE's state law case, MIE paid OCR for violating HIPAA but also paid the 16 states for violations of HIPAA and state laws (i.e., not just state laws).  But it was an agreed order, so it's hard to tell what would've happened if MIE had argued that, since OCR had already fined them, they should not face state liability under HIPAA.  I assume the states would've dropped the HIPAA part and relied on state law exclusively.

The final lesson: there are multiple regulators.  Don't forget that.

Jeff [11:29 AM]

 

Recent OCR activity: Touchstone and Medical Informatics Engineering: If you've been watching the news, you'll have seen a couple of recent HIPAA enforcement actions with some striking differences.  

First, as I mentioned below, Touchstone Imaging got tagged for $3,000,000 for a server issue that left FTP files exposed to anyone searching the internet.  Then, shortly thereafter, business associate MIE got tagged with a $100,000 fine because a hacker got access to their patient files. Why the big difference?  I'll discuss in a later post. . . . 


Jeff [10:58 AM]

[ Friday, May 17, 2019 ]

 

The plaintiff's complaint in a lawsuit is only one side of the story, but given that the hospital fired the tech, it's plausible.  Covered entities' greatest risk for a HIPAA violation these days comes from rogue employees.  Whether it's employees stealing credit card information to pay their rent or selling the data to personal injury lawyers, or just not securing data (or losing phones and laptops), a bad employee can cause serious HIPAA damage.

Jeff [11:28 AM]

[ Tuesday, May 14, 2019 ]

 

Nor should it be. That's not how this works.  

Jeff [5:43 AM]

[ Monday, May 13, 2019 ]

 

We've heard it over and over: the healthcare industry is the biggest target for data breaches, given the overall value of the data, plus the large number of targets with, shall we say, less than stellar defenses.  Here's proof that those indications are right: healthcare leads in total data breaches and total data breached.  

Jeff [11:52 AM]

 

Anthem Update: As many security folks noted, the big Anthem breach seemed to have a "state actor" flavor to it, and most thought the fingers pointed to China.  Well, 2 Chinese nationals have been charged with involvement, which seems like the likely next step . . . 

Jeff [11:49 AM]

[ Monday, May 06, 2019 ]

 

Unprotected FTP servers can cause problems, since whoever finds them on the internet can access the data on them.  They aren't easy to find, but they can be found.  Of course, when your initial response is that there was no PHI disclosed, when in fact 300,000 people had their PHI exposed, you should expect a fine.  

Jeff [8:49 PM]

[ Monday, April 29, 2019 ]

 

I haven't seen the actual proposed regulatory text yet, but Modern Healthcare is reporting that OCR will lower the maximum fine level for organizations that violate HIPAA, depending on the organization's level of culpability.  Obviously, OCR could have exercised prosecutorial discretion in levying fines, but it can't hurt to encourage organizations to lower their culpability level.

Jeff [8:37 AM]

[ Wednesday, April 24, 2019 ]

 

Brookside ENT and Hearing Center in Battle Creek, Michigan got hit by a ransomware attack.  They didn't pay the hackers, their medical records were lost, and they have gone out of business.  The two partners have gone into early retirement.

So, Ransomware can kill you.

Jeff [5:47 PM]

[ Monday, April 08, 2019 ]

 

in healthcare hacks, at least.

Jeff [1:20 PM]

[ Friday, March 22, 2019 ]

 

This is a big one.  It's also not an OCR settlement, but rather a settlement of a class action lawsuit by affected individuals.  Class action cases are hard to bring, but they got a big settlement here.

Jeff [1:59 PM]

[ Thursday, March 21, 2019 ]

 

Bad server migration exposed the files.  One of those ftp server issues @JShafer817 is always talking about?

Jeff [7:06 AM]

[ Monday, March 18, 2019 ]

 

According to reports from OCR.  Email hijacking and ransomware are the leading trouble-makers.  

Jeff [12:46 PM]

 

Cyber Risk Assessments or Security Risk Assessments ("SRAs") are pretty common in the privacy universe.  In fact, doing some form of an SRA (and regularly repeating/updating) is a required activity for any HIPAA covered entity or business associate.  How do you know what types of safeguards are reasonable and appropriate for your business if you don't understand what your risks are?  However, before you go off and do one, here are 5 questions you should ask.  (One note: I'd add HITRUST to the "frameworks" listed in question 2.)

Jeff [12:43 PM]

[ Monday, March 04, 2019 ]

 

More information here.

Jeff [7:47 AM]

[ Saturday, March 02, 2019 ]

 

About the Wolverine breach: my take is in the comments here.

Jeff [1:06 PM]

[ Wednesday, February 27, 2019 ]

 

Information here.  Looks like your garden-variety email-access phishing attack.

Jeff [10:36 AM]

[ Wednesday, February 20, 2019 ]

 

Looks like an email-access phishing attack.  A good reminder not to keep PHI in emails, either in the emails themselves or in attachments.  Or encrypt everything at rest.  

Jeff [10:14 AM]

[ Monday, February 04, 2019 ]

 

Interesting article.

Jeff [10:51 AM]
