[ Monday, August 22, 2016 ]
OCR to investigate smaller breaches. This makes sense
Jeff [2:14 PM]
if they want to look at entities with lots of small breaches, breaches involving the exact same fact scenario, or breaches that cause a lot of damage even though there are only a relative few victims (i.e., less than 500 affected individuals). Timing of notifications matters: OCR will find out that a big breach has occurred when the individuals find out, but won't hear about small breaches until January-February of the next year. And OCR will investigate small breaches if there's a complaint, but not necessarily if there's not. It still makes good sense to publicize this initiative, however, because covered entities and business associates should not be lulled into a false sense of security with respect to smaller breaches. There's no guarantee that OCR will investigate every reported breach under 500 victims (it still has discretion, and not every small breach will get more than a cursory investigation), but an egregious small breach, or one with a consumer complaint filed against the covered entity, could easily end up under investigation before year end.
However, this initiative really only makes sense if OCR has extra investigator time on their hands, which I'd guess they don't. Thus, what's the real rationale for a public announcement of this kind? Probably to keep people on their toes. If someone thinks they're in the clear and able to fly under the radar when the breach is less than 500 people, maybe this is intended to give them a little fear-factor and make them think twice, at least about doing a good breach risk analysis and maintaining good documentation.
PS: an earlier version of this post was garbled because I used the "less than" sign rather than the words, which triggered a weird HTML effect. Thanks to Theresa Defino for the heads up.
[ Thursday, August 11, 2016 ]
Jeff [12:25 PM]
Just because you're a healthcare provider does not mean HIPAA is applicable to you.
I was having a conversation just last night regarding this issue: HIPAA only applies to health plans, health care clearinghouses, and health care providers "who transmit any health information in electronic form in connection with a transaction covered by" HIPAA. The 8 HIPAA-covered transactions are:
- Health claims and equivalent encounter information.
- Enrollment and disenrollment in a health plan.
- Eligibility for a health plan.
- Health care payment and remittance advice.
- Health plan premium payments.
- Health claim status.
- Referral certification and authorization.
- Coordination of benefits.
If you are a health care provider but don't undertake any of the above transactions in electronic form, then you are not covered by HIPAA. That does not mean you are entirely in the clear.
If you suffer a breach, you may still have state law reporting obligations to meet. And if you serve as a business associate for a covered entity, you may become subject to HIPAA via that back-door route. However, the potential for big HIPAA fines is not there if you are not a HIPAA covered entity.
This was illustrated by a New Jersey case last year
, which I also blogged about
(albeit in a different, more esoteric context).
[ Monday, August 08, 2016 ]
Jeff [5:57 PM]
Are Ransomware Attacks Per Se HIPAA breaches?
"Not Necessarily," says this National Law Review article.
Of course, I agree. But this is just plain wrong: "If, however, the ePHI is encrypted by the ransomware attack, a breach has occurred because the ePHI encrypted by the ransomware was acquired (i.e., unauthorized individuals have taken possession or control of the information), and thus is a “disclosure” not permitted under the HIPAA Privacy Rule." In most ransomware situations, the malware is injected into the affected system; there is no possession, and certainly no disclosure; there is only "control" in the context of preventing the rightful owner from controlling the data, since the hacker has no control either, and can't even decrypt the data. Preventing someone else from using their data is not "controlling" the data, it's controlling the victim and rightful owner of the data.
Jeff [10:11 AM]
Newkirk, BCBS-KS breaches: Newkirk
is a business associate of a lot of health plans, printing insurance cards for plan members (not too sure what happened there, since the article is behind the WSJ paywall). Blue Cross Blue Shield of Kansas
is one of Newkirk's customers, apparently, and about 800,000 of their customers are impacted. No SSNs or financial information, but insurance information like group numbers and the like, which would be helpful for medical identity theft.
Yes, Healthcare Data is Attractive to Hackers: For a number of reasons
Jeff [10:03 AM]
, as reflected in the value of health information on the "Dark Web." But is the healthcare industry reacting appropriately and increasing its defenses? There sure seem to be a lot of breaches being reported, but don't mix the settlements of old cases in with new breaches. In fact, so far in 2016, substantially fewer people have been affected by healthcare breaches. Maybe we are moving in the right direction. . . .
Yes, it is a Big Year for HIPAA Fines:
Jeff [9:58 AM]
but is it proof of more enforcement
(or more strict enforcement), or just bigger fines? Personally, I've had several clients avoid fines where I thought OCR would levy something, but that might be my expectations changing, not the underlying enforcement environment. (For the record, none of those clients deserved a fine, nor could they really afford one, but given the current enforcement trend, I was worried.)
[ Friday, August 05, 2016 ]
Jeff [11:32 AM]
[ Thursday, August 04, 2016 ]
Biggest Fine Yet (IIRC):
Jeff [4:42 PM]
Illinois' Advocate Health has been fined $5.55 million
by OCR for a series of HIPAA failings. Looks like a lack of a good risk assessment, lack of physical access controls, and BAA failures are part of the mix.
[ Wednesday, August 03, 2016 ]
It's a Banner Day for Breaches. Banner Health suffers a huge one
Jeff [1:54 PM]
: 3.7 million patients. Actually, it looks like 2 breaches in one for the huge western-US healthcare provider. One went after payment card data from food and drink locations at Banner facilities, and the second one went after patient records.
Hacker World Problems:
Jeff [1:54 PM]
a Ukrainian hacker stole 100,000 documents
from Central Ohio Urology Group (mostly internal documents, like surgery schedule spreadsheets) and posted them online.
Was he trying to sell the data on the Dark Web? Engaging in identity theft? Extorting payments from the group?
No, he's trying to bring public awareness to the "fact" that the Pentagon is poisoning people in the Caucasus with secret injections.
on the story.
[ Monday, July 25, 2016 ]
Medical Device Security:
Jeff [10:04 AM]
I still think this is in the realm of TV shows and movies (I've been binge-watching Mr. Robot lately), but while the likelihood is slim, the possibility of hacking a medical device should certainly concern the healthcare IT crowd.
Here's an interesting graphic I got from Arxan Technologies
that is certainly food for thought.
[ Friday, July 22, 2016 ]
No, No, No.
Jeff [10:38 AM]
No, @HealthPrivacy, you cannot draft regulations via guidance. This is just plain wrong.
If a covered entity has, in the course of a reasonable risk analysis, determined that emailing unencrypted PHI is not secure, then the covered entity is not required to email unencrypted PHI to individuals exercising their access rights. The regulations do not say that it must, and you can't change the regulations by issuing guidance. If the covered entity has no such policy, if it allows unencrypted emailing in other situations, if it has the policy but doesn't follow it, or if the policy is unreasonable, then the covered entity may have to email PHI to the patient. The access regulations (which carry the force of law) say that, if the covered entity maintains the PHI electronically, then it must provide the PHI in electronic format; they do not say that the covered entity must provide the PHI via electronic transmission.
Follow the rules, OCR. You can certainly change the regulations. If this is important enough for guidance, it's important enough for a regulation. Propose a new rule revising 45 CFR 164.524, publish it, request/receive/review public comments, and finalize it. That is how it works.
And don't try to enforce "guidance
" as if it's a law or regulation. It's not.
[ Thursday, July 21, 2016 ]
Ransomware: 4 steps for fighting it.
Jeff [11:13 AM]
I'd add my own 4 steps, if I haven't already:
- Patch management and current virus software: whenever vulnerabilities are discovered in software, the developers usually send out patches. Make sure your organization is signed up to get those patches and promptly applies them. It's extremely unlikely you'll be attacked in the window between the time a vulnerability is discovered and the time a patch is provided; usually, businesses either don't apply the patches or don't sign up to get them, and it's a relatively old vulnerability (for which a patch is available) that is ultimately exploited. The same goes for virus protection software.
- Limit connectivity. Computers that aren't connected to the internet can't get infected by the internet, at least not directly. Don't connect computers unless you have to, and if you do, make sure your connectivity architecture is simple, logical, and traceable. If there's only one gate into the city, there's only one place to focus your protection efforts.
- Have good backups. Ransomware is designed to scramble your eggs. If you can just throw those eggs out and replace them, then you won't need to pay the ransom. Dealing with a ransomware attack is still enough hassle that you want to take all the other steps, but in the worst-case scenario, good backups thwart any ransomware attack. Delete the infected files, scrub the system, and restore from the backups.
- Train your staff and be prepared. Most ransomware comes from phishing or other social engineering. Most attacks are pretty clumsy, too, if you have the slightest clue what to look for. Make sure your staff has the slightest clue; better yet, make sure they have some pretty good clues. And make sure your organization is ready for any hack, whether it's ransomware, DDoS, or data theft. Who ya gonna call (when something looks funny in the system)? If your team doesn't know the answer, you aren't ready.
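On the backups point: an untested backup is a hope, not a plan. As a minimal illustration of "good backups" (the directory layout here is hypothetical, and a real deployment should use dedicated, offline backup tooling rather than a script like this), a Python sketch that copies files and then verifies each copy by checksum might look like:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_and_verify(src_dir: Path, dst_dir: Path) -> list[str]:
    """Copy every file in src_dir to dst_dir, then confirm each copy's
    checksum matches the original. Returns the names of files whose
    copies failed verification (empty list means the backup is good)."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    failures = []
    for src in src_dir.iterdir():
        if not src.is_file():
            continue
        dst = dst_dir / src.name
        shutil.copy2(src, dst)  # copy2 preserves timestamps/metadata
        if sha256(src) != sha256(dst):
            failures.append(src.name)
    return failures
```

The same checksum comparison can be re-run against an existing backup periodically, which is the "tested" part of a good backup strategy: you find out the backup is bad before you need it, not after.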
Breaking News: Entities not covered by HIPAA have privacy and security gaps
Jeff [10:40 AM]
. Well, duh.
HIPAA isn't intended to be some European-style data rights law that grants everyone specific rights in their own data and the right to demand that third parties, with which they may have no direct relationship and which otherwise owe them no specific duties, either limit their uses/disclosures of that data or provide minimum levels of security and protection for that data. Frankly, that's not how the data rights structure of American law works, and not how it should work. Have you seen what lawyers have done with the Illinois biometric privacy law so far? Imagine what they would do if every person or entity who might legitimately come across personal information had a duty to protect it. Consider this: if you have a phone book in your house and it's not locked up, you aren't protecting the identifiable information in it; if there were a law applicable to you that required you to protect it, anyone whose name is in that phone book could sue you. That's crazy, and that's why you have no general obligation to protect that data, and only have an obligation if there's some specific contractual or other relationship, duty, or applicable law.
So it's understandable that, while HIPAA requires certain restrictions and levels of protection from covered entities (and, both directly and indirectly, from business associates), it doesn't require the same level from "non-covered entities."
Update: Here's another article
, and here's a copy
of the HHS report on NCEs.
[ Wednesday, July 20, 2016 ]
I think we knew this: cyber attacks increasing in the health care industry
Jeff [6:02 PM]
. Interesting take on the article: the ACA pushed medical practices to adopt EMRs before they were technologically proficient enough, and now cyber attacks are the price we pay for not really being shovel-ready.
I call bullshit. Plenty of tech-savvy companies have been hacked. It's not a "not ready for prime time" issue of the targets. If they were more ready, they'd still be getting hacked.
[ Tuesday, July 19, 2016 ]
Providence Health, Oregon:
Jeff [3:44 PM]
A bad employee apparently snooped
on 5,400 patients' demographic info, including SSNs. But Providence doesn't think the employee kept or used the information. Not sure this is necessarily a reportable breach, but perhaps they were just notifying out of an abundance of caution?
[ Friday, July 15, 2016 ]
Oregon Health & Science University:
Jeff [2:35 PM]
I reported back in 2013
regarding OHSU's multiple breach incidents. It seems OCR has finished its investigation and levied fines of $2.7 million
for the breaches. That's a lot of cash when there was no harm done to patients. . . .
[ Thursday, July 14, 2016 ]
Jeff [10:42 AM]
[ Wednesday, July 13, 2016 ]
Jeff [5:13 PM]
[ Tuesday, July 12, 2016 ]
Cybersecurity made somewhat simple: a podcast from Tech Policy.
Jeff [4:26 PM]
Obviously there's more than this, but it's a good place to start thinking about some of the low-hanging cybersecurity fruit.
OCR Issues Ransomware Guidance:
Jeff [12:24 PM]
While I couldn't disagree more with the assertion that ransomware attacks "usually" result in a Breach, I do applaud OCR for issuing this timely and pertinent guidance
to covered entities. Clearly, regardless of the specifics of your business, you should take these steps to help prevent or minimize the impact of a ransomware attack:
- Do a risk analysis and implement the recommendations it produces
- Have good virus protection
- Be active with patch management
- Train your staff to avoid phishing attacks
- Limit access to sensitive data to appropriate individuals
- Limit access to sensitive data to appropriate apps and software
- Limit connectivity (if a computer does not need to access the internet, cut it off)
- Have good, thoughtful, and thorough data backup strategies
Also a good idea to have a security incident response plan (including a staffed incident response team) in place and ready to respond.
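One way to make an incident response plan actionable is an automated tripwire. Encrypted output is statistically near-random, so a sudden spike in the byte entropy of many files is a common ransomware warning sign. Here's a hedged Python sketch of that idea; the 7.5 bits-per-byte threshold is illustrative only, not a vetted detection rule:

```python
import math
from collections import Counter
from pathlib import Path

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: 0.0 for empty or single-valued
    data, approaching 8.0 for random (or encrypted) bytes."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def suspicious_files(directory: Path, threshold: float = 7.5) -> list[str]:
    """Flag files whose contents look encrypted (entropy above threshold).
    Plaintext records rarely exceed ~6 bits/byte; ciphertext nears 8."""
    return [f.name for f in directory.iterdir()
            if f.is_file() and byte_entropy(f.read_bytes()) > threshold]
```

A caveat on the design: compressed formats (ZIP archives, JPEGs) also show high entropy, so a real monitor would baseline by file type or watch for a burst of changes rather than rely on a single threshold.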
Jeff [9:23 AM]
[ Thursday, July 07, 2016 ]
Jeff [11:15 AM]
[ Thursday, June 30, 2016 ]
Mass General Dental Data Breach:
Jeff [12:52 PM]
a dental vendor to the hospital, Patterson Dental Supply, suffered a breach
of its servers hosting the Mass General dental patient data.
[ Wednesday, June 29, 2016 ]
Jamie Knapp: Analysis Update:
Jeff [3:24 PM]
A couple of folks (@LaClason and @PogoWasRight) pointed out that, in regard to my earlier post
this morning, HITECH did add a change to the actual HIPAA statute that is intended to be used (and has been used) to prosecute employees or third parties for acts that would be violations if they were covered entities, mainly to avoid the anomaly that rogue employees or other bad actors are free from HIPAA criminal liabilities because they aren't the actual covered entity.
Prior to HITECH, Section 1320d-6(a) had one sentence, that says: "A person who knowingly and in violation of this part (1) uses or causes to be used a unique health identifier; (2) obtains individually identifiable health information relating to an individual; or (3) discloses individually identifiable health information to another person, shall be punished as provided in subsection (b) of this section." HITECH added a second sentence: "For purposes of the previous sentence, a person (including an employee or other individual) shall be considered to have obtained or disclosed individually identifiable health information in violation of this part if the information is maintained by a covered entity (as defined in the HIPAA privacy regulation described in section 1320d–9 (b)(3) of this title) and the individual obtained or disclosed such information without authorization." The copy of 42 USC 1320d-6 that I pulled up online didn't have the added language, which explains my miss of it.
However, it did give me an opportunity to re-review the new statutory language, and in fact I maintain my opinion: Knapp (and Chelsea Stewart in an earlier case) should not have been convicted, because their acts were not in violation of HIPAA. That's because the HITECH-added language, which is intended to make them criminally liable (and pursuant to which they were held criminally liable), is deficient from a statutory construction standpoint.
The added language says “for purposes of the previous sentence,” which would be fine to change something within the construct of the previous sentence. (Example: "It is a violation of fashion law to wear white after Labor Day. For purposes of the preceding sentence, white shall include bone, ecru, ivory, eggshell, and taupe.") But the preceding sentence still says the obtaining or disclosing must be “in violation of this part.” It doesn’t change the definition of a covered entity or put obligations onto anyone other than a covered entity.
And you can’t change the meaning of “in violation of this part” by such a passing reference. In other words, you can’t change the definition of “in violation of this part” to simply mean any obtaining or disclosing of IIHI “if the information is maintained by a covered entity . . . and the individual obtained or disclosed such information without authorization.” If that’s the case, then any obtaining or disclosing of IIHI that is (i) “maintained by a covered entity” and (ii) “without authorization” would be a violation. And if that’s the case, every obtaining or disclosing of hospital-held PHI for treatment, payment, or healthcare operations (i.e., uses and disclosures for which an authorization is not required) would be a HIPAA violation.
HITECH was a hastily- and sloppily-written statute. But it’s also another example of the pure lawlessness of the current federal government. If we are to live under the rule of law, laws must apply equally to all. They must be clearly written so citizens can know exactly what conduct is prohibited and what is allowed. Words have meaning, and the meaning of words has consequences. When it comes to criminal law, where one’s property or liberty can be removed by the state, there cannot be a “well, you know what I mean” quality to it. Criminal statutes in particular MUST be clearly and precisely written. If there is any ambiguity (and there certainly is here), the benefit of the doubt must go to the accused.
Congress had the opportunity to fix this loophole by changing the definition of Covered Entity or by specifying a new and separate violation (i.e., “a person violates this part if . . . “ or “It is a violation of this part if a person . . . “), but they didn’t do so.
I hope the next person who is charged under this provision challenges it on these grounds. I don’t object at all to holding employees and other non-covered-entities criminally liable for these types of breaches. I think this is a loophole that should be and needs to be closed. But the law should be written to make these types of breaches actual violations of the law, and what is written doesn't do that. Have some respect for the rule of law.
Jamie Knapp: another HIPAA criminal conviction:
Jeff [8:32 AM]
a respiratory therapist who accessed PHI of patients she was not seeing has been convicted
, apparently of violating HIPAA, by an Ohio federal jury. I'm still trying to figure out how a respiratory therapist employee of a hospital, who by herself is not a covered entity, was convicted of violating HIPAA. Not every health care provider is a covered entity; you must also conduct electronic transactions that are HIPAA regulated. Generally, an employee will not be conducting those transactions. And while the officers and directors of a company may be held liable for their activities as decision-makers of their companies (in other words, they can't hide behind the company for their own acts if the company is responsible as well), I don't see how a low-level employee is bootstrapped into being the covered entity itself.
[ Tuesday, June 28, 2016 ]
Tex. Health & Human Services Commission Breach:
Jeff [4:00 PM]
The HHSC's records vendor, Iron Mountain, lost some boxes
with records of 600 people who applied for benefits with HHSC.
In case you didn't know, the HITECH and Omnibus Rule changes to HIPAA's definition of "business associate" make clear that anyone who "creates, receives, maintains or transmits" PHI for a covered entity is a business associate. "Maintains" includes storage, so wherever a covered entity stores its PHI, whether it's a cloud-based server or Uncle Bob's Self Storage, the storage company is a business associate. Of course, self-storage places, that never intend to access the records in storage and don't even know what people keep in their storage lockers, really don't want to be BAs, and they sure don't want to sign BAAs. But have you ever seen the TV show Storage Wars? Stuff in self-storage facilities sometimes gets disclosed to the general public. Unfortunately, if you are a covered entity and you're using a self-storage facility, you must get them to sign a BAA, or find another facility.
There are facilities that will sign BAAs, and Iron Mountain is one of them. This is the first breach I've heard of involving Iron Mountain; hopefully it will be the last.
Hat tip: Virginia Mimmack
[ Monday, June 27, 2016 ]
Is the theft of NFL Players' medical records from a Redskins' trainer a HIPAA violation?
Jeff [5:50 PM]
Almost certainly not. But it is likely a violation of some state data protection laws, and almost certainly raises a data breach notification obligation.
A Redskins trainer left a backpack containing paper medical records, as well as a laptop with electronic medical records, of current and former NFL players in a locked car; the car was burgled and the backpack (and its contents) stolen. The laptop was password-protected, but the electronic data was not encrypted. This is not good. But it's also unlikely to be a HIPAA violation, mainly because it's unlikely there is a HIPAA covered entity involved.
No breach without a CE or BA:
The NFL itself, and the Washington Redskins specifically, are not health plans, health care providers, or health care clearinghouses. Therefore, they are not "covered entities" (or CEs) under HIPAA. The trainer is most likely a health care provider, which would make him/her a CE if he/she engages in electronic transactions of the sort regulated by HIPAA. These would be submitting billing to insurance, checking for insurance coverage and benefits, tracking payments, etc. I would be extremely surprised if he/she did so, since I assume he/she is paid by the Redskins for services provided.
It also does not seem likely that the NFL, the Washington Redskins, or the trainer were acting as a "business associate" (or BA) of some other CE in connection with the lost data. Without a CE or a BA, there can't be a HIPAA breach.
One possible caveat: the Players Association is all over this story. It is possible that the Players Association is structured in such a way that it (or a component of it) is a CE by virtue of being a health plan. I doubt that, since I doubt the PA pays or provides for medical care; I assume the teams pay for their own players' medical care. But that unlikely event is the only way I see HIPAA being involved here.
Employment records aren't PHI:
Even if there was a BA or CE involved here somehow, there's still the question of whether the data lost was "protected health information" (or PHI) under HIPAA's definition. The definition of PHI is extremely broad, and it's likely that this information could be PHI, but the definition does have an exception that might be applicable here. Namely, "employment records held by a covered entity in its role as employer" are specifically excluded from the definition of PHI. We don't know for sure, but it seems like the lost data might be "employment records."
Encryption is not required:
The article states, "Storage of data on unencrypted devices does not adhere to both local and federal medical privacy standards, including HIPAA, making the breach a potentially costly one for the NFL." Not true. Don't get me wrong; encryption is best practice, and I highly recommend it, not least because HIPAA's breach notification provisions are inapplicable if the lost data is encrypted. But encryption is not required, and therefore storing data on unencrypted devices does not fail to meet a standard under HIPAA. Some states (MA for sure) have state-level encryption requirements, but it's impossible to tell from the article whether those state statutes would be implicated, or whether the state regulators would be able to commence an enforcement action.
State laws may apply:
Depending on where the theft occurred, the states of residence of the affected individuals, the location of the responsible parties (are the Redskins actually in DC or in Maryland or Virginia?), and the location of the theft (Indianapolis), various state laws may apply. Some states have laws requiring reasonable security for personally-identifiable information; most have laws requiring the notification of individuals whose data has been breached. Those laws vary greatly, but it's pretty safe to say some would be implicated by this situation. Some do not require notification if there is little or no risk of harm from the breach, and it's possible that the NFL and the Redskins could come to that conclusion based on the fact that the data was password-protected; that wouldn't cure the problem with the paper data, though. Regardless, that's a fact-specific matter based on the reasonable conclusion of the parties involved. I would expect the NFL and/or the Redskins to notify all individuals involved, regardless of whether it's legally required or not.
[ Monday, June 06, 2016 ]
Jeff [11:30 AM]
(wrote this back in April, don't know why it didn't post): NY Med HIPAA Fine:
NY Med was a reality TV show filmed in NY hospitals. It's relatively famous because NY Presbyterian Hospital and ABC are being sued by the family of a man who was hit by a garbage truck and was dying in the hospital; the film crew filmed his plight, without his authorization. The show pixelated the man's face and included no identifying information, but some family members were able to determine that it was him, and they're now suing the hospital and ABC. It's unlikely that anyone would have been able to determine who the dying man was if not for his family's publicizing the case by filing suit. I believe that ABC has been released from the suit, but the suit goes on against the hospital.
OCR has now fined NY Presbyterian 2.2 million dollars
for this case and for a similar issue involving another individual.
University of New Mexico Hospital breach:
Jeff [11:17 AM]
A change in software led to invoice information on about 3,000
patients being sent to 18 incorrect addresses. Definitely PHI included in the improper disclosures, but none of the traditional identity theft markers like social security numbers.
ProMedica Michigan breaches: Two hospitals in Michigan
Jeff [10:57 AM]
operated by ProMedica are under investigation by HHS for breaches apparently involving employee snooping. Seven employees were involved; 3 were fired, the other 4 disciplined. About 3500 patients were impacted. None of the files were printed, which makes large-scale identity theft less likely (of course they could've been saved to a flash drive, but I'm assuming they ruled that out too). That makes it more likely to be pure nosy snooping (although the number is pretty high -- can't imagine that each snooper would know 500 people in the hospital), improperly-restrained curiosity, or some less-nefarious intent, such as wanting to see if hospital policies are being applied evenly.
Jeff [10:47 AM]
[ Tuesday, May 31, 2016 ]
Jeff [3:32 PM]
HIPAA covered entities must be careful in responding to Yelp reviews
, good or bad (but especially bad). Just because the patient has posted his/her own PHI, doesn't mean the doctor, dentist, or other provider can.
You can, however, respond in a way that doesn't bring up a specific patient or discuss an individual's specific PHI. If a Yelp poster complains that "this provider did X," the provider can post that "my office policy is to never do X, and that I looked at all my files for all patient visits in the last year and could not find an instance of anyone doing X." But you shouldn't say you looked into that patient's files and didn't find X; in fact, you shouldn't even acknowledge that the poster is your patient.
Yes, it's unfair. Yes, the provider's hands are tied. But that's the way it goes.
[ Tuesday, May 24, 2016 ]
Jeff [11:05 AM]
Good News for Data Breach Defendants:
Jeff [9:56 AM]
a Pennsylvania appeals court
has upheld a trial court's determination that the class action route is inappropriate for litigation regarding data breaches. The claims are too individual, particularly where damages are so uncertain and hard to define.
[ Monday, May 23, 2016 ]
Often mentioned possibility comes to fruition: Kansas Heart Hospital
Jeff [1:27 PM]
got hit by a ransomware attack last week and paid the ransom to get their data back. The hackers returned for a second bite, but this time the hospital is not paying. Presumably "baby got backups."
Actually, this is not a re-encryption, but rather a refusal to give up the full decryption key in response to the payment of the ransom.
I've heard of this as a possibility, but this is the first time I've heard of a healthcare provider getting hit with a second ransom demand. In every other incident I'm aware of, the hackers did provide the encryption key. Of course, in some instances, not all of the data is recoverable; the process of encryption might overflow usable memory, so that the decrypted data is corrupted or incomplete, so even if the hackers give the correct key (or all the correct keys), it's possible some data would be lost. In this case, it sounds like the hackers intended to go for a second bite.
This is the example, though, that should make you think long and hard about paying the ransom, even if it's relatively small.
[ Wednesday, April 27, 2016 ]
Jeff [3:15 PM]
Sorry, @HHSOCR, this FAQ
is a thousand times wrong. NOTHING in HIPAA prevents a covered entity from allowing a media company to access PHI, as long as the use or disclosure in connection with that access is permitted by HIPAA. And nothing at all prohibits a covered entity (or a media company working on its behalf) from disclosing truly de-identified PHI (which, by definition, IS NOT PHI!!).
You can argue about whether it's truly de-identified; that's a fair argument. But there is no such blanket prohibition in HIPAA to support the statements in the FAQ.
Of course, you could draft a regulation to do just that. But that requires actually following the law and the Administrative Procedure Act: publishing a proposed regulation; soliciting, receiving, and considering public comment; and publishing a final regulation. Sure, it's more work than firing off an FAQ. But it's the law. It's the way law is made.
Executive fiat is anathema to the American concept of government. Stop it.
[ Friday, April 22, 2016 ]
WOW! Lots (and I mean lots, or I'm just lucky) of physicians, dentists, hospitals, vendors, and others seem to be getting notices from OCR today indicating that they are on the audit list for the Phase II audits. Is today "match day" or is this just a huge coincidence?
Jeff [6:01 PM]
[ Thursday, April 21, 2016 ]
Raleigh Orthopaedic Update: @PogoWasRight was on the case
Jeff [1:55 PM]
back in 2013 when it originally happened. Sure enough, the BA was crooked and instead of converting the films to digital, dissolved the films for their silver content. Don't know if there was any improper disclosure, though -- if the vendor simply melted the films down, there would be no further disclosure. Still a stinging result for the practice -- they were victimized by a scam artist and lost all their x-rays, and then had a big HIPAA fine on top of it all. It's not clear to me that having a BAA would've prevented the incident at all.
Anyone know any more about this than what OCR is saying
? Their press release only says that the practice failed to have a BAA in place. It does not say that the business associate stole the data, improperly disclosed it, or anything else. No indication of any harm at all, just failure to sign the BAA? Seems extreme to fine someone $750,000 for that. . . .
Jeff [10:09 AM]
[ Wednesday, April 20, 2016 ]
According to Report on Patient Privacy, 64% of healthcare companies have cyberinsurance. But most breaches cost less than the deductible. Well, that's what insurance is for, folks: not the daily costs, but the big one.
Jeff [8:24 AM]
Raleigh Orthopaedic Clinic: Lack of a BAA
Jeff [8:10 AM]
results in a $750,000 fine.
Hat tip: the inestimable Dissent Doe (@PogoWasRight)
[ Tuesday, April 19, 2016 ]
US-CERT Ransomware Alert:
Jeff [10:26 AM]
The United States Computer Emergency Readiness Team at the US Department of Homeland Security has issued an Alert
about ransomware. Best takeaways seem to be things I've been saying all along: backups (good, fresh, tested, and remote); patching; virus protection; access restriction; phishing protection (training to not click on links). One thing I've been preaching that they don't touch: restricting internet-facing computers and reducing open ports.
I'll admit to two additional tips I haven't been harping on that are very worthwhile. The first is application whitelisting. This is a control under which only approved applications may run on the network or on connected servers and computers. It can prevent a lot of potential problems, not just ransomware. When a bad program infects your system and tries to start encrypting files, the program won't be on the whitelist, so the operating system won't let it run. Of course, we can anticipate that hackers will adapt their encryption programs to run within commonly whitelisted programs, or write them to mirror such programs so they appear to be whitelisted, but whitelisting will certainly prevent some attacks, and is a good response in the here and now.
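The core logic of whitelisting is simple enough to sketch in a few lines. This is an illustrative toy (the function and variable names are my own, not any real product's API), but real tools such as Windows AppLocker work on the same deny-by-default principle:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """SHA-256 fingerprint of a binary's contents."""
    return hashlib.sha256(data).hexdigest()

def may_run(executable_bytes: bytes, allowlist: set) -> bool:
    """Permit execution only if the binary's hash is on the allow-list.
    Anything not explicitly approved -- including a freshly dropped
    ransomware payload -- is refused by default."""
    return sha256_of(executable_bytes) in allowlist
```

The key design point is the default: instead of blocking known-bad programs (a blacklist, which a new ransomware variant evades automatically), everything is blocked unless it was approved in advance.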
The second tip, which I've seen elsewhere, is to prohibit (or at least limit) the running of macros. You know I'm not a "1's and 0's" guy, so I'm not sure of the mechanics, but many viruses can hide in macros, so that a PDF or Word document can be the carrier of the virus. While many people know not to click on links to unknown websites or open .zip or .exe files, many think that Word and PDF files must be harmless. However, any file with a macro might be a virus carrier.
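For the curious, this is typically pushed out through Group Policy. As one version-specific illustration (assuming Word 2016, i.e. Office 16.0, and the standard `VBAWarnings` policy value; check your own Office version before using anything like this), the underlying registry setting looks roughly like:

```
Windows Registry Editor Version 5.00

; Word 2016 macro security. 4 = disable all macros without
; notification; 3 = allow only digitally signed macros.
[HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Word\Security]
"VBAWarnings"=dword:00000004
```

Setting the value to 3 (signed macros only) is the common middle ground for offices that genuinely need some macros to run.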
Finally, I could complain about how slow US-CERT is ("when seconds matter, help is only minutes away"), since we've been fighting ransomware like a wildfire for months. But at least they have responded, and I've got to admit that I got something out of it (app whitelisting) that I'll use in the future.
[ Monday, April 18, 2016 ]
Cloud computing and HIPAA:
Jeff [5:33 PM]
can you be HIPAA compliant if you use the cloud? Of course you can.
You can also violate HIPAA by using the cloud. It's a tool; how you use it determines whether you're compliant.
[ Friday, April 15, 2016 ]
Jeff [11:41 AM]
Five thoughts that you can tease out of recent articles like this one
for dealing with cybersecurity threats:
- Old Software. If possible, stop using old outdated software. Sometimes you can't help it, because it's the only software that works for what you do, you can't afford to move to a new platform, etc., but if you can update your software, do so. If you're using Windows XP, you deserve what you get (sorry, but that's the cold hard truth).
- Patches. Whether you're using new or old software, keep your patches updated. All software has vulnerabilities, since the developers can't think of every possible weakness; that's why Zero Day exploits exist. Having a vulnerability isn't bad unless it's exploited, and most vulnerabilities won't be exploited on any given day. But over an unlimited number of days, every vulnerability will be, so you've got to limit the days the vulnerability is open. Bad patch management is a consistent feature of every ransomware incident I've been involved in.
- Connectivity. Limit connectivity whenever possible. You can't run your business if your systems can't talk to each other and to the outside world. The safest website in the world is one nobody can access; it's also the most worthless. So you need some connectivity; you need some internet-facing computers. But the more "doors" you have to the outside world, the more you need to protect, and the more that can be exploited. If you don't think you'll need that door, lock it. If you're sure you won't need it, brick it over (sort of like the concept of epoxying USB ports to keep employees from plugging in infected flash drives).
- Backups. Have good, usable backups. This means two things. First, you need to be generating backup copies of your important data as often as you can, or at least have the ability to recreate any changes made since the last backup. This may require re-keying data, so consider that when calculating recovery time. Also, consider retaining older versions of backups, to account for the possibility that the backup you've just made contains compromised data; for example, if an encryption program is already running and you don't know it, you could make a backup copy of encrypted data, which you could then save over the last good version of your data. Storage is cheap, so if you're doing daily backups, you should also keep a version from the prior week's end, a copy from the prior month's end, etc. Second, make sure those backups can't be reached from your live systems. Again, in recent ransomware cases I'm aware of, the programs look for data files with names like .bac or .bak, or that include the word backup in them. They will encrypt your backups if they can get to them, so make sure they can't. If you have the data backed up, even if your files get encrypted, you can recover without paying any ransom by wiping your system clean and re-installing the backup data.
- Training. As Morgan Wright said at a presentation yesterday, training is like bathing: it's not a one-and-done proposition. But balance it: don't let "alarm fatigue" infiltrate your training efforts and reduce their effectiveness, but train often enough that your staff knows what the problems are, what the current threat vectors are, and what to be on the lookout for.
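The backup-retention idea above (keep recent dailies, plus older week-end and month-end copies) can be sketched as a small rotation function. This is a hypothetical illustration of the scheme, not production backup code; the function name and parameters are my own:

```python
from datetime import date

def backups_to_keep(backup_dates, daily=7, weekly=4, monthly=12):
    """Grandfather-father-son style rotation: keep the newest `daily`
    backups, the last backup of each of the newest `weekly` ISO weeks,
    and the last backup of each of the newest `monthly` months.
    Everything else is a candidate for pruning."""
    dates = sorted(set(backup_dates), reverse=True)   # newest first
    keep = set(dates[:daily])
    by_week, by_month = {}, {}
    for d in dates:
        # setdefault keeps the first (i.e., newest) date seen per bucket
        by_week.setdefault(d.isocalendar()[:2], d)    # (ISO year, week)
        by_month.setdefault((d.year, d.month), d)
    keep.update(list(by_week.values())[:weekly])
    keep.update(list(by_month.values())[:monthly])
    return keep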
Something to think about.
[ Thursday, April 14, 2016 ]
Jeff [3:13 PM]
[ Tuesday, April 12, 2016 ]
Florida Department of Health Breach:
Jeff [10:25 AM]
The medical information of over 1000 patients
at seven Department of Health clinics in Palm Beach County was compromised, but it's unclear how. Since it was the FBI that notified the Department of Health, it's entirely possible that the Department doesn't yet know what happened or how the data got out there.
[ Friday, April 08, 2016 ]
OCR's Second Round of Audits:
Jeff [5:20 PM]
what might they look like? A look at the Audit Protocols
should give you a pretty good idea of the specific questions they're going to ask. Be forewarned, there are a lot of questions.
[ Thursday, April 07, 2016 ]
Jeff [1:54 PM]
for app developers, there's always a question of whether your app is a medical device that needs FDA approval, whether it's subject to HIPAA, or whether other laws apply. The FTC has set up this handy tool
to help you figure out what land mines you need to avoid.
Of course, try not to cross the "creepy" line.
Jeff [1:46 PM]