Wednesday, April 16, 2014

Not Again! When Anti-Virus Updates Go Awry, Microsoft Forefront and Hospitals?

Long-time readers will remember incidents such as the 2010 event when hospitals were stuck in an endless reboot cycle after an automated update from McAfee went awry. Also see the NPR report. At the time, a hospital in Rhode Island reportedly had to stop treating certain patients because of the computer malfunction, making exceptions only for extreme cases such as gunshot wounds.

On the heels of XP going out of support, it is happening again, this time with Microsoft Forefront.

I am receiving reports from the hospital IT community that a problem with Microsoft Forefront is causing computer downtime. When a hospital uses an anti-virus product, or when a medical device integrates one, there is an unfortunate risk that the anti-virus product itself will cause a denial of service. It is more difficult to deliver patient care when the computers go down, and it disrupts clinical workflow too.

More technical details below.
Programmers are human, so it's not surprising that these problems arise from time to time. But shouldn't devices be resilient to problems that are certain to happen again? The design controls of a medical device should ensure the device remains safe and effective even if the anti-virus product malfunctions. This is a key reason why I believe in analog, non-software methods to detect malware on high-confidence systems such as medical devices. Less integrated software means less complexity and less risk. Independent failure modes!

Wednesday, February 26, 2014

A Gentle Reminder to Dan Haley of Athenahealth on FDA and Software Updates

I noticed an article in the Boston Globe about an attempt to remove safety checks on certain medical device software.

"The industry asserts that excessive regulation of software changes, for instance, could hinder the continuous software updates that are required to fix bugs."

I'd like to share with Mr. Haley my now classic one page guidance document on FDA and software updates.

"'That would essentially kill the way we do business and kill our ability to continually improve our product for doctors and patients,' said Haley of Athenahealth."

Shouldn't the dialog instead focus on finding methods to avoid killing patients with unsafe software, as recommended by the Institute of Medicine?

Sunday, February 23, 2014

An Apple (Security Flaw) a Day Keeps the Doctor Away?

Unless you're living under a rock, you've probably heard of the critical security flaw across various Apple computing products, ranging from web browsers and mail programs to certain versions of MacOS and iPad/iPhone/iFoo products. Apple has started to release patches, but they are probably having a rough weekend in Cupertino. I wonder whether this flaw will change how hospital CIOs and CISOs think about BYOD in operating rooms, clinical care, electronic health record management, etc.

Today at the HIMSS symposium on Medical Device Security Risks and Challenges, I had a conversation about physicians who demand BYOD products like iPads for delivery of patient care. There is nothing fundamentally wrong with considering the benefits of BYOD, but what is wrong is blind faith and overconfidence in the trustworthiness of software. This conversation took place in the context of the critical security flaw across several Apple products, which Apple is scrambling to patch. The flaw allows a network adversary to mount a "man in the middle" attack, effectively defeating the security normally provided by SSL (in layperson's terms: that little lock symbol in your web browser). There are test sites you can visit with your web browser to check for this particular flaw. Some organizations are recommending that people not use Apple Mail or the Safari web browser on wireless networks until Apple releases a MacOS patch.

The consequences may range from invasion of privacy (a network adversary reading the mail you send and receive, and watching your web browsing) to security breaches (capture of long-term secrets, authentication cookies, and passwords transmitted from an unpatched device). What may be most disturbing is how fragile our computing systems are. A single line of code appears to have led to this flaw, effectively turning secure, SSL-protected communication into unprotected communication. Things to ponder:

  • All software has security and privacy risk. Consider the consequences when the rug is pulled out from under your feet.
  • Failures are rarely independent. A single flaw can affect multiple product lines, causing havoc with continuity plans.
  • "Reasonably secure" and "completely insecure" are indistinguishable at the surface. Manage the risk.