Your brain: The next hacking frontier

This week, researchers unveiled worrying results about how easy it is to hack medical implants, such as brain stimulators.

The claim is that hackers are a decade or two away from being able to mess with our memories—the very essence of who we are. But neuromodulation is a promising branch of medical science, so it would be a shame if overblown worries held back the research, right?

Sci-fi it’s not, they claim. In this week’s Security Blogwatch, we’re even more scared than we were yesterday.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Thought-provoking stuff about nitrogen 

Happy Hallowmas?

What’s the craic? With a Black Mirror quip, it’s Shaun Nichols—That furious clicking you hear is Charlie Brooker frantically writing his next script:

Kaspersky Lab and the University of Oxford Functional Neurosurgery Group warn … that the brain stimulation devices used to treat disorders like Parkinson's and OCD carry with them security vulnerabilities that would potentially allow an attacker to manipulate the medical implants.

The vulnerabilities themselves are no different from those affecting other medical implants. [But] biomed companies are already looking to implants as a way to alter or recover memories to treat conditions like PTSD, and with the ability to directly affect the brain possible, an attack on a device would become far more dangerous.

Dangerous, you say? Charlie Osborne affects this explanation: [You’re fired—Ed.]

This is how hackers can wipe your memory and steal your thoughts. … It might seem like science fiction, but security woes in brain chips could make such attacks reality. … What if a hacker could … enter your mind at will, stealing information, tweaking or eradicating memories, or causing debilitating damage?

Hacking medical equipment—which can mean the difference between life and death—is not science fiction. Thankfully, there are not any known examples of such compromise at present, but this does not mean these attacks will not potentially happen.

The teams collaborated on a project which examines the security of Implantable Pulse Generators (IPGs), also known as neurostimulators, which are used to send electrical impulses to specific areas in the brain. Medical professionals use [them] to treat a range of problems … including Parkinson's, Obsessive-Compulsive Disorder, major depression, and tremors.

The latest generation of implants we currently use come with management software which can be accessed by both patients and clinicians and the systems interconnect through … Bluetooth. [Research] uncovered a range of existing attack scenarios which could be used to assault these medical devices.

It might seem like a concept born from Altered Carbon's stack technologies, but … cyberattacks might not always be limited to the physical when it comes to our health.

Okay, but what were the problems, specifically? The researchers rhetorically ask, science fiction or future threat?

[We] found existing and potential risk scenarios, each of which could be exploited by attackers. These include:
  • Exposed connected infrastructure – [we] found one serious vulnerability and several worrying misconfigurations in an online management platform popular with surgical teams.
  • Insecure or unencrypted data transfer between the implant, the programming software, and any associated networks could enable malicious tampering [which] could result in … pain, paralysis or the theft of private and confidential data.
  • Design constraints as patient safety takes precedence over security. For example a medical implant needs to be controlled by physicians in emergency situations, including when a patient is rushed to a hospital far from their home. This … means that by default such implants need to be fitted with a software ‘backdoor’.
  • Insecure behavior by medical staff – programmers with patient-critical software were being accessed with default passwords, were used to browse the internet, or had additional apps downloaded onto them. …
Within five years, scientists expect to be able to electronically record the brain signals that build memories and then enhance or even rewrite them before putting them back into the brain. … Within 20 years or so, the technology could be advanced enough to allow for extensive control over memories.

New threats resulting from this could include the mass manipulation of groups through implanted or erased memories of political events or conflicts. … Or ‘locking’ of memories (for example, in return for a ransom).

Many of the potential vulnerabilities could be reduced or even eliminated by appropriate security education. … Healthcare professionals, the security industry, the developers and manufacturers of devices and associated professional bodies all have a role to play.
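The "insecure or unencrypted data transfer" risk above is the easiest to reason about concretely: if programming commands travel between clinician software and the implant without integrity protection, anything on the path can alter them. A minimal sketch of the missing defense, authenticating each command with an HMAC so a tampered message is rejected, follows. The command format, key handling, and function names here are illustrative assumptions, not any real device's protocol.

```python
import hashlib
import hmac
import json
from typing import Optional

# Hypothetical sketch: sign each stimulation-programming command with an HMAC
# so a message altered in transit fails verification before it reaches the
# implant. The shared key would be provisioned per-device in practice.
SHARED_KEY = b"clinician-programmer-secret"  # illustrative only

def sign_command(command: dict, key: bytes = SHARED_KEY) -> bytes:
    """Serialize the command and prepend a 32-byte SHA-256 HMAC tag."""
    payload = json.dumps(command, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return tag + payload

def verify_command(message: bytes, key: bytes = SHARED_KEY) -> Optional[dict]:
    """Return the command if the tag checks out, or None if tampered."""
    tag, payload = message[:32], message[32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None  # reject: payload was altered in transit
    return json.loads(payload)

msg = sign_command({"amplitude_v": 4.4, "mode": "set"})
assert verify_command(msg) == {"amplitude_v": 4.4, "mode": "set"}

# Flip one byte in transit: verification must now fail.
tampered = bytearray(msg)
tampered[-1] ^= 0x01
assert verify_command(bytes(tampered)) is None
```

An HMAC only gives integrity and authenticity, of course; the researchers' point about unencrypted transfer also covers confidentiality, which would need encryption on top.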

So far so woolly. Can we hear from one of the actual academics? Laurie Pycroft—@Sqrrl101—obliges:

I found that being diabetic has given me a useful new hook for explaining my research. … I use a hideously insecure system for monitoring my blood glucose – I'm aware of its insecurity but use it anyway, because it offers such a massive improvement over the alternative. This helps me explain that I'm not warning about brainjacking because the tech is something to be feared. Rather, these implants are so useful that it's worth working hard to protect the users.

Extra credit exercise for attendees of my lectures: hack my implanted sensor and tell me how much my blood glucose has changed while I've been droning on. Winner gets their choice of candy or insulin.

The most concerning implications are many years away, but this is a risk that can be averted now if the stakeholders work together. … The risk to most individual patients today is negligible, though it's always difficult to strike the balance between appropriately serious warning while not exaggerating the threat.

[But] the implants aren't exactly "chips" – calling them that implies that it's a chunk of silicon embedded in the brain, whereas they're actually long wires that delve into the brain and connect to an implantable pulse generator (battery pack that contains the processor, etc.)

[Journalists] often don't quite get what effects brainjacking, and more generally benign neuromodulation, would really have. That's partly due to the difficulty of communicating complex neuroscience, though.

And Nicola Whiting—@CyberGoGiver—was in the audience this week:

It's frighteningly easy to hack implants and patient data. … I really enjoyed [the] talk … it was informative and insightful.

The potential for the "Internet of Minds" is as exciting as it is terrifying (especially given the processing power and spare capacity of the human brain compared to....say....IoT devices).

But who can look on the bright side? RichardT, for one:

But officer, I didn't break the law, somebody hacked my brain implant and made me do it!

What about the poor infosec scribblers? Let’s ask Ms. Smith:

I am going to be so ticked if the day comes when instead of still writing about ransomware hitting a company, the news will be about the memories of company personnel being locked and held for ransom. … If some people can’t regularly patch, upgrade firmware or even back up their business data, will they be able to back up … their memories?

What do people with medical implants think? Here’s sqlartist:

I have one of these implants and while I'm really not concerned, it is true that if a hacker were to get access they could cause me to have a stroke. I have a setting range of 2.2 – 6.6 and am currently set to 4.4 because of the level of pain I am in. When I went [in] for reprogramming last time, they could not set the upper bounds, so my patient programmer device could allow me to increase it to 6.0+, which is instant stroke time.
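What sqlartist describes is a missing bounds check: the clinician couldn't set a per-patient ceiling, so the patient programmer would accept a dangerous amplitude. A minimal sketch of what that check could look like in the programmer software follows; the device limits are taken from the commenter's numbers, but the function and parameter names are hypothetical.

```python
# Hypothetical sketch: enforce a clinician-set per-patient amplitude limit in
# the patient programmer, so a request above the safe bound is clamped rather
# than passed through to the implant. Names and limits are illustrative.
DEVICE_MIN_V = 2.2   # device lower bound, per the commenter's range
DEVICE_MAX_V = 6.6   # device upper bound

def apply_setting(requested_v: float, patient_max_v: float) -> float:
    """Return the amplitude actually programmed, never above patient_max_v."""
    if not DEVICE_MIN_V <= patient_max_v <= DEVICE_MAX_V:
        raise ValueError("patient limit outside device range")
    # Clamp the request into [DEVICE_MIN_V, patient_max_v].
    return max(DEVICE_MIN_V, min(requested_v, patient_max_v))

# A clinician caps this patient at 5.0 V; a request for 6.0 V is clamped.
assert apply_setting(6.0, patient_max_v=5.0) == 5.0
assert apply_setting(4.4, patient_max_v=5.0) == 4.4
```

The point isn't the three-line clamp itself; it's that the limit must be enforced on the device side, where a hacked or buggy programmer can't simply skip it.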

Wait. Pause. How sociopathic do you need to be to hack a medical implant? Pan-pan-pan, it’s Mayday:

It takes a special kind of **** to hack and otherwise mess with someone's life saving medical device.

Meanwhile, GrapeBunch jumps ahead with 11-word flash-fiction:

Help. A hacker overclocked my brain implant and it's now 2047.

The moral of the story: And you think your threat model shows there’s a lot at stake?

And finally …

Is this our most important discovery?

“It seems that almost everyone out there is making these wild promises and delivering … almost nothing. It’s all broken promises based on snake oil—empty dreams sold on science jargon by people who really don’t understand any of what they’re reporting. [But] people have forgotten what the world is actually like … they've just never experienced a world without science—a world that doesn't have supermarkets full of food.”

You have been reading Security Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi. Ask your doctor before reading. Your mileage may vary. E&OE.

Image source: affen ajlfe (cc0)

Topics: Security