What happens when privacy violations are committed by devices inside us?
February 16, 2017 8:56 AM

Ross Compton of Ohio was charged with arson based partly on data collected from his pacemaker. A Florida woman's claims of sexual assault were undermined by data from her FitBit. Gizmodo explores what happens when privacy violations are committed by our personal electronics, including implanted medical devices.
posted by Existential Dread (41 comments total) 18 users marked this as a favorite
 
I still need to get around to writing this scifi story I've had in mind, set a couple hundred years in the future, about the trials and utter devastation of a poor guy with a medical condition that causes all his implants to be rejected: cut off from the world, he dies of utter isolation.
posted by sammyo at 9:37 AM on February 16, 2017


when privacy violations are committed by our personal electronics

This seems an odd way to frame this when, in the Compton case, the police obtained a valid search warrant for the pacemaker data. That's how the system is supposed to work.
posted by Sangermaine at 9:48 AM on February 16, 2017 [7 favorites]


I don't want to end up tarred and feathered as someone who declares accusations of rape as false, but is wearing a FitBit and having the contents examined truly a violation of privacy? The framing of this post implies that somehow a trespass happened through a nefarious backdoor in the tech, when really it was the owner/user of the device who was just being unaware of what the device actually did. That's not a violation of privacy -- that's leaving the curtains open while you are naked.
posted by hippybear at 9:59 AM on February 16, 2017


I mean, there's an argument to be made that a warrant was required before examining the FitBit, but that's an invasion of privacy on behalf of law enforcement or whatever agency was looking at the FitBit, not something the FitBit did.
posted by hippybear at 10:00 AM on February 16, 2017


This seems an odd way to frame this when, in the Compton case, the police obtained a valid search warrant for the pacemaker data. That's how the system is supposed to work.

The system was not set up to deal with situations where tiny devices that leave data trails are physically implanted in our bodies and we can't remove them without dying.
posted by Hypatia at 10:06 AM on February 16, 2017 [13 favorites]


Seems like your pacemaker, or any implantable device, should be covered under a right to avoid self-incrimination. If you can't be forced to testify in a trial against yourself, I don't see how your own body parts can be compelled to testify against you.
posted by COD at 10:10 AM on February 16, 2017 [13 favorites]


I don't see how your own body parts can be compelled to testify against you.

Your DNA/fingerprints can be used against you.
posted by dazed_one at 10:13 AM on February 16, 2017 [2 favorites]


"Your DNA/fingerprints can be used against you."

Which is also a compelling reason not to use fingerprints as a password to any system. Using biometrics as a password is a huge security no-no, and yet people do it all the time in the name of simplicity.
posted by mystyk at 10:17 AM on February 16, 2017 [2 favorites]


Which is also a compelling reason not to use fingerprints as a password to any system. Using biometrics as a password is a huge security no-no, and yet people do it all the time in the name of simplicity.

I was just suggesting that if biological residue from a crime can be used to convict someone, then biological data collected by a device can as well. I don't see enough of a difference between footage collected by a body camera and data collected by a FitBit to justify the use of one to convict without justifying the use of the other in similar circumstances. When the device is inside oneself like a pacemaker, the situation is somewhat different, but is it to enough of a degree? I'm not sure.
posted by dazed_one at 10:25 AM on February 16, 2017 [2 favorites]


Calling these "privacy violations" seems a bit loaded, or at least odd. I don't think you really have any right to or expectation of privacy when committing a crime, particularly if you've already consented to that degree of data collection under more normal circumstances. I mean, that's sort of a weird idea: you consent to Fitbit recording your steps throughout your normal day, but then you get upset because it did the same thing while you were engaged in criminal activity? Aside from "because it got me caught and I'm mad about getting caught", I don't really see a justification for being angry there.

There are some pretty legitimate privacy concerns with modern devices, particularly "Internet of Things"-type devices (which are the worst), or the way in which basically any device that broadcasts a unique ID (e.g. cellphones, Bluetooth, RFID tags, etc.) can be used for deanonymization and large-scale distributed tracking over time/space, but looking at cases where technology inadvertently uncovered criminal acts through the lens of the perpetrator's privacy is... an interesting framing choice. I'm not sure how well that framing will resonate with most people if your goal is, as it is for many in the technology industry, raising awareness of what can and should be done to harden these systems against abuse.
posted by Kadin2048 at 10:25 AM on February 16, 2017 [2 favorites]
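
[The tracking risk from static broadcast IDs mentioned above is easy to sketch: any set of passive receivers that logs (location, time, ID) tuples can reconstruct a per-device movement trail with no cooperation from the device's owner. A minimal illustration in Python; the sightings data and identifiers are invented:]

```python
from collections import defaultdict

# Hypothetical sightings: (sensor_location, timestamp, broadcast_id),
# collected passively by independent receivers around a city.
sightings = [
    ("cafe",    "09:00", "AA:BB:CC:11:22:33"),
    ("transit", "09:40", "AA:BB:CC:11:22:33"),
    ("office",  "10:05", "AA:BB:CC:11:22:33"),
    ("cafe",    "09:10", "DE:AD:BE:EF:00:01"),
]

# Grouping by the static ID yields a movement trail per device.
trails = defaultdict(list)
for location, time, device_id in sightings:
    trails[device_id].append((time, location))

for device_id, trail in sorted(trails.items()):
    print(device_id, sorted(trail))
```

[This is why modern Bluetooth and Wi-Fi stacks increasingly randomize their broadcast addresses: a rotating ID breaks exactly this kind of join.]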


Calling these "privacy violations" seems a bit loaded, or at least odd.

Well you're basically creating a class of people (ie those who need pacemakers) who can be tracked by the state. If the state required you to carry something that meant they could track you wouldn't you think that was a privacy violation? Saying they consented to it is largely meaningless since no one can sensibly be expected to decline something on which their life is dependent.
posted by biffa at 10:29 AM on February 16, 2017 [13 favorites]


Your DNA/fingerprints can be used against you.

Indeed. "[E]ven though the act may provide incriminating evidence, a criminal suspect may be compelled to put on a shirt, to provide a blood sample or handwriting exemplar, or to make a recording of his voice." US v. Hubbell. As Oliver Wendell Holmes, Jr. put it:

"[T]he prohibition of compelling a man in a criminal court to be witness against himself is a prohibition of the use of physical or moral compulsion to extort communications from him, not an exclusion of his body as evidence when it may be material." Holt v. US.

Later cases developed "communications" to mean specifically communications that are "testimonial in character." The Fifth Amendment right to silence is strong, but narrow. It is a right not to communicate the contents of one's mind. It is not a right to refuse to produce or stop the production of non-testimonial evidence.
posted by jedicus at 10:30 AM on February 16, 2017 [4 favorites]


I mean, there's an argument to be made that a warrant was required before examining the FitBit

The article mentions that the owner provided their username and password to the Fitbit account, which presumably is equivalent to a consent-to-search. (If the username and password were obtained coercively, then I could see a different argument for making the resulting information inadmissible, but that's not really a privacy issue so much as it's a police-powers / investigatory process one. "Consent" in the context of being investigated for a crime is not straightforward, but that doesn't seem to be the issue at hand.)
posted by Kadin2048 at 10:31 AM on February 16, 2017


I don't think you really have any right to or expectation of privacy when committing a crime

Are you an American? Please tell me you're not an American.
posted by praemunire at 10:32 AM on February 16, 2017 [4 favorites]


The article mentions that the owner provided their username and password to the Fitbit account, which presumably is equivalent to a consent-to-search.

Yes, but the framing of this post is somehow accusing the device in question (whether it be a pacemaker or a FitBit) as somehow violating their privacy. I think this is a false framing.
posted by hippybear at 10:33 AM on February 16, 2017


I can see an argument for preventing implanted medical device data from being used against you based on self-incrimination, but the FitBit is no different than a cellphone.
posted by grumpybear69 at 10:34 AM on February 16, 2017


With what I've heard about the security and general practices at various Internet of Things companies, yeah, I'm pretty fucking hesitant about data from a device designed with one goal in mind being considered valid for use in criminal trials. How important is it if your exercise data is a little off, day to day? Probably not that much. Oh wait, what if you're on trial for murder?

Read up on pacemaker security sometime. How hard is it to fake those logs?
posted by ODiV at 10:36 AM on February 16, 2017 [1 favorite]


In case 1: Police got a warrant.
In case 2: The accused handed over her account username/password voluntarily.

Unless you're going to suggest the latter was coerced, then the violation of privacy charge rests on shakier ground, or at the bare minimum is clearly not a violation of due-process. If the police obtained the data improperly, that's one thing, but I'm not seeing evidence of that. Of course, I'm in favor of strong privacy policies, in part to force police to do things the right way no matter what.

Now, the one substantive difference is that one was a voluntarily-kept device while the other was critical to the individual's life. They didn't have a choice about having it on them, and couldn't take it off or out beforehand. I'm not sure what the solution would be for medical devices calling home. Maybe they shouldn't phone home about everything, but only under alert-worthy circumstances, with the rest stored locally rather than transmitted?
posted by mystyk at 10:36 AM on February 16, 2017
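
[The alert-only policy suggested above can be sketched in a few lines: readings stay on the device unless they cross an alert threshold. This is a hypothetical illustration, not any actual device's telemetry scheme; the thresholds and function name are invented:]

```python
# Hypothetical alert-only telemetry policy: normal readings are kept
# locally, and only alert-worthy events are transmitted upstream.
ALERT_BPM_LOW, ALERT_BPM_HIGH = 40, 150

def partition_readings(readings):
    """Split heart-rate readings into (transmit, store_locally)."""
    transmit, local = [], []
    for bpm in readings:
        if bpm < ALERT_BPM_LOW or bpm > ALERT_BPM_HIGH:
            transmit.append(bpm)   # alert-worthy: phone home
        else:
            local.append(bpm)      # routine: stays on the device
    return transmit, local

transmit, local = partition_readings([72, 180, 65, 35, 88])
# transmit == [180, 35], local == [72, 65, 88]
```

[The trade-off is that the locally-stored history is still physically present and could still be extracted under a warrant; the policy only limits what leaves the device routinely.]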


Which is also a compelling reason not to use fingerprints as a password to any system.

This is why my Android phone (a Nexus 6P) at least requires me to actually input the code on restart or if it's been a sufficiently long time since it's been unlocked. If you're privacy-minded, it's not impossible to still use some of these things, but you have to be aware of exactly what they're capable of. There is a serious problem with people not actually being aware of that, though.

What scares me more than "devices are collecting information" is the idea that we're at the point where someone might get convicted of arson based on "we don't think he could have done all this stuff during the time indicated because of this information", because I don't think law enforcement in basically any part of the US is funded well enough to do this kind of data analysis in a way I'd trust much more than speculation.
posted by Sequence at 10:42 AM on February 16, 2017 [1 favorite]


There is a serious problem with people not actually being aware of that, though.

There is a strong argument to be made for a spiral curriculum in what I will call "Internet And Device Literacy", beginning in early elementary school and becoming more complex and nuanced through high school graduation, which teaches what devices do, how they work, what choices can be made, how to engage online, how to be intelligent about what you engage with, and so on.

Of course, if such a thing were to be taught, it would slowly lose funding and space to standardized testing, much like how civics (the basic functioning of the US government) isn't taught anymore, either.
posted by hippybear at 10:46 AM on February 16, 2017 [1 favorite]


Weirdly I had a passing horrible thought where the states start forcing people on welfare to wear fitbits to track them and make sure they're going to find work. Or people on disability forced to wear them to prove they aren't too active. Sometimes I hate my brain's knack for playing the game of, "Well, how much worse can we make this story?"
posted by 80 Cats in a Dog Suit at 10:48 AM on February 16, 2017 [2 favorites]


Well you're basically creating a class of people (ie those who need pacemakers) who can be tracked by the state. If the state required you to carry something that meant they could track you wouldn't you think that was a privacy violation? Saying they consented to it is largely meaningless since no one can sensibly be expected to decline something on which their life is dependent.
biffa

But the state still has to go through the normal warrant process, which they did in the case in question. Of course different people will have different circumstances that open them in differing ways to potential monitoring, but all people (in theory) still have the protections of the Fourth Amendment, so what exactly is the issue? If the state wants pacemaker data they have to get a warrant for it just as they would need a warrant to search your office for old-fashioned written files.

The system was not set up to deal with situations where tiny devices that leave data trails are physically implanted in our bodies and we can't remove them without dying.
Hypatia

But it was. It wasn't set up specifically to address technologies that the Framers couldn't even imagine in the 1780s, but the system does provide checks and safeguards for searches in the form of the warrant process. How or why is the data from a pacemaker a special class that can't be searched even through proper procedure, which requires the state to show the need for and probability of finding the information on it?
posted by Sangermaine at 11:15 AM on February 16, 2017


With what I've heard about the security and general practices at various Internet of Things companies, yeah, I'm pretty fucking hesitant about data from a device designed with one goal in mind being considered valid for use in criminal trials. How important is it if your exercise data is a little off, day to day? Probably not that much. Oh wait, what if you're on trial for murder?

Read up on pacemaker security sometime. How hard is it to fake those logs?

ODiV

I wanted to address this separately because it's a different issue in this matter. This isn't an argument against using this type of data at all, but a matter of ensuring that such data is properly analyzed and utilized. Unless this type of data is so inherently difficult to use that there's no way to properly utilize it, methods can be developed to safeguard against the issues you raised just as with DNA or other types of evidence. For example, to address your last question about falsified logs, there are already methods to guard against the falsification of electronic records.
posted by Sangermaine at 11:23 AM on February 16, 2017 [1 favorite]
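
[One common safeguard of the kind alluded to here is a tamper-evident hash chain: each log entry commits to the hash of the previous entry, so retroactively altering any record invalidates every hash after it. A minimal sketch in Python, assuming invented record strings; this is an illustration of the general technique, not any actual pacemaker's or Fitbit's scheme:]

```python
import hashlib

GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

def append_entry(log, record):
    """Append a record whose hash covers the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    digest = hashlib.sha256((prev_hash + record).encode()).hexdigest()
    log.append({"record": record, "hash": digest})

def verify(log):
    """Recompute the chain; any edited record breaks all later links."""
    prev_hash = GENESIS
    for entry in log:
        expected = hashlib.sha256((prev_hash + entry["record"]).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, "2017-02-16T09:00 heart_rate=72")
append_entry(log, "2017-02-16T09:05 heart_rate=95")
assert verify(log)

log[0]["record"] = "2017-02-16T09:00 heart_rate=60"  # retroactive edit
assert not verify(log)
```

[A chain like this only proves the log wasn't altered after the fact; if the device itself recorded bad data in the first place, the chain verifies happily. That distinction matters for the evidence-quality concerns raised upthread.]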


Are you an American?

I am (although I don't think that's especially relevant, tbh; we're all dogs on the Internet); without sidetracking the discussion too much, I think it's worth mentioning that the "right to privacy" as a matter of legal theory is not, uh, entirely uncontroversial, although like a lot of people who find it to be questionable as a matter of theory, as a matter of practice I'm firmly on the stare decisis side of the fence, since unwinding it as a practical construct, on which a small but very important number of rights depend, would be too damaging at this point simply for the benefit of theoretical correctness. But even taking that into consideration, much of what people think of, at least in my experience, as the "right to privacy" is really not related. E.g. the rules around search-and-seizure don't come from the "right to privacy" in the classical sense (i.e. from Griswold or the earlier Warren and Brandeis framework) but from limits on police powers which are necessary to prevent governmental overreach and abuse, and have as their goal not "privacy" but the distinct goal of "security" as enumerated in the Fourth Amendment. (You can, of course, derive an unenumerated right to privacy via the penumbra argument, such as used in Olmstead and Griswold, from the Fourth Amendment, but doing so doesn't really change search-and-seizure or other areas of law that flow directly from the Fourth Amendment.) Similarly, while one could also derive a similar rule against self-incrimination as exists in modern law via either a penumbra argument or common law, it's not necessary because the Fifth Amendment enumerates it directly.

So, yeah, I am at the very least highly skeptical that a person in the commission of a crime has a significant "right to privacy" that's distinct from, say, their Fourth or Fifth Amendment-derived rights, and would provide some sort of general protection of data on electronic devices — but that's very different from saying that I don't think they don't have any rights in that situation at all, particularly as it relates to their interaction with the police and the judiciary. But I certainly understand that there's a lot of room for argument, and that reasonable people can disagree (at least on the theoretical side of things; I'm less inclined to assume good intentions on the part of those who want to overturn Griswold etc. as a practical matter) on the scope and applicability of a "right to privacy".

But this is perhaps all tangential, because as long as you don't think that engaging in a crime somehow entitles you to additional privacy rights which aren't operative when you're just walking around and not committing crimes, the Fitbit situation isn't really problematic. The device did what it was supposed to do, and I don't see it being much different from the occasional situation where someone decides it's a good idea to film themselves assaulting someone. FWIW, I don't really see a parade of plausible horribles coming out of the police extracting data from personal electronic devices even without consent, as long as they obtain a warrant as with any other search, and I actually find the nonconsensual-but-warrant-obtained case less potentially slippery than the no-warrant-but-I-swear-Your-Honor-they-consented one.
posted by Kadin2048 at 11:29 AM on February 16, 2017 [2 favorites]


hippybear: "I don't want to end up tarred and feathered as someone who declares accusations of rape as false, but is wearing a FitBit and having the contents examined truly a violation of privacy? The framing of this post implies that somehow a trespass happened through a nefarious backdoor in the tech, when really it was the owner/user of the device who was just being unaware of what the device actually did. That's not a violation of privacy -- that's leaving the curtains open while you are naked."

And the user volunteered their login information to law enforcement, further negating the read that this is some sort of evil LE hack.
posted by Samizdata at 12:18 PM on February 16, 2017 [1 favorite]


the Fitbit situation isn't really problematic. The device did what it was supposed to do, and I don't see it being much different from the occasional situation where someone decides it's a good idea to film themselves assaulting someone

In the particular case of the Fitbit, maybe, but my phone also tracks my steps, and while that can be disabled I'm not sure it asked first. You could perhaps argue this is something to take up with device manufacturers more than LE (though for medical devices I think it becomes questionable in other ways, because medical privacy is another thing).
posted by atoxyl at 12:19 PM on February 16, 2017


I wanted to address this separately because it's a different issue in this matter.

Yeah, that's definitely true. I kinda rambled off-topic there.

Unless this type of data is so inherently difficult to use that there's no way to properly utilize it, methods can be developed to safeguard against the issues you raised just as with DNA or other types of evidence. For example, to address your last question about falsified logs, there are already methods to guard against the falsification of electronic records.

Sure, but it seems like this data is being used before any methods have been developed or safeguards have been put in place.
posted by ODiV at 12:25 PM on February 16, 2017 [1 favorite]


The pacemaker seems entirely different from the FitBit. One is a medical device required to live, and the records kept within are medical records that ought to be kept confidential under HIPAA and/or doctor-patient confidentiality. The other is a personal computer.

It's not a matter of "did they get a warrant," but simply that the court should not have issued the warrant for the pacemaker.
posted by explosion at 12:26 PM on February 16, 2017


Note to self: before committing any crimes, strap cell phone and fitbit to the dog.

Prosecutor: And what were you doing on the 4th of September?
Me: I was just hanging out in my front yard chasing squirrels.
posted by 445supermag at 12:48 PM on February 16, 2017 [1 favorite]


I remember seeing an article (posted here?) about a guy who obsessively bought stuff and tried to get himself on security cameras because that was the only thing that saved him from being convicted of something very serious.

I should go find that.
posted by ODiV at 12:50 PM on February 16, 2017


and the records kept within are medical records that ought to be kept confidential under HIPAA and/or doctor-patient confidentiality. The other is a personal computer.

It's not a matter of "did they get a warrant," but simply that the court should not have issued the warrant for the pacemaker.

explosion

HIPAA specifically allows for compliance with valid court orders, warrants, or subpoenas signed by a judge or magistrate.

I don't know Ohio's laws on patient-doctor confidentiality, but I'd be surprised if it doesn't allow disclosure of medical records with a warrant (California, for example, does). P-D confidentiality isn't a magic shield against all disclosures, just non-legal/authorized ones.
posted by Sangermaine at 12:54 PM on February 16, 2017


ODiV, I don't remember that as a serious article, but as a Patrice O'Neal bit (that is, stopping at a store every 30 minutes, buying something for $5, and getting a receipt, to prove where he was and when).
posted by furtive_jackanapes at 3:50 PM on February 16, 2017


This whole FitBit thing is why I've only decided to murder people from my couch.
posted by Nanukthedog at 5:25 PM on February 16, 2017


A big problem with this type of data is that it has this air of reliability, but for the purposes it's being used, it's actually just dressing up very subjective - and potentially wrong - opinions, or being used to find (or manufacture) inconsistencies in a person's evidence.

The pacemaker data doesn't say anything about the arson on its own. It was given to a cardiologist who felt that the accused wouldn't have been able to engage in the level of activity that he claimed to have engaged in after the fire started. That's it. Another cardiologist could look at the same data and reach a different conclusion. Hopefully the accused can afford a few thousand dollars for an expert witness.

Same thing with the assault victim. The Fitbit was just used to suggest that she wasn't asleep when she said she was. It doesn't necessarily mean she wasn't assaulted. For example, she could have been mistaken about the time she was asleep. Hell, she could be a sleepwalker.

I respectfully think that those who feel the invasion of privacy is justified are grossly underestimating how this data can (and will) be misused.
posted by AV at 6:12 PM on February 16, 2017 [4 favorites]


In America some rights (to a counsel, against self-incrimination or cruel and unusual punishment) are meant to apply to everyone, guilty or innocent.

Privacy is not one of those. The state is allowed to investigate people it reasonably thinks are guilty. It's right there in the 4th amendment that you stop being secure in person and papers once there's a warrant. It'd be pretty ridiculous to be able to say "stop trying to find out what I was doing on the night Bob was killed, it could embarrass me."

I do think there's a real potential for abuse (scanning random people's fitbits to add people to "known associates" lists) but the two examples in this case aren't in that category. (And evidence quality or misinterpretation is another matter, unrelated to privacy.)
posted by mark k at 9:25 PM on February 16, 2017


I found the article and unfortunately I remembered it wrong. Herman Atkins was convicted and only exonerated much later because of DNA evidence. Could have sworn I read it here first, but couldn't find the post.

When Atkins is out of the house and realizes that he has not bought anything for a few hours, he sometimes swings by a mini-mart to make a purchase so he can get a receipt. If the store has a surveillance camera, Atkins will make sure to walk past it.

A bit of a derail, but still relevant I think. This is about tech and electronic records potentially keeping you from being convicted rather than the opposite.
posted by ODiV at 9:26 PM on February 16, 2017 [1 favorite]


Same thing with the assault victim. The Fitbit was just used to suggest that she wasn't asleep when she said she was. It doesn't necessarily mean she wasn't assaulted. For example, she could have been mistaken about the time she was asleep. Hell, she could be a sleepwalker.

The FitBit data was only one piece of evidence in what is apparently a weird and complicated story. If the data were to show that she had never gone to bed, but had instead been sauntering around the entire night, perhaps showing energy levels consistent with drinking and upending furniture, but with no evidence of trauma, then that would be a pretty big deal. Sleepwalking wouldn't produce the same kind of data.

That said, we just don't know. We don't have the data and we don't (AFAIK) have the resources to know how well (or poorly) that data has been interpreted.

...

I wonder if things like FitBit data will become bigger deals for interrogations. Officer Friendly says, "you know, we pulled the data from your FitBit, it says XYZ. Why don't we work together on a statement just to put this to bed, let's not bother with lawyers..."
posted by Sticherbeast at 10:27 PM on February 16, 2017


I mean, that's sort of a weird idea: if you consent to Fitbit recording your steps throughout your normal day, but then you get upset because it did the same thing when you were engaged in criminal activity? Aside from "because it got me caught and I'm mad about getting caught", I don't really see a justification for being angry there.

There has to be a middle ground somewhere between "I consent to this computing device tracking thing I do for this purpose" and "I consent by default to law enforcement taking this capability computers have of tracking a behavior of mine and turning it into a case against me without a warrant ahead of time, like literally this doesn't go to law enforcement until they have a really good reason to believe I did some nasty stuff before they get the warrant."

I don't know enough about either case to say that the latter was the case in the two charges in the linked article, but I know I sure as hell don't want the government digging around my Fitbit because I posted something that amounts to "fuck Trump" on Facebook, and I don't want to be paranoid about using a Fitbit because of it.

I don't currently have that right, but I should.
posted by saysthis at 11:05 PM on February 16, 2017 [1 favorite]


like literally this doesn't go to law enforcement until they have a really good reason to believe I did some nasty stuff before they get the warrant.

I'm not sure what you're trying to say here, because that's literally how warrants work. If the state believes that some information about you is needed to prosecute a crime you've allegedly committed, they get a warrant to look at the information.

That's as true of your bank information as it is with your Fitbit.
posted by Sangermaine at 11:37 PM on February 16, 2017 [1 favorite]


A big problem with this type of data is that it has this air of reliability, but for the purposes it's being used, it's actually just dressing up very subjective - and potentially wrong - opinions, or being used to find (or manufacture) inconsistencies in a person's evidence.

These aren't issues unique to this type of data, they've been a known problem with all sorts of evidence, and there have been a variety of responses to these issues.

Same thing with the assault victim. The Fitbit was just used to suggest that she wasn't asleep when she said she was. It doesn't necessarily mean she wasn't assaulted. For example, she could have been mistaken about the time she was asleep. Hell, she could be a sleepwalker.

It's not up to the alleged assailant to prove his innocence, it's up to the state to prove his guilt. The Fitbit data at least partially undermines her testimony. There could be valid explanations for the apparent inconsistency, as you suggest, but the Fitbit data served to harm the state's assessment of their ability to win the case (in combination with other circumstantial evidence further weakening her story).

It's not about showing she wasn't assaulted, but being unable to prove in court that the assault happened.
posted by Sangermaine at 11:44 PM on February 16, 2017


Well, it's a little different here because there is no particular accused assailant being investigated. She's apparently being charged with making false statements to police and for essentially creating a fake crime scene: now it's up to the state to prove in court that she actually did all this.

It sounds like the FitBit data was part of her first narrative unravelling. We in this thread don't know why the police are taking the otherwise unusual step of constructing an actual case against her: presumably, there is other evidence that was not recited in the articles. I don't know how FitBit data alone could be used to affirmatively show that she herself had been upending tables, etc.

Either way, she was the one who had given the police both consent and the user/pass for the device. No warrant was necessary. In and of itself, this is not that much more inherently novel than if she had had a camera recording herself that night, which she at first had claimed was lost, but then later handed over to the police.

Of course police in general can and do conduct illegal or improperly coercive searches all the time, but we don't know that that's the case here.
posted by Sticherbeast at 6:45 AM on February 17, 2017




This thread has been archived and is closed to new comments