Deepfakes Are Not Victimless
February 13, 2023 8:23 AM

Recently, it came out that a male Twitch streamer had found and viewed pornographic deepfake videos of a number of notable female streamers, created without their consent and released on an OnlyFans-style website where the creator was receiving payment for the videos. In a follow-up piece, several of the women involved discuss how these deepfakes not only harm their public image, but also threaten their safety.
posted by NoxAeternum (33 comments total) 17 users marked this as a favorite
 
This area of regulation is so needed, on a state and national level. I don't want to take the conversation off the tracks....but...it feels like a whole new application of law to me. IANAL. It seems to call for judges with new technical expertise, in a branch of the judiciary that specializes in these types of cases. This is not a drawing. This is not a painting. This is not a parody or satire. It just feels like we (in America) don't have the tools to bring justice quickly. I feel badly for the victims.
posted by zerobyproxy at 8:42 AM on February 13, 2023 [11 favorites]


I bet if there was a deepfake site of all the male white politicians being taken to poundtown by sexy daddies, they’d find some applicable areas of law to stop this pretty quickly.
posted by The River Ivel at 8:54 AM on February 13, 2023 [59 favorites]


I bet if there was a deepfake site of all the male white politicians being taken to poundtown by sexy daddies, they’d find some applicable areas of law to stop this pretty quickly.

If any white-hats out there are looking for a career...
posted by saturday_morning at 8:57 AM on February 13, 2023 [25 favorites]


I don't want to take a conversation off the tracks....but...it feels like a whole new application of law to me.

Except that it isn't - this is the reason that name, image, and likeness (NIL) rights exist, to allow a person to control how their image is (and isn't) used, beyond a few basic exemptions. Which is why the creator of the deepfakes went to ground as soon as he was caught out - because he had been taking money for these videos, he was legally exposed. (Most likely doubly so, as I doubt he had legal rights to the base videos as well.) But the problem with going after the creator is that this is playing whack-a-mole. What we need is to make hosting deepfakes legally fraught, because that is what will stem things. And the problem there is that trying to do so runs headlong into one of the greatest of the internet's sacred cows.

And that's the problem.
posted by NoxAeternum at 9:05 AM on February 13, 2023 [30 favorites]


It makes me sad that the existence of these is already such a given thing in some internet subcultures that the main scandal for many fans was that their specific streamer looked at deepfakes of someone they knew. Could just be me, but because of his presence in the story it feels like less space is made to focus on the perpetrators.


I bet if there was a deepfake site of all the male white politicians...

I can imagine the absolute frenzy when discovering real, actionable fake news. Like the dog that caught the car. But also, this absolutely exists for some of them. Maybe not computer/AI-aided art, but certainly very vulgar art featuring our favorites.
posted by shenkerism at 9:08 AM on February 13, 2023 [1 favorite]


I bet if there was a deepfake site of all the male white politicians being taken to poundtown by sexy daddies, they’d find some applicable areas of law to stop this pretty quickly.

Dude, some of those male white politicians are making their own videos that show them killing other politicians. This probably wouldn't be the win we want it to be.
posted by EmpressCallipygos at 9:11 AM on February 13, 2023 [13 favorites]


I do not doubt that I will live to see deepfaked evidence used successfully in a legal conviction. We’re entering very scary waters on a number of levels.
posted by AdamCSnider at 9:17 AM on February 13, 2023 [7 favorites]


And I'll keep asking this question: How does truth work?
posted by elkevelvet at 9:40 AM on February 13, 2023


I do not doubt that I will live to see deepfaked evidence used successfully in a legal conviction. We’re entering very scary waters on a number of levels.

That seems like a lot of effort given how easy it already is to get bogus convictions (of ordinary people at least).
posted by srboisvert at 9:41 AM on February 13, 2023 [3 favorites]


It's only just started. We're not ready for what's coming.
posted by tclark at 9:43 AM on February 13, 2023 [3 favorites]


I bet if there was a deepfake site of all the male white politicians being taken to poundtown by sexy daddies, they’d find some applicable areas of law to stop this pretty quickly.

more like they'd fall all over each other to subscribe (anonymously, of course, for research)
posted by chavenet at 9:58 AM on February 13, 2023 [1 favorite]


I do not doubt that I will live to see deepfaked evidence used successfully in a legal conviction.

It has almost certainly already happened; it's been 5 years since the term was popularized, and it's just a faster way to do a type of digital manipulation that people have been able to do manually for decades. It's not like security camera footage is very high resolution, and the chain of custody for a lot of evidence used at trial is pretty weak.

I'm not convinced that deepfakes present some sort of unique problem to society, other than that the ease of creation means the scale of existing problems with fake content has increased. Weird horny dudes have been making photo manips of sexy female semi-celebrities for 20+ years at this point and it's already led to all sorts of problems for the targets. Most of those problems are caused by fans who already have a hard time telling truth from fiction. This has been a problem for "idol" celebrities for many decades because the media environment takes advantage of people (like myself) who have a hard time understanding different social contexts.

In general I think we need to socially encourage people to label fantasies (deepfakes, as well as the parasocial relationships that all streamers have with their audience) as what they are, to help people understand the different contexts that exist on the modern internet. Female streamers have started to clearly identify the romantic relationships they already have, and it doesn't really seem to hurt their connection with their audience. On the deepfake side, if people are exposed to more clearly-labeled deepfakes, they will understand the concept and be less likely to trust ones that are not labeled.
posted by JZig at 10:30 AM on February 13, 2023 [2 favorites]


But the problem with going after the creator is that this is playing whack-a-mole. What we need is to make hosting deepfakes legally fraught, because that is what will stem things. And the problem there is that trying to do so runs headlong into one of the greatest of the internet's sacred cows.
I think about that a lot: like many of us here, I got online when “information wants to be free” was an overt mainstream position and rarely challenged at a fundamental level. That trained a generation of people to think of anonymity as a right and government restraint as an unmitigated threat, as we later saw with the negative reactions to things like European “right to be forgotten” laws, and I don’t think we’re at all prepared mentally for the imminent bullshit tsunami which AI has launched.

What’s worse is the certainty that we’re only going to see action against some of this since many of the same politicians who will condemn deepfake porn will be entirely willing to exploit its use against their opponents, and they’re not going to want the kind of mandatory ID checks which could expose other types of fakery or astroturf.
posted by adamsc at 10:52 AM on February 13, 2023 [1 favorite]


Weird horny dudes have been making photo manips of sexy female semi-celebrities for 20+ years at this point and it's already led to all sorts of problems for the targets. Most of those problems are caused by fans who already have a hard time telling truth from fiction. This has been a problem for "idol" celebrities for many decades because the media environment takes advantage of people (like myself) who have a hard time understanding different social contexts.

Tell me you didn't actually read the responses from the women involved without telling me. Because while stalkers get mentioned, they're pretty firm about the issue being that they aren't allowed to make choices about their sexuality without it being thrown in their face (see also: the attacks on Amouranth as she slowly unwinds the more explicit side of her streaming brand - which she has explained as being done so she can make sure her staff have jobs - after revealing the abuse she suffered).

In general I think we need to socially encourage people to label fantasies (deepfakes, as well as the parasocial relationships that all streamers have with their audience) as what they are, to help people understand the different contexts that exist on the modern internet.

Or we could teach men that they don't have a right to the bodies of women. Because that is the heart of this issue.
posted by NoxAeternum at 11:12 AM on February 13, 2023 [29 favorites]


I'm not convinced that deepfakes present some sort of unique problem to society, other than that the ease of creation means the scale of existing problems with fake content has increased. Weird horny dudes have been making photo manips of sexy female semi-celebrities for 20+ years at this point and it's already led to all sorts of problems for the targets.
The problem I see is when you take that last part and extrapolate it becoming orders of magnitude easier and better. The vast majority of those manipulated images were easy to identify as fakes for technical reasons, and they took a considerable amount of time and skill. Video was even harder, to the point where even a Hollywood budget likely only got you a few seconds, with various limitations (i.e. it tended to be unconvincing unless you started with something similar in color, lighting, motion, etc.).

Now we’re seeing that go from artisanal to industrial scale, and that seems pretty significant to me. It dramatically increases the number of people who can use it maliciously — imagine how bad middle school could get when a kid with access to a gaming PC can, instead of a couple of amateur Photoshop jobs, produce hours of realistic video that requires careful analysis to show it wasn’t actually their least favorite teacher saying something inappropriate or an unpopular kid doing something they learned about on 4chan.

Many adults have probably learned to be cautious about anything they see in the October before a presidential election, but what does it look like when that level of sophistication can be brought to many races at every level? Giuliani was fortunately incompetent, but it still helped a lot that the accusations about pay-for-play ran into the problem that, as a public figure, Joe Biden’s schedule was pretty well documented and nobody could find evidence of the alleged meetings happening. That’s a lot harder when it’s some random city council member and pretty much all of the electronic records about the time in question either no longer exist or are only on their personal devices.
posted by adamsc at 11:20 AM on February 13, 2023 [3 favorites]


I recently watched the two-season fictional BBC show, The Capture, which is about surveillance technology and deep fakes. It was fascinating and frightening and probably how the future will go. It was like 24 crossed with Black Mirror and it creeped me out.
posted by dobbs at 11:31 AM on February 13, 2023 [3 favorites]


I bet if there was a deepfake site of all the male white politicians being taken to poundtown by sexy daddies, they’d find some applicable areas of law to stop this pretty quickly.

This comment betrays a misunderstanding of systemic misogyny. White male politicians have the privilege of protection from the vast majority of this. People are quick to assume that such photos would be fakes. Power protects them. Lindsey Graham can get away with whatever he wants in his personal sexual life because he's amassed so much power on the Right that the Right would rather protect him (turns out all their "beliefs" are so much bullshit after all).

Women that this happens to, though, are immediately assumed to be sluts, and then targeted with even more sexual violence than they already signed up for just by being a public figure who has the audacity to be a woman.
posted by rikschell at 11:38 AM on February 13, 2023 [28 favorites]


Women that this happens to, though, are immediately assumed to be sluts, and then targeted with even more sexual violence than they already signed up for just by being a public figure who has the audacity to be a woman.

If you want to see how this affects women in politics, look at the Daily Mail winning an anti-SLAPP case against former Representative Katie Hill when she sued them for publishing intimate photos of her provided by a vengeful ex. The judge ruled that the photos were of prurient interest because Hill was facing an ethics violation over an inappropriate relationship with a staffer - a line of reasoning that made no sense beyond our cultural bugbears around sexuality and women.
posted by NoxAeternum at 11:47 AM on February 13, 2023 [7 favorites]


Also, a certain VP candidate in 2008 was targeted by Hustler, which predates deepfakes entirely.
posted by pwnguin at 11:53 AM on February 13, 2023


Lindsey Graham can get away with whatever he wants in his personal sexual life because he's amassed so much power on the Right that the Right would rather protect him (turns out all their "beliefs" are so much bullshit after all).

That's where I believe you're mistaken. The modern right in the US will protect him for the time being because he's using his power for their ends. But the millisecond they attain the kind of power they want, it'll be the Night of the Long Knives all over again, and folks like Graham and Peter Thiel would be well advised to learn what happened to Ernst Röhm when the people he brought to power got all the power they wanted.
posted by tclark at 1:18 PM on February 13, 2023 [4 favorites]


What is appalling to me is how many men reacted to this story with, basically, "So what? You're overreacting. If I empathize with you at all, or listen to anything you're telling me, it's going to interfere with my sense of entitlement. Anyway, here are some unrelated personal attacks that prove to me that you deserve it, you slut."

That trained a generation of people to think of anonymity as a right and government restraint as an unmitigated threat

Anonymity protects a lot of people, not just the bad ones. The internet would be a lot more straight, white, and male without it.
posted by Kutsuwamushi at 1:23 PM on February 13, 2023 [10 favorites]


Tell me you didn't actually read the responses from the women involved without telling me. Because while stalkers get mentioned, they're pretty firm about the issue being that they aren't allowed to make choices about their sexuality without it being thrown in their face.

Yes, those are the exact types of problems I was talking about; this has been a really bad problem on Twitch for at least a decade now and has destroyed many people's lives (literally, in several cases). I didn't intend to minimize the scale and severity of the problem.

Or we could teach men that they don't have a right to the bodies of women. Because that is the heart of this issue.

Anything we can do to effectively teach this is well worth it. This can be a difficult lesson to teach right now because of the very strong cultural backlash against this message. Any female streamer (or otherwise marginalized person) who tries to push this message tends to get a LOT of punishment from reactionaries.
posted by JZig at 1:29 PM on February 13, 2023 [3 favorites]


Anonymity protects a lot of people, not just the bad ones. The internet would be a lot more straight, white, and male without it.

And yet its abuse continues to silence the dispossessed.

Nobody is saying get rid of anonymity completely - just that we should acknowledge that it's not inherently good, and that its harms make people look askance at it.
posted by NoxAeternum at 1:33 PM on February 13, 2023 [2 favorites]


Anything we can do to effectively teach this is well worth it. This can be a difficult lesson to teach right now because of the very strong cultural backlash against this message. Any female streamer (or otherwise marginalized person) who tries to push this message tends to get a LOT of punishment from reactionaries.

This is why it's SO IMPORTANT for men, especially white men, especially straight white men, to talk up this issue and have no truck with the Gamergate whiners and incels.
posted by rikschell at 2:32 PM on February 13, 2023 [14 favorites]


What we need is to make hosting deepfakes legally fraught, because that is what will stem things. And the problem there is that trying to do so runs headlong into one of the greatest of the internet's sacred cows.
This is your regular reminder that I think what NoxAeternum means here is stripping away liability protections for intermediaries (as well as stripping away anonymity protections).

Nox has been Metafilter's strongest advocate for this (and I've been one of the frequent sacred-cow-defenders).

I kind of hoped that we'd resolved this after it was actually attempted as a policy fix with SESTA/FOSTA, which lowered those protections for sites in the case of sex-trafficking offenses.

The good news on Nox's side is that these days, Section 230 protections are much, much less a sacred cow than a bipartisan scapegoat: SESTA/FOSTA of course passed with support from both parties, as well as from Facebook and the other tech companies. Even now, Section 230 is one of those rare topics that unites left and right in their criticism (although the reasoning behind the critiques is different).

The bad news, I think, is that with SESTA/FOSTA, what many of the law's defenders said would happen did happen. I've talked about this at length elsewhere on the Blue, but I do feel I have to drop in occasionally and restate the case for the sacred cow: we've tried killing it, and it doesn't help, and it makes things worse.
posted by ntk at 4:34 PM on February 13, 2023 [7 favorites]


I kind of feel like saying that deepfakes threaten women's safety is a framing similar to saying that someone was hit by a car. It's a true statement, but phrasing it that way obscures the agency of the person responsible for the actual offense. In this case, the people who see the deepfake and choose to take that as license to stalk, assault, harass, or disrespect the woman in question.
posted by vibratory manner of working at 8:45 PM on February 13, 2023 [2 favorites]


I kind of feel like saying that deepfakes threaten women's safety is a framing similar to saying that someone was hit by a car. It's a true statement, but phrasing it that way obscures the agency of the person responsible for the actual offense. In this case, the people who see the deepfake and choose to take that as license to stalk, assault, harass, or disrespect the woman in question.

Inanimate objects can definitely threaten people's safety and frequently need to be banned from public spaces.

The bad news, I think, is that with SESTA/FOSTA, what many of the law's defenders said would happen did happen. I've talked about this at length elsewhere on the Blue, but I do feel I have to drop in occasionally and restate the case for the sacred cow: we've tried killing it, and it doesn't help, and it makes things worse.

SESTA/FOSTA was a deliberate effort by an alliance of SWERFs and conservatives to run sex workers off the internet. It wasn't a well-intentioned law meant to stop human trafficking that had unintended consequences; the shitheads backing it were well aware that it would kill a lot of sex workers before it was even put up for a vote.
posted by zymil at 12:50 AM on February 14, 2023 [2 favorites]


> Inanimate objects can definitely threaten people's safety and frequently need to be banned from public spaces.

I didn't claim otherwise! I think you may have misread my comment. This material *does* threaten people's safety because of the way it acts as incitement to violence. But when we ascribe violence to an object it's worth asking what we might be obscuring or normalizing by doing so. This is also not to say that it's wrong to ascribe violence to an object, or that it's necessarily wrong to do so in this case.

I compared the situation to the way we sometimes talk about drivers who injure or kill. In that case, saying that someone was hit by a car obscures the actions of the driver and normalizes car culture.

In this case I think that ascribing the safety risk to the deepfakes themselves serves in part to normalize the sexism and stigma that motivate that violence. This is because the framing only makes sense given a presumption of that sexism and stigma. That's certainly the world we live in, but I think it's important to highlight that it's not the only possible world, because failure to do so often lets bad actors off the (rhetorical) hook via normalization.

I think it's always worth talking about the sexism and stigma against sexuality directly. It might seem obvious given the premise, but not everyone who might agree that deepfakes threaten women's safety would agree that the stigma against women expressing sexuality is a bad thing. This isn't an argument against using language like "deepfakes threaten people's safety". It's an argument for being aware of the possible effects of that language when we do so, and looking for more precise language when more precise language is appropriate.

And in case there is any lingering confusion about my position I'll say this: in an ideal world, pornographic material (real or otherwise) would not incite violence against women. In such a world, free of both sexism and stigma against sexuality, there would still be good reason to regulate deepfakes on the basis of personality rights.
posted by vibratory manner of working at 11:28 AM on February 14, 2023 [5 favorites]


That trained a generation of people to think of anonymity as a right and government restraint as an unmitigated threat
Anonymity protects a lot of people, not just the bad ones. The internet would be a lot more straight, white, and male without it.
It’s also been invaluable to the mobs who’ve harassed many of the same people offline, or even into hiding. As I said before, I’m coming from a very strong pro-anonymity position, but that’s been tempered after seeing the downsides. I don’t like seeing victims of abuse told that their best option is to sue someone whom their hosting company won’t even identify, or to play whack-a-mole with thousands of social media accounts.

Finding a balance between the two seems especially pressing because we know both that there’s a flood of bad content coming and that there’s a concerted right-wing push to criminalize things which they don’t approve of, and they’re certainly not above pretending to care about the real abuse as a tactic in support of the culture war.
posted by adamsc at 11:39 AM on February 14, 2023


I didn't claim otherwise! I think you may have misread my comment. This material *does* threaten people's safety because of the way it acts as incitement to violence. But when we ascribe violence to an object it's worth asking what we might be obscuring or normalizing by doing so. This is also not to say that it's wrong to ascribe violence to an object, or that it's necessarily wrong to do so in this case.

Apologies, I did misread your comment.

In this case I think that ascribing the safety risk to the deepfakes themselves serves in part to normalize the sexism and stigma that motivate that violence. This is because the framing only makes sense given a presumption of that sexism and stigma. That's certainly the world we live in, but I think it's important to highlight that it's not the only possible world, because failure to do so often lets bad actors off the (rhetorical) hook via normalization.

You're right, but I think that normalising the sexism and stigma by completely banning deepfakes and the tools that make them is our only remaining option, because no authority has the resources or the will to punish the people who misuse them beyond maybe kicking the creators off a payment platform.
posted by zymil at 2:42 AM on February 15, 2023


It’s also been invaluable to the mobs who’ve harassed many of the same people offline, or even into hiding. As I said before, I’m coming from a very strong pro-anonymity position, but that’s been tempered after seeing the downsides. I don’t like seeing victims of abuse told that their best option is to sue someone whom their hosting company won’t even identify, or to play whack-a-mole with thousands of social media accounts.

Or, as in the case of SF author Patrick Tomlinson, being told that he has no legal recourse because Section 230 precludes getting the information needed to actually go after his abusers legally. It's worth noting that Tomlinson approached the matter in the way many Section 230 advocates recommend - and in response was told that because he couldn't show the host was personally engaged, Section 230 not only prevented him from subpoenaing the information needed to pursue legal action, but also held him liable for the legal fees incurred by the host. There's also an entire legalized extortion industry centered around these "protections", and the discussion in the OP talks about how difficult and expensive it is to combat these issues because of the lack of laws regarding online harassment.

Finding a balance between the two seems especially pressing because we know both that there’s a flood of bad content coming and that there’s a concerted right-wing push to criminalize things which they don’t approve of, and they’re certainly not above pretending to care about the real abuse as a tactic in support of the culture war.

One of the points that I made discussing the FOSTA/SESTA runup was that Section 230 advocates undercut themselves by arguing that there were no problems with Section 230 - right when Backpage had just admitted that they had claimed Section 230 indemnification to evade liability for aiding human traffickers on the site. The reality is that the core underpinning of Section 230 (that the platform owner and user should be treated as inherently divorced) doesn't hold up (and it doesn't really take long to consider why it doesn't), and that underpinning is what results in travesties like what happened to Tomlinson. And as you point out, bad faith actors will use them to their own ends.
posted by NoxAeternum at 9:24 AM on February 15, 2023


Part of the challenge in Patrick's case (and, I would say, more generally in the cases where people sincerely believe 230 is a problem) is that he is pursuing a civil case. Section 230 has never provided any liability protection against criminal acts: but law enforcement continues to lag behind where many people would like it to be in pursuing internet-mediated prosecutions.

It would provide absolutely no barrier to the police discovering who was using them as a criminal weapon in SWATting, nor stop any public prosecutor from identifying who was posting on a forum.

Meanwhile, Section 230 /does/ act as part of a series of legal protections, including anti-SLAPP laws, that keep civil actions from being used to reveal the identities, addresses, etc. of defendants who may not be guilty of any crime, and for whom the court case may be used merely to pierce their anonymity for retaliatory purposes.

I'm not saying that's the role it's playing in *this* case: but I still think a criminal prosecution makes more sense in these scenarios.
posted by ntk at 12:56 AM on February 22, 2023


I'm not saying that's the role it's playing in *this* case: but I still think a criminal prosecution makes more sense in these scenarios.

This comes across as naive at best, especially given what we know of police and prosecutors today. Tomlinson is actually lucky in that regard, in that he has gotten law enforcement to take him seriously - after a campaign of harassment that involved several swatting attempts. But in many cases, especially cases of intimate image abuse, law enforcement routinely turns a blind eye (and often blames the victims for trusting their abuser in the first place). Not to mention that many times, part of the problem is that there are no laws to turn to at all, which means there is no role for law enforcement in the first place.

Meanwhile, Section 230 /does/ act as part of a series of legal protections, including anti-SLAPP laws, that keep civil actions from being used to reveal the identities, addresses, etc. of defendants who may not be guilty of any crime, and for whom the court case may be used merely to pierce their anonymity for retaliatory purposes.

Which is not justification for turning a blind eye to actual abuses that have actual victims. Taking the example you give of anti-SLAPP statutes: I am a strong believer in them; I think they protect free speech, especially in cases where people are blowing the whistle on abuses and harmful conduct. But at the same time, I also acknowledge that the Daily Mail winning their anti-SLAPP countersuit against Representative Hill was a travesty that runs counter to the purpose of the law, chills the environment for women in public office, and ultimately undermines support for anti-SLAPP laws. People are done being "collateral damage" here, and these abuses need to be taken seriously.
posted by NoxAeternum at 11:34 PM on March 1, 2023

