slow-motion infocalypse
February 16, 2018 7:57 AM   Subscribe

Inside the two years that shook Facebook, and the world
The stories varied, but most people told the same basic tale: of a company, and a CEO, whose techno-optimism has been crushed as they’ve learned the myriad ways their platform can be used for ill. Of an election that shocked Facebook, even as its fallout put the company under siege. Of a series of external threats, defensive internal calculations, and false starts that delayed Facebook’s reckoning with its impact on global affairs and its users’ minds. And—in the tale’s final chapters—of the company’s earnest attempt to redeem itself.

He Predicted The 2016 Fake News Crisis. Now He's Worried About An Information Apocalypse.

George Soros: Facebook and Google a menace to society
Facebook asks some Hard Questions on Social Media and Democracy
Facebook’s Desperate Smoke Screen
With all this in mind, it’s not surprising that Facebook’s reaction to the Soros speech ignored the social issues and instead focused like a laser on the significantly more tractable political issues.

In more detail, a few days after this speech, Facebook initiated a series of posts on their company blog about Facebook’s potential to harm democracy. The series includes essays from outsiders who have been publicly critical of Facebook’s impact on the political process.

As Facebook explains: “We did this because serious discussion of these issues cannot occur without robust debate.”

Yeah right.

This move is not purely an effort to confront Facebook’s problems; it is, I suspect, in large part a desperate attempt to distract the media and public from the social issues that Facebook knows it cannot resolve without inflicting serious self-harm.
Facebook Says Business Stays Strong Amid News Feed Changes

previously: Some Early Facebook Employees Regret The Monster They Created
posted by the man of twists and turns (49 comments total) 36 users marked this as a favorite
 
And the evidence for why engineers need education in the humanities keeps on piling up.
posted by NoxAeternum at 8:04 AM on February 16, 2018 [81 favorites]


Monetizing addiction should be highly regulated, because it always involves seriously expensive externalities that are often impossible to predict.

We already do this for things that involve obvious addictions with obvious consequences: alcohol, gambling, cigarettes, etc etc. But I would argue that a person’s time and attention are just as fundamental (if not more so) to their ability to live a fulfilling life. And they are limited resources. Deeply, deeply personal limited resources. There are entire industries with business models based on exploiting this.

I’m not sure we have ways to measure the damage this does yet, but it sure as hell isn’t zero.
posted by schadenfrau at 8:26 AM on February 16, 2018 [14 favorites]


5 completely disparate fonts in the first 4" of that first article.
posted by humboldt32 at 8:46 AM on February 16, 2018 [6 favorites]


The thing that disturbs me about all this is that the only fix I can really see is an anti-privacy fix: really hardcore verification before any person or entity gets any kind of social media or regular media account, something that would have to be managed centrally in a relatively transparent way, would almost inevitably be 90% functional and 10% oppressive clusterfuck and would end up being the absolute death knell of ever being able to say anything that wasn't basically anodyne. If everything was really posted by an identifiable entity, that would wipe out a lot of fake news - not all of it but a lot - and would make it a lot harder to spread dangerous stuff. But even there, it would end up empowering the wealthy, because lying and making threats would be within reach if you were not liable to lose your job/home/etc over them.

Also, I kind of don't think there's a fix; I think humans have been working toward cheap, bad addictive stuff on all levels for most of our "civilized" history, and we've pretty much got it down now on the information/food/drug/consumption fronts.

I think that there are basically no solutions for the problems ahead of us. They might get "solved" after enough infrastructure and capital are destroyed to weaken the existing power structures and break existing systems, if you can call "burn it down" a "solution", but that's about it. It's likely that in a hundred years or so there will be some livable post-climate collapse societies, but as the fellow says, "an infinite amount of hope - but not for us".
posted by Frowner at 8:47 AM on February 16, 2018 [7 favorites]


Halp me Mods, but here I go.

The MetaFilter hate on silicon valley engineers is starting to get to me. Do you have an idea of the size of Google and Facebook? Do you have an idea of how decisions are made?

There are classes within these companies. The bulk of silicon valley engineers are lowly peons, trying to make rent, with no power or authority to change the company's direction. You can vote with your feet and quit, but I think I don't need to explain here why that is not an option for many. Especially everyone on H-1B and other work visas.

You may hate them because they are 25 years old making 140k a year, but having done my time there I can tell you that it is like being paid in company scrip.

Maybe think of the people who actually have the power to make decisions. CEOs, VPs, some directors. The less than 3% of engineers who manage to get promoted to senior staff or above.

As a lowly peon I worked with software engineers who cared. A philosopher and maths teacher who taught for free at a nearby school, a couple of activists whose families had come as refugees to the US from actual dictatorships, different groups of minority engineers getting together to do some real work on social issues inside and outside the company. Most of our non-work conversations were about "the humanities".

But we get the specs and the quarterly goals from high up. And we see that they are evil and toxic, and we raise concerns, and the manager has their hands tied, and so does their manager. And when you finally get a meeting with your directors and VP they very calmly and not condescendingly at all explain to you all the very sound strategic and business reasons behind their decisions, and how if you don't implement whatever feature they ask for the company will be hurt, and the hurt will trickle down, and you won't get your bonus and your promo. And if you don't find your job personally significant and fulfilling, maybe you don't really want to work here at all.

I finally put myself in a position where I could quit, took me a couple years of planning and execution. But I feel for some of my colleagues, who are stuck with mortgages and obligations they can only afford with a high paying engineering job.

When I left my goodbye email was a link to Are We The Baddies.

So yeah, please stop grouping me with the VPs who make millions or the manifesto writing engineer shitlords of silicon valley.
posted by Index Librorum Prohibitorum at 8:48 AM on February 16, 2018 [82 favorites]


Facebook, YouTube, Twitch, Twitter and all the other sites that rely on user-generated content will eventually have to face the fact that it is impossible to have a completely neutral service where users’ actions don’t reflect back on them. If your service is being used to share pictures of dogs then you are a dog picture sharing service - great, take credit for that; if your service is being used to promote hate or harassment then you are to blame.

The answer is (as it has always been) editorial control with moderation. The only way I see to fix Facebook is to break it up into small sites each leaning a different way politically and heavily moderate/ban accordingly.
posted by AndrewStephens at 9:02 AM on February 16, 2018 [2 favorites]


So yeah, please stop grouping me with the VPs who make millions or the manifesto writing engineer shitlords of silicon valley.

I'm a techie too, working as a developer. And the sad reality is that a lot of this crops up because of the massive blind spots the tech culture has, especially where it comes to managing human interaction with algorithms. As the one guy in the story pointed out, the editorial team was never treated as actually being part of the company, and everyone knew that they were on the chopping block as soon as Facebook figured out how to replace them with an algorithm.

This is a cultural problem in the tech community, and we have to acknowledge that.
posted by NoxAeternum at 9:19 AM on February 16, 2018 [37 favorites]


And the evidence for why engineers need education in the humanities keeps on piling up.

I am a software engineer with some pretty serious grounding in the humanities. Early in the expansion of the internet (to give some context I once accidentally took the country of Slovenia offline for 3 days and no one particularly complained) I often thought of the Gutenberg press. Egalitarian access to information has been a blessing and a curse to Western civilization, but most people consider that it was a good thing in the long run.

I've since decided that information doesn't really solve human problems. Magically giving every Israeli and every Palestinian a PhD level education will not resolve their differences, for example. But strictly speaking, a good grasp of history can easily lead to the conclusion that it is better for people to share than not share.
posted by Tell Me No Lies at 9:20 AM on February 16, 2018 [12 favorites]


People aren't connecting with people, they're connecting with an abstraction, a large machine that they talk to and at and that doesn't give them any reason to be generous or thoughtful. Why wouldn't you vent your anger at that, or react in the moment, or do whatever you could to get some reaction, any reaction?

The scale of the thing is not simply a quantitative effect. It's qualitative. It's a different experience entirely to being part of a real human community.

Even a "community" isn't what's really needed. It's people connecting to individual people, knowing them as friends, knowing that these other people are sometimes one way, sometimes another way, and that people can and should change moment to moment, that every single individual has a whole universe of inquisitiveness and defensiveness and kindness and hope and hate and fear and joy and desire to help others, all of these things, at different times, and that the parts that we experience from other people change depending on things we can't control and on things we can control to surprising degree.

That idea, that people aren't static collections of qualities, is missing from the discussions I've seen. The models we are using are overly simple, and to try to make the world work with them, you have to force the world, and the people in it, to behave more simply than they should. That is cruel.

Social media seems to be building things on top of other things that were never thought through thoroughly enough. Individual humans have learned to handle the complexities of other people, well, beautifully, through lessons transmitted indirectly from person to person - but that information transmission isn't happening in a clear way that people outside that knowledge can see. There's no certificate in "really understands people" with a clear curriculum that people on the outside know to respect.

This seems like it might be an approachable problem, though.
posted by amtho at 9:27 AM on February 16, 2018 [12 favorites]


People like Alex Hardiman, the head of Facebook news products and an alum of The New York Times, started to recognize that Facebook had long helped to create an economic system that rewarded publishers for sensationalism, not accuracy or depth. “If we just reward content based on raw clicks and engagement, we might actually see content that is increasingly sensationalist, clickbaity, polarizing, and divisive,” she says. A social network that rewards only clicks, not subscriptions, is like a dating service that encourages one-night stands but not marriages.
Wow. That's QUITE the epiphany to have in that position at this late date. I guess the existence of terms like "clickbait" and the increasing popularity of content-lite "listicles" wasn't enough of a clue.
posted by xyzzy at 9:28 AM on February 16, 2018 [23 favorites]


But we get the specs and the quarterly goals from high up. And we see that they are evil and toxic, and we raise concerns, and the manager has their hands tied, and so does their manager.

Software engineers need to own up to their responsibility. I've implemented Lawful Intercept for voice and data (twice), as well as Deep Packet Inspection engines with a specific use case of shutting down all social media for civilians in a country while letting the government freely use theirs. The possibilities for evil are endless.

I'm not going to hide behind economic necessity for doing that. Capitalism may be a way of life, but it isn't an excuse. What I did I did out of interest and for cold hard cash. I created powerful tools that can be used for all sorts of good and bad. If people think I did the wrong thing, then I'll take that.
posted by Tell Me No Lies at 9:31 AM on February 16, 2018 [22 favorites]


It is the defining foolishness of our age -- not the foolishness of our age's fools but the foolishness of its sages -- that the distance between any geographical or figurative points a and b can be shortened by placing enough computers between them.
posted by gauche at 9:35 AM on February 16, 2018 [3 favorites]


And that when the gap refuses to close, the problem is "tribalism", and not that there are genuine disagreements causing the gap.
posted by NoxAeternum at 9:38 AM on February 16, 2018 [1 favorite]


Social media is a convenient scapegoat for larger social problems
posted by Noisy Pink Bubbles at 9:39 AM on February 16, 2018 [1 favorite]


> Social media is a convenient scapegoat for larger social problems

Well, sure, but the point is that it is also creating social problems unlike any that have been seen before.

Social media allows knitters all over the world to get together to share tips and form their own vibrant community, but it also allows InCels (to pick a current example) all over the world to come together and feed off of each other's hate and egg each other on in ways that were not possible before.

Sure, they could have been penpals before, or had their own magazine. But that's such an immense difference that it's not even comparable.
posted by RedOrGreen at 9:46 AM on February 16, 2018 [11 favorites]


And the evidence for why engineers need education in the humanities keeps on piling up.

I'll assume that's not intended to sound condescending. I would argue that humanities students would likewise be more well-rounded citizens if their curricula included some introductory STEM courses.
posted by ZenMasterThis at 9:48 AM on February 16, 2018 [8 favorites]


I don't just mean between people at points a and b, either. I mean goals. I mean getting to this quarter's financial targets. I mean that basically every business plan involving the internet has this hidden somewhere in its equations:

1. Steal Underpants
2. 💻💻💻💻💻
3. Profit!
posted by gauche at 9:49 AM on February 16, 2018 [2 favorites]


And the evidence for why engineers need education in the humanities keeps on piling up.

> I'll assume that's not intended to sound condescending. I would argue that humanities students would likewise be more well-rounded citizens if their curricula included some introductory STEM courses.


Yeah, as an engineer it does sound condescending, especially if one isn't aware of what a Computer Science curriculum actually consists of. As a CS undergrad within the University of California system, under the engineering department... we actually had to take humanities courses. There was at least one science-based ethics course we had to take, plus mandatory humanities electives. So maybe not as many as business majors or those in the humanities college... but um... yeah the same could be said about business schools needing more humanities classes so they don't create or manage scummy companies.

So yes, you may have ignorant software engineers who are interested in just programming and not in the ramifications of what they are building... but honestly, what control does a regular cog engineer have? There is no software trade union. As a half-joke: programmers can't even agree on text editors, so how do you think they are going to rebel? Not to mention that if engineers quit, it gives all the more leverage to big cos like FB and Goog to hire H-1Bs, who are really at the mercy of their employer.

I am NOT detracting from the argument that it would be better if more engineers had more of an understanding of social values, ethics, etc. I have very strong opinions about the implicit bias in machine learning, AI, the "data science" field, etc. But aren't we making the same argument about scientists working in big pharma labs? Scientists and engineers need to feed their families. It's a privilege to be able to choose your job, and especially if it's a high paying one. I understand my privilege and have forgone so much extra $$$ in my life by refusing to work for big cos and those that don't share the same values that I have. Not everyone has that same choice or awareness in life.
posted by xtine at 10:22 AM on February 16, 2018 [6 favorites]


yeah the same could be said about business schools needing more humanities classes

hello hello, your friendly neighbourhood humanities professor over here saying please yes god absolutely force more business majors into our classes, and please help us empower contingent faculty (which is most of us) to actually teach instead of just keeping our heads down while rubber-stamping non-majors through their required gen-ed coursework, so they can ignore our carefully chosen readings on media and ethics to, at times, literally work on their Stats homework while they're sitting in our classrooms because if we don't give them As they'll ruin us on the teacher version of Yelp and then we don't get our per-semester contracts renewed and we can't eat or pay our rent

(the business majors often end up in charge of our education system and it's just as broken as our social media for that reason)
posted by halation at 10:30 AM on February 16, 2018 [33 favorites]


but it also allows InCels (to pick a current example) all over the world to come together and feed off of each other's hate

What if I told you that hate groups existed before the rise of social media, nay, before the invention of the Internet?

Or, perhaps a more probing question, what causes the creation of, for instance, InCels, in the first place?

On balance, I think giving people the ability to communicate is a good thing. Yes, that means some bad people (of whatever definition) will communicate as well. They can be dealt with through specific countermeasures (e.g. ending harassment on Twitter), the issue is that social media companies are often incentivized not to implement those measures.

I often wonder what is behind this constant drumbeat of social media bashing articles. My tentative theory is that social media, for all its other virtues and faults, does give people a place to communicate to the world that they would not have otherwise. Naturally, that upsets those that would rather deny those people a voice.
posted by Noisy Pink Bubbles at 10:51 AM on February 16, 2018 [2 favorites]


What if I told you that hate groups existed before the rise of social media, nay, before the invention of the Internet?

I'd probably roll my eyes at your patronizing use of a Matrix meme.
posted by haileris23 at 11:03 AM on February 16, 2018 [22 favorites]


I don't want to get into a personal back-and-forth on this, although I'll say that I've personally seen a mob whipped up into a murderous frenzy (literally, not figuratively, murderous), so I'm very familiar with the idea that hate groups can exist without the internet. (In this case, a Hindu mob upset that some Muslims far away were apparently desecrating the ruins of a mythical temple.)

However, the point with social media is that it allows even the most marginal of positions to find a like-minded community to flourish in. As you say:

> I think giving people the ability to communicate is a good thing. Yes, that means some bad people (of whatever definition) will communicate as well.

Agreed on both counts - and I think the value of a knitters community (or Metafilter, for that matter) vastly outweighs the harm of a community of white supremacists.

> that upsets those that would rather deny those people a voice.

Here we part ways. I agree that I would rather deny white supremacists or Russian trolls a voice (so sue me). But I really don't want to cut off all social media to make that happen. If the price of connecting with long-lost school friends is that I have to allow communities I disapprove of to flourish as well, I'm reluctantly okay with that price - the benefit to me outweighs the cost. (Others may disagree - this is my opinion.)

But - this is the important bit - I don't think it has to be that way. We can do better. Right here on Metafilter, we've seen that we can do better as a community. I'd rather have that.
posted by RedOrGreen at 11:08 AM on February 16, 2018 [4 favorites]


What if I told you that hate groups existed before the rise of social media, nay, before the invention of the Internet?

I'd probably roll my eyes at your patronizing use of a Matrix meme.


What if I told you that phrase existed before the Matrix?

And that its use might not be patronizing?

mind_blown.gif
posted by MikeKD at 11:28 AM on February 16, 2018 [3 favorites]


I don't hate the average engineer making $140K a year at 25, but as a non-tech person in the Bay Area, I will continue to resent the hell out of them. I'm even friends with some people like that (although no one that young), and they're lovely people, as individuals. But as a class, I resent the hell out of them, because all that difficulty that I know they experience, I know it's being passed on threefold to everyone else. So $140K/year is necessary to keep up with your mortgage? That's the case for everyone who wants a mortgage now, except most of us can't afford that. But there's always someone who can pay, and it's driving housing prices through the roof, as anyone in the area knows. People are leaving, cultures are changing, and it's harder and harder to get by if you're not part of the tech industry. There's a reason so many people resent the word "disrupt."

This is a very Bay-Area-specific resentment, but I think it gets at a bigger problem: that individually people can be wonderful, but as a class they can do a lot of damage. I know someone who works for one of the gig economy companies, and as a person she's lovely, but every contribution she makes to her company is a contribution to greater harm in the world. I'm not going to ask her to give up her career and everything she's worked for, and I can't blame her for taking the only feasible way to actually afford housing in SF, but that doesn't mean that her actions aren't still contributing to something deeply harmful. If we just think about it individually, nobody is doing all that much harm on their own, but it adds up. One person buying over market value for a condo on Lake Merritt in Oakland probably isn't that bad, but everyone is doing it, and rents in some areas have tripled.

I'm at Berkeley, and (thanks in part to the private students' Facebook groups) I have come to deeply resent the Electrical Engineering and Computer Science (EECS) students. That program is the #1 in the country, and those students are the absolute worst when it comes to publicly airing condescending attitudes towards everyone else. I quit every student Facebook group because I got tired of every single post being overrun by EECS students making "edgy" jokes about snowflakes and humanities. Obviously they're not all like that to a person, and I'm sure some of that is just youthful bravado - but isn't that the problem? Who is going to take responsibility for this shit?

I don't think problems like this exist because engineers need to take more humanities classes, but I also think individual engineers are more complicit than they want to think they are. I'm sympathetic to being stuck in a career with a lot of pressure to do stuff you might not agree with, and I'm sympathetic to being painted with a broad brush unfairly. I don't mean anything as a condemnation, because like I said, I know wonderful people all over SV, but I'm also getting the sense that people don't think their individual contributions do that much harm, and I think they do. Look, I'm even a part of this just by being at an institution like Berkeley, and I don't know what the solution should be. But I feel like it has to start with honestly recognizing how we might be complicit with bad things.
posted by shapes that haunt the dusk at 11:41 AM on February 16, 2018 [14 favorites]


On second thought, maybe that comment says more about me than about software engineers. This is why I hate social media. Sure, it gives a voice to people, but it also makes it easy to just fire off some half-formed idea without giving it enough thought. There's so much incentive to just argue about stuff.
posted by shapes that haunt the dusk at 12:04 PM on February 16, 2018 [1 favorite]


Social media as a form of sharing and forming community can be provided at 100% by sites like Metafilter.

What FB, Twitter and Google (and video games with loot crates) are doing is weaponizing their applications in order to trigger a dopamine response that makes you stay longer on their application.

That really sucks IMO.
posted by Annika Cicada at 12:40 PM on February 16, 2018 [7 favorites]


Just to make sure the liberal firing squad fires in all directions -- we *know* that studying the humanities, even being great and subtle thinkers and writers about any fraction of human nature, is not sufficient to inoculate people against collusion with the worst of regimes. There was a whole sad genre of philosophy books written after World War II about this, winding up, more or less, with the discovery that some of the scholars who had been criticizing Nazis and colonialists had been Stasi collaborators.
posted by clew at 1:00 PM on February 16, 2018 [6 favorites]


I don't think problems like this exist because engineers need to take more humanities classes, but I also think individual engineers are more complicit than they want to think they are.

Individuals, full stop, are more complicit than they want to think they are. Complicity is not unlike privilege in that way. Even when you think you've dropped out, turns out you are just another cog in the machine.

On second thought, maybe that comment says more about me than about software engineers. This is why I hate social media. Sure, it gives a voice to people, but it also makes it easy to just fire off some half-formed idea without giving it enough thought. There's so much incentive to just argue about stuff.

Exactly :) Well.. except social media isn't just about arguing, it is also about conspicuous agreeing. There is also a powerful incentive to form tribes--MetaFilter has favourites after all. From my perspective, your comment looked as much like an affirmation of your tribal identity as simply an attempt to argue key points. Which is totally fine, until it becomes a fascistic clique.
posted by Chuckles at 1:25 PM on February 16, 2018 [4 favorites]


Anyway, back on topic, Enturbulated linked the Wired article a couple of days ago. That thread might be worth a browse for some.

There is also this fantastic interview with Chamath Palihapitiya:
I can control my decisions, which is I don't use the shit.
And I was just listening to In Our Time on Nietzsche, somehow it feels relevant here.
Is it time to have another MetaFilter tribal war? Yes, I'm an auditory learner. You readers can just go....
posted by Chuckles at 1:47 PM on February 16, 2018 [2 favorites]


Exactly :) Well.. except social media isn't just about arguing, it is also about conspicuous agreeing. There is also a powerful incentive to form tribes--MetaFilter has favourites after all. From my perspective, your comment looked as much like an affirmation of your tribal identity as simply an attempt to argue key points.

This here is why the concept of tribalism needs to go die in a fire. It is tiresome to see the content of positions dismissed, because it's easier to just say "oh, you hold those positions because of the group you belong to."
posted by NoxAeternum at 2:10 PM on February 16, 2018


I double-majored in CS and a humanities subject. Engineers can be myopic, but there were so, so many privileged kids in the humanities program - if you know you can't rely on the Bank of Mom and Dad, what are you going to study in school?
posted by airmail at 2:14 PM on February 16, 2018 [5 favorites]


I think Chomsky’s analysis of the decline of the “civic-minded elite”, aka people who had enough money to have some kind of life of leisure and who decided that giving a shit about a greater good was more important than being selfish dickbags, is very illustrative of why we are where we are today.
posted by Annika Cicada at 3:27 PM on February 16, 2018 [5 favorites]


This engineering/humanities thing seems like an essentially unprovable, pretty major derail. Bit of a bummer to see the thread overtaken with so much on it.

I mentioned it on another thread, but I've repeatedly flagged racist groups, comments, or users on Facebook. Every single time I'm told that they don't violate community norms and will be staying, thanks very much.

Much like with Twitter etc., I cannot take these companies' efforts seriously until they take racism and sexism seriously. If you want to talk about social forces impacting these sites, look beyond which degrees people earned and consider the uniquely American preoccupation with free speech.

If my employer had employees on our internal social media saying things that regularly crop up on Facebook that employee would be counselled, the posts removed and depending on the severity they could well be fired. Hell we had an employee suggesting that "traditional christians" could be persecuted as a result of Australia's gay marriage vote and our employers' public support for a "yes" vote. That post was removed and the employee warned not to do it again. That was pretty milquetoast so far as homophobia goes.

I cannot imagine the depths to which you have to sink to be kicked off Facebook or Twitter. But Zuck etc. are more concerned with tweaking the algorithm. Pay attention to the reports you're getting right now.
posted by smoke at 3:31 PM on February 16, 2018 [7 favorites]


Yeah, sorry for adding to the derail.

What are the algorithms? I read an article for a class a few years ago ("Can an Algorithm be Wrong?") about how Twitter had suppressed certain hashtags like #occupywallstreet and #TroyDavis over more banal stuff like #KimKardashian. Some topics were just filtered out because they tripped the wrong flags. At the time, we thought "they're looking for the most bland, broadly marketable material they can get!" but of course that changed in favor of whatever would get the most clicks.

We still talk about it like a simple set of rules (more clicks -> more exposure), but it must be much more complex than that. Someone, somewhere, has to link certain key words with certain concepts like "politics." I don't mean this as a rhetorical question: does anyone know what the Facebook algorithm actually looks like?
posted by shapes that haunt the dusk at 3:54 PM on February 16, 2018
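
Facebook's actual ranking system is not public, so the question above can't be answered directly. But the "more clicks -> more exposure" model the thread keeps returning to can be sketched in a few lines; everything here (the post fields, the weights) is hypothetical, purely to illustrate the feedback loop being discussed:

```python
# Toy sketch of a pure engagement-ranked feed. All field names and
# weights are made up; this is an illustration of the incentive
# structure, not anyone's real ranking code.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Weighting shares and comments more heavily rewards whatever
    # provokes a reaction, accurate or not -- exactly the failure
    # mode described in the thread.
    return 1.0 * post.clicks + 5.0 * post.shares + 3.0 * post.comments

def rank_feed(posts):
    # Highest engagement first; nothing else is considered.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("Careful 3,000-word explainer", clicks=120, shares=4, comments=10),
    Post("OUTRAGEOUS clickbait headline", clicks=300, shares=40, comments=90),
]
for p in rank_feed(posts):
    print(round(engagement_score(p)), p.text)
```

With these invented weights, the clickbait post outranks the explainer by a wide margin, and under a pure-engagement objective that gap only widens as exposure drives more clicks.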


Right here on Metafilter, we've seen that we can do better as a community.

If only Zuckerberg had collected $5.00 for each person who joined Facebook, he'd be rich now.
posted by ZenMasterThis at 4:31 PM on February 16, 2018 [2 favorites]


Just to make sure the liberal firing squad fires in all directions

Has anyone explicitly pointed out that Metafilter is social media and we've been part of the problem for 18 years now?
posted by Tell Me No Lies at 4:42 PM on February 16, 2018 [3 favorites]


when I was an undergrad, we had a required 'engineering ethics' class and a good portion of humanities credits. i've been sacked twice. both times for ethics reporting: software security holes, and outright defrauding the government. not sure i would do it again.
posted by j_curiouser at 6:20 PM on February 16, 2018 [3 favorites]


i've been sacked twice. both times for ethics reporting: software security holes, and outright defrauding the government. not sure i would do it again.

That sucks j_curiouser, fwiw you did the "right" thing.

But, as the saying goes; No good deed goes unpunished.
posted by some loser at 7:58 PM on February 16, 2018


Has anyone explicitly pointed out that Metafilter is social media and we've been part of the problem for 18 years now?

How can we be part of the problem, when we're so good at pointing out the problem, over and over again?

That's all we need to do, right?
posted by happyroach at 10:42 PM on February 16, 2018 [5 favorites]


This caught the eye...

...to give some context I once accidentally took the country of Slovenia offline for 3 days and no one particularly complained...

Tell Me No Lies - any chance of some more detail (and date) on this, please?
posted by Wordshore at 4:24 AM on February 17, 2018 [3 favorites]


any chance of some more detail (and date) on this, please?

Someone else asked about this too. Here ya go:

After the Internet escaped from DARPA, it was used almost entirely by universities. Commercial users were rare, and they were usually only in it for the email and file transfers. For a sense of the time, the first piece of commercial spam went out in 1994. This took place in 1993.

In a lot of smaller countries, universities covered their downlink costs by turning around and selling connectivity to businesses or whoever else wanted it. Thus universities became the first ISPs in those countries. In Slovenia, University of Ljubljana was the only connection to the internet.

I was working with them to fix a bug in their main router and sent them an image that (for whatever reason) hard crashed the thing. They had no spare, there were no backup links. Slovenia was offline until they got the new router we sent them and they configured it.

I asked them about it afterwards and they said that the phone system (everything in-country was modems on phone lines) was so bad that nobody depended on it for anything. They got some annoyed phone calls and that was it.

A simpler time for sure.
posted by Tell Me No Lies at 4:49 AM on February 17, 2018 [11 favorites]


Attacking engineers is nice and tidy, but maybe we should remember that the product people who dream this shit up aren't engineers?

We still talk about it like a simple set of rules (more clicks -> more exposure), but it must be much more complex than that. Someone, somewhere, has to link certain key words with certain concepts like "politics." I don't mean this as a rhetorical question: does anyone know what the Facebook algorithm actually looks like?

I have no knowledge of Facebook's algorithms, but, no, no one sits there by hand and links certain key words with certain concepts. You're going to "learn" (in the machine learning sense) those associations. Depending on precisely what you're doing, you may need to label some data by hand (or pay people to do it), but these things tend to go along the lines of "transform text to vector, take inner products, take maximum" not "if post contains 'Senate' do [some thing]". (I need to run out the door, but can elaborate.)
posted by hoyland at 9:36 AM on February 17, 2018 [2 favorites]
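To make the "transform text to vector, take inner products, take maximum" description above concrete, here is a toy nearest-centroid topic tagger. This is purely illustrative: the word lists, topic names, and example posts are invented, and in a real system the topic vectors would be learned from labeled training data rather than written out by hand; nothing here reflects Facebook's actual (non-public) ranking code.

```python
# Toy sketch of "vectorize text, take inner products, take maximum".
# All topics and word lists below are hypothetical, hand-written stand-ins
# for what a real system would learn from training data.
from collections import Counter


def vectorize(text):
    """Turn a post into a sparse bag-of-words vector (word -> count)."""
    return Counter(text.lower().split())


def dot(u, v):
    """Inner product of two sparse word-count vectors."""
    return sum(u[w] * v[w] for w in u if w in v)


# Stand-in "learned" topic centroids -- in practice these come from
# machine learning over labeled examples, not a keyword list like this.
topics = {
    "politics": vectorize("senate vote election bill congress"),
    "sports": vectorize("game score team season playoff"),
}


def tag(post):
    """Score the post against every topic centroid; take the maximum."""
    scores = {t: dot(vectorize(post), c) for t, c in topics.items()}
    return max(scores, key=scores.get)
```

The point of the sketch is the shape of the computation: no human writes `if post contains 'Senate'` rules; the association between words and topics lives entirely in the learned vectors.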


Delete if this was already posted, but Ezra Klein recently had an episode of his podcast with Jaron Lanier that talked about monetizing addiction and how it's making us all crazy. I'm sure Lanier has more work in that area if people are interested.
posted by Rainbo Vagrant at 12:45 PM on February 17, 2018 [1 favorite]


It's a long article, and I skimmed parts, so I don't blame anybody for not having read it. But while the contents definitely highlight some mistakes and problems with Facebook, the picture I'm seeing painted is also definitely one of a company genuinely grappling with the problems on a deeper level than some of the discussion here reflects.

A lot of technology doesn't cause problems itself but it magnifies existing dynamics. I think Facebook's problems have a lot in common with those of traditional news media and some unsolved social problems there, and I think you see this in at least two ways in the article's long setup:

* Right out of the gate, the story about the contractor leaking the "What responsibility do we have to keep Trump from getting elected?" screenshot.... that's pretty much the story of the dynamic that every "mainstream" news outlet faces in its relationship with movement conservative readers, who are trained by their social circle (and possibly by their own conscientious temperament, if some of the psych research I've seen is to be believed) to be extra sensitive to charges of "bias" against the tribe, and of course it only takes one data point to confirm it. So, like NPR, partly out of idealism (we *can* create a truly neutral marketplace of ideas) and partly out of incentives to not alienate an audience, they work hard to try and tell people they're a neutral platform and even build their processes around it. Outside some margin, I'd guess it rarely works. The allegation of "bias" -- which is basically a verbalization of tribal responses -- is just too useful of a thought-shorthand and rhetorical club, even if it weren't steroid-fueled daily by the conservative political discussion machine. But that doesn't stop some people from trying, in some cases egged on by conservative media players who Lucy-dangle the football of potential trust as if it were something they were ever really going to extend.

* Consider this passage of the article: "Trump would post simple messages like “This election is being rigged by the media pushing false and unsubstantiated charges, and outright lies, in order to elect Crooked Hillary!” that got hundreds of thousands of likes, comments, and shares. The money rolled in. Clinton’s wonkier messages, meanwhile, resonated less on the platform." This also isn't a Facebook problem. This is a messaging problem for any media. People have been complaining that soundbites and even lies travel better for longer than I've been alive, and I'd bet that before that the complaint was about slogans and yellow journalism.

So while Facebook's initial failure to come to grips with the problems strikes me as naive, it doesn't strike me as being unusually hidebound. I mean, I've met humanities educated young journalists who essentially make the same sets of naive mistakes. And now they're paying close attention to the problems and hiring people from journalism who have wrestled with it, and no, they're still not doing everything right in my opinion, but I suspect they've taken the low-hanging fruit and everything else they might do involves tradeoffs that aren't trivial to balance and maybe other unintended consequences. And yet it sounds like they recognize the problems and are trying. Consider: "Alex Hardiman, the head of Facebook news products and an alum of The New York Times, started to recognize that Facebook had long helped to create an economic system that rewarded publishers for sensationalism, not accuracy or depth." Or this from Andrew Anker: "This is a retreat from ‘Anything goes if it works with our algorithm to drive up engagement.’”

And then there's the central thesis of the article: Zuckerberg's bruised techno-optimism, which is some cause for schadenfreude, because the unrestrained techno-optimism has been annoying even for me as a software engineer. And it's cause for confidence: the first step in fixing a lot of problems is (a) admitting that you have a problem and (b) understanding that the philosophy and perspectives you already have in inventory might not fit every situation. Maybe it took him longer to get there than it should have, but we might be in better shape for having the problem largely playing out under a roof or two. I think if we snap our fingers and FB disappears overnight, the knowledge/tech to weaponize finely differentiated advertisements, affinity groups, filter-bubbles, etc. doesn't *disappear*, it just scatters, some of it into the hands of people who care less than FB, some into the hands of people who believe they have things to gain if they magnify the problems.

(Oh, and off topic, if there's anything new that *really* chills me in the whole article, it's the WTF of Facebook investigative staff having access to a contractor's GChats. Either this is not actually true, or Nuñez ratted Fearnow out, or Facebook obtained the contents of those communications from Google. FB's firing of staff who are leaking information about reasonable internal conversations is defensible (and I think keeping Trump from being President is a reasonable conversation), but I don't know how Google turning over user communications to a third party for any other reason than a law enforcement action would be defensible, and even there, I'm iffy. I might be done with my decade of making the tradeoffs in trusting Google for communications services. )
posted by wildblueyonder at 2:45 PM on February 17, 2018 [5 favorites]


Oh, and off topic, if there's anything new that *really* chills me in the whole article, it's the WTF of Facebook investigative staff having access to a contractor's GChats. Either this is not actually true, or Nuñez ratted Fearnow out, or Facebook obtained the contents of those communications from Google.

I feel that's a bit of a false dilemma; it's not like Facebook could obtain that info another way.
posted by MikeKD at 6:25 PM on February 17, 2018


This here is why the concept of tribalism needs to go die in a fire. It is tiresome to see the content of positions dismissed, because it's easier to just say "oh, you hold those positions because of the group you belong to."

But it is demonstrably the case that for many people, at least some of the time, they hold the positions they do because of the group they belong to. Numerous peer-reviewed studies have found this to be the case (example: Cohen 2003). Anecdotally, I remember a friend of mine in high school who was one of the few vocal Republicans in our AP Government class. During classroom discussions of current events and policies, she would frequently espouse what was essentially the Democratic party line on an issue, and then when informed of the political alignment of the position she'd just given, she'd immediately change her position to match her party identity.

The fact that "tribalism" is a real phenomenon that explains people's beliefs and behaviors doesn't mean that the content of those beliefs isn't also important, or that people's beliefs are solely governed by their social identities. But it's a mistake to think that identity plays no role in creating people's belief systems.
posted by biogeo at 9:12 AM on February 18, 2018 [1 favorite]


The fact that "tribalism" is a real phenomenon that explains people's beliefs and behaviors doesn't mean that the content of those beliefs isn't also important, or that people's beliefs are solely governed by their social identities. But it's a mistake to think that identity plays no role in creating people's belief systems.

Well no, I do believe that identity drives belief as much as the other way around. My comment was more about the popular usage of "tribalism" in popular discourse, which winds up being less about the conflation of identity and belief, and more about dismissing genuine difference of opinion on a topic through an assertion of the matter just being a conflict between two groups (with the argument often making a tacit assertion that both groups are more or less equally valid/invalid), as a sort of intellectual "justification" for riding a fencepost. That's the part about tribalism that needs to go die in a fire.
posted by NoxAeternum at 2:24 PM on February 18, 2018 [1 favorite]


“JUST AN ASS-BACKWARD TECH COMPANY”: HOW TWITTER LOST THE INTERNET WAR
At the same time, her defenders say, Harvey has been forced to clean up a mess that Twitter should have fixed years ago. Twitter’s backend was initially built on Ruby on Rails, a rudimentary web-application framework that made it nearly impossible to find a technical solution to the harassment problem. If Twitter’s co-founders had known what it would become, a third former executive told me, “you never would have built it on a Fisher-Price infrastructure.” Instead of building a product that could scale alongside the platform, former employees say, Twitter papered over its problems by hiring more moderators. “Because this is just an ass-backward tech company, let’s throw non-scalable, low-tech solutions on top of this low-tech, non-scalable problem.”

Calls to rethink that approach were ignored by senior executives, according to people familiar with the situation. “There was no real sense of urgency,” the former executive explained, pointing the finger at Harvey’s superiors, including current C.E.O. Jack Dorsey. “It’s a technology company with crappy technologists, a revolving door of product heads and C.E.O.s, and no real core of technological innovation. You had Del saying, ‘Trolls are going to be a problem. We will need a technological solution for this.’” But Twitter never developed a product sophisticated enough to automatically deal with bots, spam, or abuse. “You had this unsophisticated human army with no real scalable platform to plug into. You fast forward, and it was like, ‘Hey, shouldn’t we just have basic rules in place where if the suggestion is to suspend an account of a verified person, there should be a process in place to have a flag for additional review, or something?’ You’d think it would take, like, one line of code to fix that problem. And the classic response is, ‘That’s on our product road map two quarters from now.’”
posted by the man of twists and turns at 12:11 PM on February 20, 2018 [1 favorite]
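The "one line of code" gate the quoted executive describes is genuinely trivial, which is part of the article's point. A hypothetical sketch of such a review gate, with invented types and queue names (nothing here is Twitter's actual internals):

```python
# Sketch of the review gate the article's source describes: automated
# suspension suggestions against verified accounts get flagged for human
# review instead of being applied. Account, the action strings, and the
# queue names are all hypothetical stand-ins.
from dataclasses import dataclass


@dataclass
class Account:
    handle: str
    verified: bool


def route_action(account: Account, suggested_action: str) -> str:
    """Decide which queue a suggested moderation action goes to."""
    if suggested_action == "suspend" and account.verified:
        # The "flag for additional review" -- essentially one line.
        return "human_review"
    return "auto_apply"
```

The hard part, of course, is not this conditional but everything around it: staffing the review queue and deciding which accounts deserve the extra step, which is presumably why it kept landing "two quarters from now" on the road map.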


Twitter’s backend was initially built on Ruby on Rails, a rudimentary web-application framework that made it nearly impossible to find a technical solution to the harassment problem.

I actively dislike Rails, but this sentence makes no sense. Twitter's a mess, but I don't think Rails has much to do with it.
posted by hoyland at 5:25 AM on February 23, 2018


« Older you come across a collapsed mall from the Times...   |   Hello, human person Newer »


This thread has been archived and is closed to new comments