
Dear Facebook, This is How You’re Breaking Democracy: A Former Facebook Insider Explains How the Platform’s Algorithms Polarize Our Society

Is this what we want? A post-truth world where toxicity and tribalism trump bridge building and consensus seeking? —Yaël Eisenstat

It’s an increasingly familiar occurrence.

A friend you’ve enjoyed reconnecting with in the digital realm makes a dramatic announcement on their social media page. They’re deleting their Facebook account within the next 24 hours, so shoot them a PM with your email if you’d like to stay in touch.

Such decisions used to be spurred by the desire to get more done or return to neglected pastimes such as reading, painting, and going for long unconnected nature walks.

These announcements could induce equal parts guilt and anxiety in those of us who depend on social media to get the word out about our low-budget creative projects, though, being prone to Internet addiction, we were nearly as likely to be the ones making the announcement.

For many, the break was temporary. More of a social media fast, a chance to reevaluate, rest, recharge, and ultimately return.

Legitimate concerns were also raised with regard to privacy. Who’s on the receiving end of all the sensitive information we’re offering up? What are they doing with it? Is someone listening in?

But in this election year, the decision to quit Facebook is apt to be driven by the very real fear that democracy as we know it is at stake.

Former CIA analyst, foreign service officer, and, for six months, Facebook’s Global Head of Elections Integrity Ops for political advertising, Yaël Eisenstat addresses these preoccupations in her TED Talk, "Dear Facebook, This is How You're Breaking Democracy," above.

Eisenstat contrasts the polarization and culture of hatred that Facebook’s algorithms foment with the civility of her past face-to-face, “hearts and minds”-based engagements with suspected terrorists and anti-Western clerics.

As many users have come to suspect, Facebook rewards inflammatory content with amplification. Truth does not factor into the equation, nor does sincerity of message or messenger.

Lies are more engaging online than truth. As long as [social media] algorithms' goals are to keep us engaged, they will feed us the poison that plays to our worst instincts and human weaknesses.

Eisenstat, who has valued the ease with which Facebook allows her to maintain relationships with far-flung friends, found herself effectively demoted on her second day at the social media giant, her title revised, and her access to high-level meetings revoked. Her hiring appears to have been purely ornamental, a palliative ruse in response to mounting public concern.

As she remarked in an interview with The Guardian’s Ian Tucker earlier this summer:

They are making all sorts of reactive changes around the margins of the issues, [to suggest] that they are taking things seriously – such as building an ad library or verifying that political advertisers reside in the country in which they are advertising – things they should have been doing already. But they were never going to make the fundamental changes that address the key systemic issues that make Facebook ripe for manipulation, viral misinformation and other ways that the platform can be used to affect democracy.

In the same interview she asserted that Facebook’s recently implemented oversight board is little more than an interesting theory that will never result in the total overhaul of its business model:

First of all, it’s another example of Facebook putting responsibility on someone else. The oversight board does not have any authority to actually address any of the policies that Facebook writes and enforces, or the underlying systemic issues that make the platform absolutely rife for disinformation and all sorts of bad behaviour and manipulation.

The second issue is: it’s basically an appeal process for content that was already taken down. The bigger question is the content that remains up. Third, they are not even going to be operational until late fall and, for a company that claims to move fast and break things, that’s absurd.

Nine minutes into her TED Talk, she offers concrete suggestions for what the Facebook brass could do if they were truly serious about implementing reform:

  • Stop amplifying and recommending disinformation and bias-based hatred, no matter who is behind it, from conspiracy theorists to our current president.
  • Discontinue personalization techniques that don’t differentiate between targeted political content and targeted ads for athletic footwear.
  • Retrain algorithms to focus on metrics beyond what users click or linger on.
  • Implement safety features that would ensure that sensitive content is reviewed before it is allowed to go viral.

Hopefully viewers are not feeling maxed out on contacting their representatives, as government enforcement is Eisenstat’s only prescription for getting Facebook to alter its product and profit model. And that will require sustained civic engagement.

She supplements her TED Talk with two recommendations for further reading: artificial intelligence engineer Guillaume Chaslot’s insider-perspective op-ed "The Toxic Potential of YouTube’s Feedback Loop" and The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think by MoveOn.org's former Executive Director, Eli Pariser.

Your clued-in Facebook friends have no doubt already pointed you to the documentary The Social Dilemma.

Read the transcript of Yaël Eisenstat’s TED Talk here.

Related Content: 

The Problem with Facebook: “It’s Keeping Things From You”

The Case for Deleting Your Social Media Accounts & Doing Valuable “Deep Work” Instead, According to Computer Scientist Cal Newport

This Is Your Kids’ Brains on Internet Algorithms: A Chilling Case Study Shows What’s Wrong with the Internet Today

Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine. Follow her @AyunHalliday.
