Your Facebook Algorithm Is Screwing With You


Like many other New Yorkers, I went to my polling station early Tuesday morning and stood in line to vote with my husband and my daughter, who celebrated her first birthday on Election Day. We cast our ballots for Hillary Clinton, received our “I voted!” stickers, and left with the excitement that we were helping elect — at least symbolically, because we live in a very blue state — our first woman president.

But, Wednesday morning I found myself grieving, and not just because Trump won. I grieved for the fact that my information streams — Facebook, Twitter, Instagram — had given me false hope and a false sense of comfort that she would win. And they had, for months. All day, because it was Election Day, I checked in and saw friends posting voting selfies, sneaking ballot photos at their polling sites, talking about the privilege of voting in our democracy, and sharing photos of Hillary over her career in politics paired with inspirational quotes. I saw Susan B. Anthony and the suffragettes, babies in The Future Is Female T-shirts, feminist fathers embracing their daughters and promising them that tomorrow a woman would lead the free world, and hopeful messages of triumph over misogyny, racism, ignorance and hate. I saw 100% Hillary. Or at least 99%.

How did this happen?


The “mainstream media” has received the bulk of the critique for catalyzing Trump’s message — that despite their takedowns they’ve been megaphones, for paying endless attention to him, for failing to confront fabrications with facts. But, going back to my grief on Wednesday morning: I grieved for feeling like I was caught off guard. For the hope I had, a hope that felt affirmed by everyone around me. By the overwhelming critical mass on Instagram and Facebook and Twitter who all seemed to be agreeing: we’re going to win. I can’t tell you how many times I’ve heard in the aftermath, “I just wasn’t prepared for the fact that she could lose.”

For this, I ask: What about the responsibility of the platforms that serve up that media and that information? There are 1.79 billion people on Facebook, 317 million people on Twitter, and over 500 million people on Instagram. These platforms are global communities offering a chance for diverse people from all over the world, at all socioeconomic tiers, holding political views on all parts of the spectrum, to engage with one another. This is a profound opportunity to allow people with disparate opinions to connect and communicate, to dialogue in public.

As far as I can tell, that’s not happening. Instead, we see reflections of our own views. Content you share is met with suggested content that is similar. Photos and words from your friends are retweeted, reposted and requoted by friends and friends-of-friends and friends-of-friends-of-friends. You can “<3” or “thumbs up” something, but not thumbs down. These are platforms of approval. The more you are the same, the better.
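To make “like to like” concrete, here is a toy sketch of similarity-based ranking. This is not Facebook’s, Twitter’s or Instagram’s actual algorithm — none of us outside those companies can see those — just a minimal illustration, with made-up posts and tags, of why ranking a feed by similarity to what you already liked produces a wall of agreement:

```python
# Toy "like-to-like" feed ranking: a hypothetical illustration,
# NOT any platform's real recommendation system.

def jaccard(a, b):
    """Overlap between two sets of topic tags (0 = nothing shared, 1 = identical)."""
    return len(a & b) / len(a | b)

def rank_feed(liked_tags, candidate_posts):
    """Order candidate posts by similarity to what the user already liked."""
    return sorted(candidate_posts,
                  key=lambda post: jaccard(liked_tags, post["tags"]),
                  reverse=True)

# Tags drawn from the posts a user has already "<3"-ed.
liked = {"clinton", "voting", "feminism"}

posts = [
    {"title": "Voting selfie",       "tags": {"voting", "clinton"}},
    {"title": "Suffragette history", "tags": {"feminism", "voting"}},
    {"title": "Trump rally report",  "tags": {"trump", "rally"}},
]

feed = rank_feed(liked, posts)
# The dissimilar post sinks to the bottom: the feed mirrors the user back.
```

The point is structural: a ranker like this never rewards disagreement, so the unfamiliar post quietly falls off the bottom of the feed — and the more you engage, the more uniform the tags you are matched against become.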

As a person who has worked in tech for over a decade, I understand that the more you can group similar people, the easier it is to sell ads. This is because it’s easier to target them with products and services that their group will buy into. Those ad units are more lucrative if you can promise your clients that they are reaching a specific group, like white conservative Midwestern women versus liberal retirees in Santa Cruz. I work at a company that has occasionally run Facebook ads, and have friends who work at Facebook, so I know how robust the targeting is and how complex the technology is.


Data science and advertising technology are relatively young. How we use these tools — to make predictions about election results and about human behavior, to further our revenue streams and our businesses — has profound implications. As Facebook, Instagram, Twitter and other platforms continue to tweak their algorithms, they often further the agenda of generating revenue. And that is great — for them.

But, this also means they are serving like to like and furthering the hall of mirrors. Though these consequences may be unintended, this means we will continue to exist in artificially walled gardens, where we can feel comfortable and validated because our opinions and photos and lives seem perfectly in line with others. And as these election results show, completely disconnected from reality.

Let us hold the technology we use responsible, and ask the people who make decisions about these communities (looking at you: Mark Zuckerberg and Evan Williams!) to recognize their social responsibility to enable dialogue between people with views and opinions different from our own. Platforms are a means to develop diverse communities, build more empathy and experience a more holistic picture of the national and global dialogue, so we can take action accordingly.

Youngna Park is the COO at Tinybop, a startup making educational apps for kids. Follow her on Twitter @youngna and Instagram @youngnapark.

Illustration by Emily Zirimis.

