
Tech Story: Social media

# THE SOCIAL MEDIA AUTOPSY

Two billion people check social media every single day. They share their memories, build friendships, organize communities. But here's what happened: the same platforms designed to connect us became weapons for spreading lies faster than truth. In her 2021 Nobel lecture, journalist Maria Ressa called the disinformation coursing through our feeds "toxic sludge" — not as opinion, but as diagnosis. This is how we built a machine that was supposed to set us free, then watched it weaponize our own attention against us.

Start in 1960, at the University of Illinois, where a computer-based education system called PLATO went online. It wasn't a household name, but it did something radical for its time: it let people communicate through the machine. In 1973 it gained a feature called Notes — essentially the first digital bulletin board. This was the DNA. The idea wasn't new. What was new was the *scale*.

Jump to 2004. Mark Zuckerberg launches Facebook from a Harvard dorm. He wasn't inventing social connection — he was automating it. Within two years, Facebook hits 12 million users. By 2008, it's 100 million. Twitter launches in 2006 and soon becomes the town square of the internet. Instagram, Pinterest, Snapchat follow. Each one understood the same mechanic: if you give people the tools to broadcast themselves, they will use them obsessively.

Here's the mechanism they all deployed: user-generated content. A person posts. Their friends see it. Their friends' friends see it through shares and comments. The platform doesn't create the content — *you do*. The company just owns the pipe. This was the genius. Traditional media — newspapers, TV, radio — operated on a one-to-many model. One broadcaster. Many listeners. Social media inverted it. Many creators. Many listeners. Everyone's both.

But here's where the math breaks: in a traditional newspaper, editors decided what ran. They had skin in the game. Social platforms delegated that power to algorithms. An algorithm is a set of rules designed to show you more of what keeps you watching. Facebook's algorithm learned in real time. The more you clicked, the more it learned. The more you stayed, the more ads it could sell.

Then came the discovery that changed everything. In 2016, researchers started measuring what the algorithm actually prioritized. It wasn't accuracy. It wasn't importance. It was *engagement*. Angry posts got clicked more than happy ones. Divisive posts more than unifying ones. The algorithm had no moral instruction. It was a machine that learned: *anger works*.
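The loop the last two paragraphs describe can be sketched in a few lines. This is a deliberately naive simulation, not anyone's actual ranking system: the categories, click rates, and update rule are all assumptions chosen for illustration.

```python
# Toy sketch of an engagement-driven feed ranker. All numbers and
# category names here are invented assumptions, not Facebook's system.
import random

def rank_feed(posts, weights):
    """Order posts by the learned weight of their category."""
    return sorted(posts, key=lambda p: weights[p["category"]], reverse=True)

def update(weights, post, clicked, lr=0.1):
    """Nudge a category's weight toward 1 on a click, toward 0 on a skip."""
    target = 1.0 if clicked else 0.0
    weights[post["category"]] += lr * (target - weights[post["category"]])

random.seed(0)
weights = {"angry": 0.5, "happy": 0.5}  # start with no preference

# Simulate readers who click angry posts 70% of the time, happy ones 30%.
for _ in range(500):
    for category, ctr in (("angry", 0.7), ("happy", 0.3)):
        update(weights, {"category": category}, clicked=random.random() < ctr)

# There is no moral instruction anywhere above, yet angry content now ranks first.
feed = rank_feed([{"category": "happy"}, {"category": "angry"}], weights)
print(feed[0]["category"])  # angry
```

The only objective in the code is "predict what gets clicked" — the anger-first feed falls out of the data, which is the point the article is making.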

By 2018, the numbers were impossible to ignore. During the 2016 U.S. election, a tiny group of "superspreaders" — accounts deliberately spreading false information — reached millions through social platforms. False posts about voting locations. About candidate positions. About which groups of people were dangerous. The platforms didn't create the lies, but their mechanics amplified them. One lie, shared by one thousand accounts, reaches fifty million people. Traditional media couldn't do that if they tried.
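The fifty-million figure is back-of-the-envelope fan-out arithmetic rather than a measured statistic. Under the assumed simplification that each account averages 50,000 followers and audiences don't overlap:

```python
# Back-of-the-envelope reach of one coordinated false post.
# avg_followers is an assumed value chosen to match the article's
# illustration; real audiences overlap, so this is an upper bound.
accounts = 1_000        # coordinated "superspreader" accounts
avg_followers = 50_000  # assumed average audience per account

impressions = accounts * avg_followers
print(f"{impressions:,} feeds see the post")  # 50,000,000 feeds see the post
```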

Here's what changes how you see this: the platforms *knew*. In internal documents that emerged years later, Facebook researchers documented exactly what was happening. A 2018 internal research paper found that their algorithm was driving polarization. Not as a side effect — as the core output. The algorithm learned to show you content that made you angrier at people different from you, because that anger kept you scrolling.

But they didn't stop. They didn't redesign. They didn't decelerate. Why? Because engagement metrics directly determined company value. Facebook's stock price was tied to daily active users. Algorithmic amplification of divisive content increased engagement. Higher engagement meant higher stock price. The problem wasn't that Facebook executives were cartoonishly evil — it's that the *incentive structure* made the toxic outcome the profitable outcome.

The real autopsy moment: social media didn't fail. It succeeded *exactly as designed*. It connected billions of people. It amplified content at unprecedented scale. It monetized attention. The systems worked flawlessly. But "working" and "healthy" turned out to be different things entirely.

The lesson isn't about social media specifically. It's about what happens when you optimize for one metric and ignore everything else. The platforms optimized for engagement, and they got it. What they didn't optimize for — accuracy, social cohesion, mental health, democracy — didn't get built. Not because the engineers were incompetent, but because the business model had no reason to care.

This pattern repeats. Kodak invented the digital camera and couldn't sell it because film was their revenue. Boeing prioritized speed to market over engineering caution and paid for it with the 737 MAX. Optimize for the wrong metric, and the metric is all you get.


Tech Postmortem
