Oh, my piece for the Institute of Network Cultures got published and I missed it!

Fighting Disinformation: We’re Solving The Wrong Problems
networkcultures.org/tactical-m

> The reason why misinformation and disinformation spread so fast is that our most commonly used communication tools have been built in a way that promotes that kind of content over fact-checked, long-form, nuanced reporting.

> One could call this the “outrage dividend”, and disinformation benefits especially handsomely from it.

> According to The Washington Post, “Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content — including content likely to make them angry.”

> I am not saying Facebook intentionally designed its platform to become the best tool a malicious disinformation actor could dream of. This might have been an innocent mistake, an unintended consequence of how the post-promoting algorithm works.

> But in large systems, even tiny mistakes compound to become huge problems, especially over time. And Facebook happens to be a gigantic system that has been with us for almost two decades.

> In the immortal words of fictional Senator Soaper: “To err is human, but to really foul things up you need a computer.”

Pretty happy with that piece, tbh.
