The crazy pendulum swings of centralized moderation by dominant social media platforms are all over the news again, as nicely summarized by Will Oremus, and explored by a stellar Lawfare panel of experts.
We have seen a swing toward what many perceive as blunt over-moderation and censorship in 2016-17, and now a swing away, toward what others view as irresponsibly enabling uncontrolled cesspools of anger, hate, and worse. This pendulum is clearly driven in large part by the political winds (which it influences in turn), by the question of whose ox gets gored, and by who has the power to influence the platforms -- "Free speech for me, but not for thee."
This will remain a crazy pendulum -- and one that can destroy the human community and its collective intelligence -- until we step back and take a smarter approach to the context and diversity of our perceptions of speech. Shifting toward community moderation, as X/Twitter and Meta/Facebook/Threads are now doing, may point in the right direction: to democratize that control -- but it is being done with only a minimal and flawed hint of real democracy. Even if they really try, centralized platforms are inherently incapable of doing that well. Here are some quick notes on why, and on how to do better.
Three of the speakers on the Lawfare panel were coauthors/contributors with me in a comprehensive white paper, based on a symposium on a partially decentralized approach called "middleware." That paper proposes an open market of independent curation and moderation services that sit in the middle, between users and the platforms. These services can do community-based moderation in a full range of ways, much more like the way traditional communities have always done "moderation" (better thought of as "mediation") of how we communicate with others. This new middleware paper explains the basics, why it is a promising solution, and how to make it happen. (For a real-world example of middleware, still in its infancy, consider Bluesky.)
As for the current platform approach to "community moderation," many have critiqued it, but I suggest a deeper way to think about this, more in line with how humans have always mediated their speech. Three Pillars of Human Discourse (and How Social Media Middleware Can Support All Three) is a recent piece that extends current ideas on middleware to support this kind of mediation as it has evolved over centuries of human society.
Specific to community moderation
The Augmented Wisdom of Crowds: Rate the Raters and Weight the Ratings (from 2018) digs deeper into why simplistic attempts at community surveys fail, and how the same kind of advanced analysis of human inputs that made Google win the search engine wars can be applied to social media. A 2021 post updates that.
To understand why this is important, consider what I call The Zagat Olive Garden Problem. In the early 90s, I noticed this in the popular community-rating guides Zagat published on restaurants: the top ten or so restaurants in NYC were almost all high-priced haute cuisine or comparably refined, but one was Olive Garden. Because Olive Garden food was just as good? No, because far more people knew it from its many locations, and far more were attracted to a familiar brand with reasonably tasty food at modest prices.
Doing surveys where all votes are counted equally sounds democratic, but it is foolishly so. We really want ratings from those with a reputation for tastes we relate to and trust -- leavened with a healthy diversity that broadens our horizons. That is what good feed and recommender algorithms can do. We need to "rate the raters and weight the ratings."
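To make that slogan concrete, here is a minimal, hypothetical sketch in Python -- not the algorithm from my papers, just one simple way the idea could work: raters whose ratings track the weighted consensus earn more trust, and the consensus is recomputed with those trust weights. All names and numbers here are illustrative.

```python
# Hypothetical sketch of "rate the raters and weight the ratings":
# iteratively reweight raters by how well their ratings track the
# weighted community consensus.
import numpy as np

def rate_the_raters(ratings, iterations=20):
    """ratings: 2-D array [raters x items], with np.nan where a rater did not rate."""
    n_raters, n_items = ratings.shape
    weights = np.ones(n_raters)                    # start with equal trust in everyone
    for _ in range(iterations):
        # Weighted consensus score for each item, ignoring missing ratings.
        mask = ~np.isnan(ratings)
        filled = np.where(mask, ratings, 0.0)
        w = weights[:, None] * mask
        consensus = (w * filled).sum(axis=0) / np.maximum(w.sum(axis=0), 1e-9)
        # A rater's new weight falls with their average disagreement
        # from the consensus on the items they actually rated.
        diff = np.where(mask, filled - consensus, 0.0)
        error = np.abs(diff).sum(axis=1) / np.maximum(mask.sum(axis=1), 1)
        weights = 1.0 / (1.0 + error)              # simple inverse-error trust
        weights /= weights.sum()
    return consensus, weights

# Tiny illustrative example: two raters share similar tastes, one diverges.
scores, trust = rate_the_raters(np.array([
    [5.0, 1.0, 4.0],
    [4.5, 1.5, np.nan],
    [1.0, 5.0, 2.0],
]))
```

A real system would add the "healthy diversity" leavening -- for example, by blending in some ratings from outside one's trusted cluster -- but even this toy version shows how equal-vote counting and reputation-weighted counting can produce very different rankings.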
Back to the pendulum model, consider how pendulums work -- especially the phenomenon of entrainment. Glossing over many details...
- Back in the 1660s, Huygens, inventor of the pendulum clock, discovered that if two clocks were mounted on the same wall, their pendulum swings gradually became synchronized. That is because each interacts with the shared wall, exchanging energy in a way that brings the two into sync.
- Simplistically, moderation is a pendulum that can swing from false positives to false negatives. Each platform has one big pendulum, controlled by one person/corporation, that swings with the political wind (or other platform influences). Platform-level community moderation entrains everyone to that one pendulum, whether it fits or not -- producing many false positives and false negatives, often biased one way or the other.
- A distributed system of middleware services can serve many individuals or communities, each with their own pendulums that swing to their own tastes.
- Within communities, these pendulums are linked (the shared wall) and tend to entrain.
- Across communities, there are also weaker linkages, in different dimensions, so still a nudge toward entrainment.
- In addition to spanning many dimensions, the walls of human connection are not rigid but somewhat elastic in how they entrain.
- The Google PageRank algorithm is based on advanced math (eigenvector analysis) that treats individual search engine users as clustering into diverse communities of interest and value -- much like a network of pendulums all linked to one another by elastic "walls" in a multidimensional array.
- Similar algorithms can be used by diverse middleware services to distill community ratings with the same nuanced sensitivity to their diverse community contexts (a minimal sketch of this kind of math follows this list). Not perfectly, but far better than any centralized system.
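To illustrate the eigenvector math alluded to above, here is a minimal PageRank-style power iteration over a hypothetical "who trusts whom" graph. This is not how Google or any actual middleware service implements it; it is only a sketch of the core idea that trust can flow through a network of links and settle into a principal-eigenvector-like ranking that a community-specific service could use to weight its raters.

```python
# Minimal PageRank-style power iteration over a hypothetical trust graph.
# The graph and parameters are illustrative only; a real middleware service
# would use its own community's signals (follows, endorsements, helpful-votes).
import numpy as np

def trust_rank(adjacency, damping=0.85, iterations=100):
    """adjacency[i, j] = 1 if member i trusts/endorses member j."""
    n = adjacency.shape[0]
    # Column-stochastic transition matrix: each member spreads trust evenly
    # among those they endorse (or among everyone, if they endorse no one).
    out = adjacency.sum(axis=1, keepdims=True)
    transition = np.where(out > 0, adjacency / np.maximum(out, 1), 1.0 / n).T
    rank = np.full(n, 1.0 / n)
    for _ in range(iterations):
        rank = (1 - damping) / n + damping * transition @ rank
    return rank / rank.sum()   # principal-eigenvector-like trust scores

# Example: member 2 is endorsed by both others and ends up most trusted.
print(trust_rank(np.array([
    [0, 1, 1],
    [0, 0, 1],
    [1, 1, 0],
], dtype=float)))
```

Run separately for each community (or for overlapping communities in multiple dimensions), this kind of computation yields different trust weightings for different contexts -- which is exactly the sensitivity a single platform-wide pendulum cannot provide.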