A moonshot is what struck me, after some reflection on the
afternoon’s dialog at the Tech Policy Press mini-symposium, Reconciling
Social Media & Democracy, held on October 7, 2021. It was crystallized by a tweet later that
evening about “an
optimistic note.” My optimism that there is a path to a much better future was
reinforced, but so was my sense of the weight of the task.
Key advocates now see the outlines of a remedial program, and
many are now united in calling for reform. But the task is unlikely to be
undertaken voluntarily by the platforms -- and is far too complex, laborious, and
uncertain to be effectively managed by legislation or existing regulatory
bodies. There seemed to be general agreement that an array of measures is promising
-- despite considerable divergence on details and priorities. The clearest consensus
was that a new, specialized, expert agency is needed to work with and guide the
industry to serve users and society.
While many of the remedies have been widely discussed, the focal
point was a lesser-known strategy arising from several
sources and recently given prominence by Francis Fukuyama and his Stanford-based
group. The highly respected Journal of Democracy featured an article
by Fukuyama, then a debate by
other scholars plus Fukuyama’s response. Our event featured
Fukuyama and most of those other debaters, plus several notable technology-focused
experts. I moderated the opening segment with Fukuyama and two of the other scholars,
drawing on my five-decade perspective on the
evolution of social media to try to step back and suggest a long-term guiding vision.
The core proposal is to unbundle the filtering of items in
our newsfeeds, creating an open market in filtering services (“middleware”) that users can choose from to
work as their agents. The idea is 1) to reduce the power of the platforms to
control for each of us what we see, and 2) to decouple that from the harmful
effects of engagement-driven business incentives that favor shock, anger, and
divisiveness. That unbundling is argued to be the only strategy that limits
unaccountable platform power over what individuals see -- power that sits like a
“loaded gun on the table,” ready to be picked up by an authoritarian platform or
government to threaten the very foundations of democracy.
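To make the unbundling idea concrete, here is a minimal sketch of what a middleware arrangement might look like in code. Everything here is illustrative -- the type names, the sample filters, and the notion of an "engagement score" are my own assumptions, not part of any real platform API or of the Fukuyama group's proposal; the point is only that the ranking decision moves from the platform to a function the user selects.

```python
# Hypothetical sketch of "middleware" filtering: the platform supplies
# candidate feed items, and a user-chosen filtering service -- acting as
# the user's agent -- decides what to show and in what order.
from dataclasses import dataclass
from typing import Callable

@dataclass
class FeedItem:
    author: str
    text: str
    engagement_score: float  # what an engagement-driven platform would optimize for

# A "filter service" is just a ranking function chosen by the user,
# not by the platform. Many competing services could implement it.
FilterService = Callable[[list[FeedItem]], list[FeedItem]]

def chronological_filter(items: list[FeedItem]) -> list[FeedItem]:
    # One possible middleware: ignore engagement signals entirely
    # (assumes items arrive in time order).
    return list(items)

def low_outrage_filter(items: list[FeedItem]) -> list[FeedItem]:
    # Another: down-rank the most engagement-bait-like items.
    return sorted(items, key=lambda item: item.engagement_score)

def render_feed(items: list[FeedItem], middleware: FilterService) -> list[str]:
    # The platform delivers the candidate items; the user's chosen
    # agent -- not the platform -- determines the final ordering.
    return [f"{item.author}: {item.text}" for item in middleware(items)]
```

In this framing, competition happens at the `FilterService` layer: users switch agents without leaving the platform, which is the decoupling of distribution from curation that the proposal describes.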
Key alternatives, favored by some, are the more familiar
remedies of shifting from extractive, engagement-driven, advertising-based
business models; stronger requirements for effective moderation and
transparency; and corporate governance reforms. These too have weaknesses:
moderation is very hard to do well no matter what, and
government enforcement of content-based moderation standards would likely fail
First Amendment challenges.
Some of the speakers are proponents of even greater
decentralization. My opening comments suggested that this be viewed as a likely
long-term direction, and that the unbundling of filters was an urgent first
step toward a much richer blend of centralized and decentralized services and
controls -- including greater user control and more granular competitive
options.
There was general agreement by most speakers that there is
no silver bullet, and that most of these remedies are needed at some level as
part of a holistic solution. There were concerns whether the unbundling of
filters would do enough to stop harmful content or filter bubble echo chambers,
but general agreement that shifting power from the platforms was important. The
recent Facebook Files and hearings make it all too clear that platform
self-regulation cannot be relied on and that all but the most innocuous efforts
at regulation will be resisted or subverted. My suggested long-term direction
of richer decentralization seemed to generate little disagreement.
This dialog may help bring more coherence to this space, but
the deeper concern is just how hard reform will be. There seemed to be full
agreement on the urgent need for a new Digital Regulatory Agency with new
powers to draw on expertise from government, industry, and academia to regulate
and monitor with an ongoing and evolving discipline (and that current proposals
to expand the FTC role are too limited).
The Facebook Files and recent whistleblower testimony may
have stirred us to action (or not?), but we need a whole-of-society effort. We
see the outlines of the direction through a thicket of complex issues, but
cannot predict just where it will lead.
That makes us all uncomfortable.
That is why this is much like the Apollo moonshot. Both are
concerted attacks on unsolved, high-risk problems -- taking time, courage,
dedication, multidisciplinary government/industry organization, massive
financial and manpower resources, and navigation through a perilous and
evolving course of trial and error.
But this problem of social media is far more consequential
than the moonshot. “The lamps are going out all over the free world, and we
shall not see them lit again in our lifetime” (paraphrasing Sir Edward Grey as
the First World War began) -- this could apply within a very few years. We face
the birthing of the next stage of democracy -- much as after Gutenberg,
industrialization, and mass media. No one said this would be easy, and our
neglect over the past two decades has made it much harder. It is not enough to
sound alarms -- or to ride off in ill-conceived directions. But there is reason
to be optimistic -- if we are serious about getting our act together.
Running updates and
additional commentary on these important issues can be found on Reisman’s blog.