<p>One of the most unsettling revelations in the cache of internal documents leaked by former Facebook employee Frances Haugen has been just how little we know about Facebook, and consequently how unprepared our political culture is to do anything about it, whatever <em>it </em>is.</p>.<p>That’s the first problem in fixing Facebook — there isn’t much agreement about what, exactly, the problem with Facebook is. The left says it’s Facebook’s amplification of hate, extremism and misinformation about, among other things, vaccines and the last presidential election. President Joe Biden put it bluntly this summer: “They’re killing people.”</p>.<p>Former President Donald Trump and others on the right say the opposite: Social media giants are run by liberals bent on silencing opposing views. In a statement last week, Trump called Mark Zuckerberg, Facebook’s founder, “a criminal” who altered “the course of a Presidential Election.”</p>.<p><strong>Also Read | <a href="https://www.deccanherald.com/business/technology/microsoft-president-brad-smith-says-tech-must-compromise-downplays-metaverse-hype-1047190.html" target="_blank">Microsoft president Brad Smith says tech must compromise, downplays metaverse 'hype'</a></strong></p>.<p>Beyond concerns about the distortion of domestic politics, there are a number of other questions about Facebook, Instagram and WhatsApp — all of which, Zuckerberg announced last week, are now under a new corporate umbrella called Meta. Is Instagram contributing to anxiety and body-shaming among teenagers? Are Facebook’s outrage-juicing algorithms destabilizing developing countries, where the company employs fewer resources to monitor its platform than it does in its large markets? Is Facebook perpetuating racism through biased algorithms? Is it the cause of global polarization, splitting societies into uncooperative in-groups?</p>.<p>Inherent in these concerns is a broader worry — Facebook’s alarming power. The company is among the largest collectors of humanity’s most private information, one of the planet’s most-trafficked sources of news, and it seems to possess the ability, in some degree, to alter public discourse. Worse, essentially all of Facebook’s power is vested in Zuckerberg alone. This feels intolerable; as the philosopher Kanye West put it, “No one man should have all that power.”</p>.<p>So, what to do about all this? In the past few days I asked more than a dozen experts this question. Here are some of their top ideas, and what I think about them.</p>.<p><strong>Break it up</strong></p>.<p>Under the tech-friendly Obama administration, the Justice Department and the Federal Trade Commission allowed Facebook to swallow up quick-growing potential rivals. 
Splitting Facebook into three or more independent companies would undo that regulatory misstep and instantly reduce Zuckerberg’s power over global discourse.</p>.<p>It could also improve the tenor of social media, as the newly independent networks “would compete with each other by differentiating themselves as better and safer products,” said Matt Stoller, director of research at the American Economic Liberties Project, an anti-monopoly advocacy group.</p>.<p><strong>Also Read | <a href="https://www.deccanherald.com/business/technology/facebook-working-on-ways-to-protect-users-in-the-metaverse-1046856.html" target="_blank">'Facebook working on ways to protect users in the metaverse'</a></strong></p>.<p>Still, as Stoller notes, a breakup might be a necessary measure, but it’s hardly sufficient; competition notwithstanding, after a split we’d be left with three networks that retain Facebook’s mountainous data and its many corporate pathologies.</p>.<p>The breakup plan also faces steep hurdles. Over the last few decades, American antitrust law has grown fecklessly friendly to corporations. It’s unclear how to undo that. In June, a federal judge threw out sprawling antitrust cases against Facebook brought by the FTC and 40 states, saying that they had failed to prove that Facebook is a social media monopoly.</p>.<p><strong>Place limits on its content</strong></p>.<p>Imposing rules for what Facebook can and cannot publish or amplify has been a hot topic among politicians. Democrats in Congress have introduced proposals to police misinformation on Facebook, while lawmakers in Texas and Florida have attempted to bar social media companies from kicking people off for speech offenses, among them Trump.</p>.<p>As I wrote last week, these policies give me the creeps, since they inevitably involve the government imposing rules on speech. Just about all of them seem to violate the First Amendment.</p>.<p>Yet bizarrely, content rules have become the leading proposals for fixing Facebook; repeal of Section 230 of the Communications Decency Act — which limits tech platforms’ liability for damages stemming from content posted by users — is often described as a panacea. Among the many ways to address Facebook’s ills, speech rules are the least palatable.</p>.<p><strong>Regulate ‘surveillance capitalism’</strong></p>.<p>Here is a seemingly obvious way to cut Facebook off at the knees: Prohibit it from collecting and saving the data it has on us, thereby severely hampering its primary business, targeted advertising.</p>.<p>The rationale for this is straightforward. Imagine we determine that the societal harms generated by “surveillance capitalism,” Harvard professor Shoshana Zuboff’s aptly creepy label for the ad-tech business, pose a collective danger to public safety. In other such industries — automobiles, pharmaceuticals, financial products — we mitigate harms through heavy regulation; the digital ad industry, meanwhile, faces few limits on its conduct.</p>.<p><strong>Also Read | <a href="https://www.deccanherald.com/national/over-269-million-content-pieces-actioned-on-facebook-in-september-meta-1046416.html" target="_blank">Over 26.9 million content pieces 'actioned' on Facebook in September: Meta</a></strong></p>.<p>So let’s change that. Congress could impose broad rules on how ad behemoths like Facebook and Google collect, save and use personal information.
Perhaps more important, it could create a regulatory agency with resources to investigate and enforce the rules.</p>.<p>“At a minimum,” said Roger McNamee, an early Facebook investor who is now one of its most vocal critics, regulators should ban second- and third-party uses of the most intimate data, “such as health, location, browser history and app data.”</p>.<p>Privacy rules are one of the primary ways European regulators have attempted to curb social media’s effects. So why don’t we hear more about them in America?</p>.<p>I suspect it’s because this is a bigger-than-Facebook solution. All the tech giants — even Apple, which has criticized the digital ad business’s hunger for private data — make billions of dollars from ads, and there are lots of other companies that have grown dependent on ad targeting. When California attempted to improve consumer privacy, corporate lobbyists pushed to get the rules watered down. I worry that Congress wouldn’t fare much better.</p>.<p><strong>Force it to release internal data</strong></p>.<p>Nathaniel Persily, a professor at Stanford Law School, has a neat way of describing the most basic problem in policing Facebook: “At present,” Persily has written, “we do not know even what we do not know” about social media’s effect on the world.</p>.<p>Persily proposes piercing the black box before we do anything else. He has written draft legislation that would compel large tech platforms to provide to outside researchers a range of data about what users see on the service, how they engage with it, and what information the platform provides to advertisers and governments.</p>.<p><strong>Also Read | <a href="https://www.deccanherald.com/business/business-news/not-just-facebook-these-companies-are-also-in-quest-for-the-metaverse-1046317.html" target="_blank">Not just Facebook, these companies are also in quest for the metaverse</a></strong></p>.<p>Rashad Robinson, president of the civil rights advocacy group Color of Change, favored another proposed law, the Algorithmic Justice and Online Platform Transparency Act, which would also require that platforms release data about how they collect and use personal information about, among other demographic categories, users’ race, ethnicity, sex, religion, gender identity, sexual orientation and disability status, in order to show whether their systems are being applied in discriminatory ways.</p>.<p>Tech companies savor secrecy, but other than their opposition it’s difficult to think of many downsides to transparency mandates. Even if we do nothing to change how Facebook operates, we should at least find out what it’s doing.</p>.<p><strong>Improve digital literacy</strong></p>.<p>Renée DiResta, technical research manager at the Stanford Internet Observatory and a longtime scholar of the anti-vaccine movement’s digital presence, described one idea as “unsexy but important”: Educating the public to resist believing everything they see online.</p>.<p>This is not just a thing for schools; some of the most egregious amplifiers of online mendacity are older people.</p>.<p>What we need, then, is something like a society-wide effort to teach people how to process digital information. For instance, Mike Caulfield, an expert on digital literacy at the University of Washington, has developed a four-step process called SIFT to assess the veracity of information.
After Caulfield’s process becomes ingrained in his students, he has said, “we’re seeing students come to better judgments about sources and claims in 90 seconds than they used to in 20 minutes.”</p>.<p><strong>Do nothing</strong></p>.<p>In his new book, “Tech Panic: Why We Shouldn’t Fear Facebook and the Future,” Robby Soave, an editor at Reason magazine, argues that the media and lawmakers have become too worked up about the dangers posed by Facebook.</p>.<p>He doesn’t disagree that the company’s rise has had some terrible effects, but he worries that some proposals could exacerbate Facebook’s dominance — a point with which I agree.</p>.<p>The best remedy for Facebook, Soave told me in an email, is to “do nothing, and watch as Facebook gradually collapses on its own.”</p>.<p>Soave’s argument is not unreasonable. Once-indomitable tech companies have fallen before. Facebook still makes lots of money, but it has lost consumers’ trust, its employees are upset and leaking left and right, and because most of its popular products came through acquisitions — which regulators are likely to bar in the future — it seems unlikely to innovate its way out of its troubles.</p>.<p>I don’t agree with Soave that we should do absolutely nothing about Facebook. I would favor strong privacy and transparency rules.</p>.<p>But Soave will probably get what he wants. As long as there’s wide disagreement among politicians about how to address Facebook’s ills, doing nothing might be the likeliest outcome.</p>