A Supreme Court case could kill Facebook and other social networks, opening the door for blockchain to replace them

The Internet - arguably the greatest invention in human history - has gone awry. We all feel it. It's harder than ever to tell whether we're talking to friend, foe or bot; we know we're constantly being tracked for better ad conversions; and we live in constant fear of clicking something and getting scammed.

The failures of the Internet stem largely from the failure of the big tech monopolies - especially Google and Facebook - to secure and protect our identities. Why don't they?
The answer is that they have no incentive to do so. In fact, the status quo suits them just fine, thanks to Section 230 of the Communications Decency Act, passed by the US Congress in 1996.

But things may be about to change. The Supreme Court is hearing Gonzalez v. Google, a case that could narrow or even repeal Section 230. It's hard to imagine a scenario in which that doesn't kill the social media platforms we use today - and it would give blockchain technology a prime opportunity to replace them.

 How did we get here? 

A major contributor to the Internet's early development, Section 230 states that online platforms are not legally responsible for the content their users post. As a result, social networks like Facebook and Twitter can host (and profit from) whatever their users publish.

According to the plaintiff in the pending case, Internet platforms share the blame for the death of his daughter, who was killed by Islamic State attackers in a Paris restaurant in 2015. He argues that algorithms developed by YouTube and its parent company Google "suggested ISIS videos to users," which aided recruitment for the terrorist organization and ultimately contributed to the Paris attack.

Section 230 gives YouTube a lot of cover. If a user posts defamatory or, as in the case above, violent content, the platform can leave it up, and it may reach many viewers before any action is taken. Deciding whether content violates the law or the platform's terms takes time, and in the meantime real damage can be done. But Section 230 protects the platform from liability.

Imagine YouTube after Section 230 is repealed. Would it have to hold the 500 hours of content uploaded every minute in a review queue before anyone could see it? That wouldn't scale, and it would strip away much of the immediacy that makes the site's content appealing. Or would it publish content as is, but accept legal responsibility for every copyright infringement, incitement to violence or defamatory word uttered in any of its billions of videos?
Without Section 230's protections, platforms like YouTube would quickly unravel.

Global Implications for the Future of Social Media 

The case turns on US law, but the issues it raises are global. Other countries are also wrestling with how to regulate online platforms, especially social media. France recently ordered manufacturers to install easily accessible parental controls on all computers and devices and banned the collection of minors' data for commercial purposes. In the UK, an official inquest found that Instagram's algorithm contributed to the suicide of a teenager.

Then there are the world's authoritarian regimes, which are intensifying censorship and manipulation, deploying armies of trolls and bots to sow misinformation and mistrust. The absence of verified identity on most social media makes this not just possible but inevitable.

And the economic beneficiaries of a post-Section 230 world may not be who you expect. Many people would relish seeing the big tech platforms in court. But in a world where social media companies could be held legally responsible for the content posted on their platforms, armies of editors and content moderators would have to be assembled to scrutinize every image and word published on their sites. Given the volume of content posted to social media over the past few decades, the task seems almost impossible - and the likely winners would be traditional media organizations.

Looking a little further ahead, repealing Section 230 would completely upend the business models that fueled social media's growth. Platforms would suddenly have to answer for an almost unlimited supply of user-generated content, while increasingly strict privacy laws limit their ability to collect vast amounts of user data. That would demand a wholesale reinvention of what social media is.

Many people misunderstand platforms like Twitter and Facebook. They assume the product is the software they use to log in, post content and browse. It isn't. The product is moderation. And if the Supreme Court strikes down Section 230, it will completely change the products we think of as social media.

 This is a great opportunity. 

In 1996, the Internet consisted of a relatively small number of static websites and message boards. No one could have predicted that its growth would one day force people to rethink basic concepts of freedom and security.

People have the same basic rights in their digital lives as in their physical ones - including privacy. At the same time, the common good requires some mechanism for distinguishing fact from falsehood and honest public actors from frauds. Today's Internet satisfies neither need.

Some argue, explicitly or implicitly, that a healthier and safer digital future requires painful trade-offs between privacy and security. But if we are ambitious and determined, we can have both.

Blockchains make it possible to protect and prove our identities at the same time. Thanks to zero-knowledge technology, we can verify a piece of information - such as age or professional qualifications - without revealing the underlying data. Soulbound tokens (SBTs), decentralized identifiers (DIDs) and certain nonfungible tokens (NFTs) will soon let us carry a single, cryptographically provable identity across any current or future digital platform.
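To make "verify without revealing" concrete, here is a minimal, hypothetical Python sketch of predicate disclosure: an issuer who knows a user's birthdate attests only to the statement "over 18," so the verifier never sees the birthdate itself. This is a toy stand-in for real zero-knowledge proofs or verifiable credentials - all names are invented, and a shared HMAC key replaces real cryptographic signatures:

```python
import hmac
import hashlib
from datetime import date

# Demo-only shared secret. A real system would use public-key signatures
# or zero-knowledge proofs, not a key shared between issuer and verifier.
ISSUER_KEY = b"issuer-secret-demo-key"

def issue_age_credential(subject: str, birthdate: date) -> tuple[str, str]:
    """Issuer side: derive the 'over 18' predicate from the birthdate,
    sign only the predicate, and never disclose the birthdate itself."""
    today = date(2024, 1, 1)  # fixed date so the example is reproducible
    age = (today.year - birthdate.year
           - ((today.month, today.day) < (birthdate.month, birthdate.day)))
    claim = f"{subject}|over18:{age >= 18}"
    tag = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return claim, tag  # only the predicate and its tag travel onward

def verify_credential(claim: str, tag: str) -> bool:
    """Verifier side: check the issuer's tag over the disclosed predicate.
    The verifier learns 'over 18: True', not the birthdate."""
    expected = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

claim, tag = issue_age_credential("alice", date(2000, 5, 17))
print(claim)                          # alice|over18:True
print(verify_credential(claim, tag))  # True
```

A tampered claim (say, changing the predicate to reach a different audience) fails verification, which is the property that lets a platform trust an age attestation without ever handling the document behind it.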

This benefits all of us in our work, personal and family lives. Schools and social media become safer places, adult content can be reliably age-restricted, and deliberate misinformation becomes easier to trace.
The end of Section 230 would be an earthquake. But if we approach it constructively, it could also be a great opportunity to improve the Internet we know and love. Once our identities are verified and cryptographically proven on-chain, we can better establish who we are, where we are and whom we can trust.

Nick Dazé is the co-founder and CEO of Heirloom, a company providing no-code tools that help brands create secure online environments for their customers using blockchain technology. Dazé also co-founded PocketList and was an early team member at Faraday Future ($FFIE), Fullscreen (acquired by AT&T) and Bit Kitchen (acquired by Medium).

 This article is for general information purposes and is not intended to be, and should not be construed as, legal or investment advice. The views, thoughts and opinions expressed here are solely those of the author and do not necessarily reflect or represent the views and opinions of Cointelegraph.
