Don’t leave developers behind in the Section 230 debate

by Janice Allen

Last week, the U.S. Supreme Court reviewed Section 230 of the Communications Decency Act of 1996 for the first time. During oral arguments in the Gonzalez v. Google case, important questions were raised about platform responsibility and the risks of viral content.

As the court wrestles with these questions, it’s an opportunity to reflect on why 230 was created in the first place, how it fosters innovation, and what we all could lose if the protections embedded in 230 are narrowed.

Section 230, dubbed “the twenty-six words that created the internet” by Jeff Kosseff, established a liability shield for platforms hosting third-party content. In the early days of the Internet, 230 created favorable legal conditions for startups and entrepreneurs to flourish, cementing the United States as a world leader in software.

While today’s tech landscape differs dramatically from the fledgling Internet of the 1990s, the reasoning behind Section 230 still holds true today. The legal architecture that creates the conditions for innovation can also chill it.

Seemingly lost in arguments centered on the outsized influence of major social media platforms is an appreciation of how Section 230 supports the wider online ecosystem, especially software developers. Developers are at the heart of our online world and at the forefront of creating solutions to global challenges, working to make the software that underpins our digital infrastructure more secure and reliable.

Policymakers must recognize the critical role of developers and work to support them, not stifle innovation.

Developers rely on 230 to collaborate on platforms like GitHub and to build and operate new platforms that rethink social media. Restricting 230 protections can have far-reaching consequences and introduce legal uncertainty into the important work of software developers, startups and platforms that provide them with the tools to realize their vision. As policymakers contemplate new boundaries of intermediary liability, it is essential to put developers at the center of decisions that will shape the future of the internet.

Software developers contribute significantly to the economic competitiveness and innovation of the United States and are key stakeholders in platform policy. GitHub has 17 million developers in the United States on our platform – more than in any other country. Their open source contributions alone add more than $100 billion annually to the US economy.

These developers maintain the invisible but essential software infrastructure that powers our daily lives. Almost all software – 97% – contains open source components, often developed and maintained on GitHub.

As chief legal officer at GitHub – a global community of over 100 million software developers collaborating on code – I know firsthand the importance of keeping 230 intact. While GitHub is far from a typical social media platform, it relies on 230 protections both to host third-party content and to moderate content in good faith.

That’s especially important when a platform hosts more than 330 million software repositories. GitHub has been able to grow while maintaining the health of the platform thanks to intermediary liability protection. GitHub takes a robust, developer-first approach to content moderation to keep our platform safe, healthy and inclusive, while adapting that approach to the unique environment of code collaboration, where the removal of a single project can have significant downstream effects for thousands of other software projects.

When it comes to the details of the Gonzalez v. Google case, which asks the court to consider whether Section 230 liability protections should extend to third-party content recommended by algorithms, a ruling in favor of the petitioners could have unintended consequences for developers. Recommendation algorithms are used in software development in numerous ways that differ from common social media platforms.

GitHub’s contributions to Microsoft’s amicus brief in the case outline our concerns: algorithm-powered recommendations on GitHub are used to connect users with similar interests, help them find relevant software projects, and even recommend ways to improve code and fix software vulnerabilities. One example is GitHub’s CodeQL, a semantic code analysis engine that helps developers discover vulnerabilities and bugs in open source code.

Developers use GitHub to maintain open source projects that use algorithmic recommendations to block hate speech and remove malicious code. A court decision to limit 230 to exclude protection for recommendation algorithms could quickly ensnare a variety of socially valuable services, including tools that maintain the quality and security of the software supply chain.

A ruling in Gonzalez v. Google that narrows protections, though aimed at social media platforms, has the potential to impact a much wider community. Leading up to the court hearing the case, a large number of amicus briefs emphasized the far-reaching implications: from nonprofits (the Wikimedia Foundation) to community content moderation (Reddit and Reddit moderators) to small businesses and startups (Engine).

While calls to shrink 230 are primarily aimed at keeping Big Tech in check, doing so would inadvertently curb competition and innovation while creating additional barriers to entry for the next generation of developers and emerging providers.

These concerns are not exaggerated: In “How Law Made Silicon Valley,” Anupam Chander examines how the US legal system created favorable conditions for internet entrepreneurship, in contrast to Europe, where “concern about copyright violations and strict privacy protections hampered internet start-ups,” and Asia, where “Asian web companies faced not only copyright and privacy restrictions, but also strict liability rules for intermediaries.”

Shrinking 230 would not only hurt the United States’ global competitiveness; it would hinder technological progress within the US. While GitHub has come a long way since our startup days, we strive to level the playing field so that anyone, anywhere, can be a developer.

As we await the court’s decision in Gonzalez v. Google, it’s important to note that regardless of the outcome of the case, there will certainly be more efforts to shrink 230, whether they focus on algorithmic recommendations, AI or other innovations. While these new technologies raise important questions about the future of intermediary liability, policymakers should strive to chart a way forward that creates a legal environment supporting the developers, startups, small businesses, and nonprofits that power so many socially beneficial parts of the internet.

Policymakers concerned about reducing harmful content can look to how developers are taking the lead on content moderation. Developers use GitHub to build valuable software projects, including open source content moderation algorithms that reflect policymakers’ calls for platform algorithmic transparency in bills such as the Algorithmic Accountability Act of 2022 and the Algorithmic Justice and Online Platform Transparency Act.

Platforms including Twitter, Bumble, and Wikimedia have used GitHub to share the source code for algorithms that flag misinformation, filter obscene images, and block spam, respectively. Open source drives innovation in content moderation while providing new models for community participation, oversight, and transparency.

As we encounter new frontiers in intermediary liability, policymakers must recognize the critical role of developers and work to support – not stifle – innovation.
