Panel Subject 2019: Platform liability for user content and commerce

Latest revision as of 18:17, 29 March 2019


This page is for coordination among the panel team, to openly discuss the topics that will be covered under the subject of "Platform liability for user content and commerce." The page includes the relevant survey results, the Panel Guidelines, and a section for the panel team to discuss in the comments.

Important Links

  • Collaborative Planning Document: https://docs.google.com/document/d/1108s-52tzz0YzPMhyDAEX8T-bBxKRqTqYYg0Q-S_zaY/edit?usp=sharing
  • Panel Team Committees: https://docs.google.com/spreadsheets/d/1rijGm_FWijsXaFcCtSTk7JRYg7dOl8_Q-JJFtHpKp38/edit#gid=396576269
  • Join the Team

Panel Guidelines

Panel teams should use this process to discuss the panel and get as close as possible to consensus on the following items by April 17.

  • Decide on a concrete subject for the panel based upon discussion and rough consensus. The subject and process should take into account:
    • The IGF-USA Principles
    • The working title, working description, and related submissions (see below)
  • Assign team leader(s) and a representative to interface with the steering committee and provide ongoing, up-to-date information to the wrangler and secretariat.

Survey Topic

Working Title: The key to thriving content platforms

Working Description: This session will educate participants on a threatened component of existing communications law that has enabled user-generated content for the last two decades. Today, platforms and infrastructure providers are not held liable for the content created or posted by users of their services; the user is responsible for their own content. Section 230 of the Communications Decency Act is under attack from draft legislation and trade policies. Attend this session to understand why, and what you can do to stop new platform liability, which would make it too risky and expensive for any but the very largest online platforms to host any user-generated news, views, or commerce.

Related Submissions


Submission | Timestamp | Topic | Issue Area | Stakeholder Group

2019 Submission 14 | 2019-02-04 9:16:07 AM | Content moderation | Content Policy | Civil Society / Academia
Comments: Laws passed recently in Europe (such as https://law.yale.edu/mfia/case-disclosed/germanys-netzdg-and-threat-online-free-speech) require more aggressive enforcement of moderation policies on social media and other internet platforms. To what extent can platforms preempt this type of regulation with better enforcement? What are the dangers to freedom of expression? In what ways do objectionable content and overmoderation affect historically disadvantaged groups?

2019 Submission 16 | 2019-02-04 11:11:48 AM | Fighting fake and deceptive content, especially on social media | Content Policy | Government / Intergovernmental Organization

2019 Submission 28 | 2019-02-05 5:59:19 AM | Platform responsibility | Content Policy | Civil Society / Academia

2019 Submission 29 | 2019-02-05 6:18:28 PM | Weighing costs/benefits of regulation, privacy, antitrust, Section 230 | Content Policy | Civil Society / Academia

2019 Submission 33 | 2019-02-06 5:42:17 PM | Regulation and self-regulation of social media platforms | Content Policy | Civil Society / Academia
Comments: This panel would look at Facebook's proposal for an advisory council (self-regulation), Article 19's proposal for Social Media Councils, and proposals and efforts in the EU and US to regulate content. What are the opportunities and pitfalls associated with these efforts, and what would they mean for the future of free expression, journalism, and democracy?

2019 Submission 38 | 2019-02-07 12:50:38 PM | Child sexual abuse prevention versus sexual speech: can we have both freedom of expression and action against child exploitation? | Content Policy | Civil Society / Academia

2019 Submission 40 | 2019-02-07 4:46:34 PM | Disinformation, misinformation, and fake information, and their effect on trust in the online world, both personal and economic | Content Policy | Private Sector

2019 Submission 61 | 2019-02-11 1:49:54 PM | Free Speech or Hate Speech: Should Online Due Diligence Change? | Content Policy | Private Sector
Comments: We recently submitted the following for consideration at RightsCon, and it may be relevant for IGF-USA as well: New hate groups are appearing in rapid, successive fashion. When the public is outraged at you as a provider, how do you determine whether a new group on your network is a hate group? How much research do you do as a provider to determine whether they have violated your terms of service agreement yet? This session will discuss difficult real and possible scenarios. We'll show how Internet companies can set clear and open criteria around acceptable usage.

2019 Submission 73 | 2019-02-11 8:21:08 PM | Platform liability for user content | Content Policy | Private Sector
Comments: Limits on platform liability are the backbone of Web 2.0, since they enabled user-created content. Today, these liability limits are under threat from multiple business and political interests that want to turn back the clock on the internet economy and reestablish their control over information publication. Debates on platform liability will help educate attendees and viewers on what has enabled massive growth in user-created content online over the last two decades.

2019 Submission 74 | 2019-02-11 8:25:53 PM | Moderation of user content on social media platforms | Content Policy | None / Other
Comments: As pressure has increased on social media to improve content moderation on their platforms, critics have argued that these efforts lack transparency and are too strict. A discussion of how social media platforms should balance these opposing pressures is key to educating the public, government, and civil society on the intricacies and challenges facing platforms that work to responsibly moderate content.