Trending now: Scenarios from the future of Fake News/transcript
>> COURTNEY RADSCH: Thank you so much. Welcome to the IGF USA discussion. My name is Courtney Radsch and I'm very excited to have an excellent panel here today to discuss what is probably one of the most interesting questions facing the Internet now. We have Angie Drobnic Holan, who is the editor of PolitiFact, and Jessica Leinward, who is on the strategic response team at Facebook. Next is Danielle Coffey, and last but not least, our token man on the panel, Jeff Jarvis. I don't know if we have the scenarios on the table, but we wanted to talk and think about what the future of the Internet will look like based on how we are responding to the so-called challenge of Fake News. I want to start at the outset with the recognition that this term is problematic and means lots of different things. When I turn to the panelists shortly, I want them to tell us a bit about what specifically they are working on with respect to Fake News, what they mean by that, and how they are defining and addressing it. Are they talking about disinformation campaigns? Are they talking about satire? Are they talking about addressing news that political leaders don't like, et cetera? Because we believe, in constructing this session, that how the world responds to the controversies over misinformation, computational propaganda and the scourge of Fake News will in large part determine the future of the Internet, because these have economic, social, political and cultural dimensions, not to mention the impact on the fourth pillar of democracy. There is a scenario with some thoughts on what the future might look like. Are we headed towards a future that we might call Dystopia for Democracy, where there is a lack of trust and people are radicalized as they move into their own information bubbles? Journalists become unable to compete with Fake News because real news is just not as interesting or viral or able to take advantage of the economics of the platforms.
And what does that mean for public oversight of officials and transparency, et cetera.
In another scenario, one thing we want to consider is the role of Section 230 and these protections for intermediary liability and how they may facilitate the current environment. But what happens if we don't have that sort of protection? Could there be a convergence towards the mainstream, siphoning off nonmainstream positions, not allowing anything outside of the mainstream for fear of what might happen and the liability the platforms might have?
And then there is another scenario where maybe there is some sort of radical transparency at the platform level, in terms of both the transparency reporting that is already happening and also how algorithms result in certain decisions and what that is doing to information online: opening up the moderation decisions, data and advertising information, and reengineering the economic incentives and the signals those send.
So we are going to have a conversation now about what we are doing in this space. I want to ask each of you to briefly talk about what you and your organization are working on with respect to Fake News and tell us what you mean by Fake News, and we'll just go through that. Then we are going to have a conversation, and I want to encourage everyone in the audience as well as our remote moderator to raise your hand during the discussion.
First I do want to go through and talk about what everyone is doing. But then throughout the discussion, if you have a thought, a question, a comment, please raise your hand. I am going to try to integrate it as much as possible.
Let's start at the end so we don't make Jeff feel completely out of his element. Can you talk about what you are doing, especially with the News Integrity Initiative and some of the other projects you are working on, Jeff?
>> JEFF JARVIS: Sure. As we look at the scenarios, I was thinking that what I fear is fear itself. I am afraid we are headed towards a moral panic right now that's overblowing the sort of dangers we face. We have to be careful to look at real harm and real data and understand where we are. We have also been concentrating so much on the negative; I think it is important to also start paying attention to the positive, to quality. To give you a very unfair comparison, as you heard this morning, Facebook now has 15,000 and will soon have 20,000 people who are there to be, pardon me for this, the crap detectors. And we as a society are pushing them to do that. Meanwhile in the U.S. we have fewer than 30,000 journalists on newspapers. So what does that say about our societal allocation of resources?
We are paying so much attention to the bad stuff that it seems like it is taking over the world, but it isn't. There is some small number of bad actors motivated by troll politics, Russians who are manipulating the weaknesses of our systems, and so on. We started the News Integrity Initiative, which is mainly making grants in two areas. One is around trust, trying to change trust in the public, and two is around manipulation and misinformation. We have not announced this at all, but I will say it anyway. We saw a lot of efforts out there looking at signals of quality, of good and bad, such as Reporters Without Borders. The platforms and the ad community have said to us that's too confusing, too difficult to work with. So we have a new project, which is to bring together all these signals of quality and put them in one place in one form, so that the platforms, ad networks and agencies can use them, not as a whitelist or blacklist, but to better inform their decisions, which we hope will take money and attention away from the crap and start to put it towards the quality.
>> COURTNEY RADSCH: Danielle.
>> DANIELLE COFFEY: Thank you for having me, and this is a great panel. So you asked what is Fake News. We represent 2,000 newspapers across the country. We are not Fake News. We are the opposite. We like to think of ourselves as the real news. We employ publishers. We employ reporters. We have it in our DNA to make sure that everything we report on is fact based and verifiable. We adhere to codes of conduct, and then we go further in our newsrooms to make sure there are standards of care and ethics, that it is not a place where conflicts of interest exist, and that we have a check on each other.
You know, somebody asked us recently, when we were talking about the debate over what responsibilities folks should have, whether the platforms or otherwise, we were asked by one of our tech partners: what is it you want? Do you not want to have the responsibility that you have? And no, we can't not have this responsibility. It is so ingrained in everything that we do, in every article that we write and put out to the public on our pages, whether digital or print. We can't go back in time. That's not what we want. We want to continue with the quality and the care that people prize. We have our brands that we rely on. We were just talking about brands. If somebody doesn't want to read something that we put out there in a particular paper, they can go to another paper. But our loyal readers who stick with us, who are particular to that paper, trust that what we put out there is accurate and trustworthy, and we will lose that reader if it is not. That's the world that we live in. And we look to partner with those who would seek to reflect those values in where our content is distributed. So that's our solution.
>> COURTNEY RADSCH: Thank you. Jes from Facebook.
>> JESSICA LEINWARD: Hi, everyone. I am Jessica Leinward. I am a public policy manager at Facebook. At Facebook we are addressing misinformation in three ways. We have a set of global community standards that govern what is and is not allowed on Facebook, and if content violates those standards it is removed as soon as we become aware of it. We do see an overlap between misinformation and violations of our community standards, whether that's hate speech, bullying or credible threats of violence. These community standards are public, and I encourage you to visit them. They are quite detailed, and we do have a section on false news and how we define it and think about it. But we don't have a policy that requires people to only post true content on Facebook.
Another way of saying that is we don't remove content simply because it is false. What we have determined is the best balance between freedom of expression and safety for our users is to demote that content, to show it farther down in the news feed. We believe in letting folks post content even if it is false, but we don't believe that content should get the same distribution as other content. And we don't make these determinations ourselves. We rely on third party fact checking partners that are certified by an independent international body to comply with certain international standards, and we rely on their judgments to determine whether or not content should be demoted in the news feed.
The third part of the model is to inform. We want to inform our users and give members of our community context so they have greater certainty and can better understand what they should or should not trust on Facebook. I can think of two examples of how we do that. One is the article context feature, which surfaces basic information about an article or publisher in the news feed, and the other is related articles, a module where the fact checker's write-up gives more context.
>> COURTNEY RADSCH: We are going to do the intro and then questions. Please go ahead.
>> JESSICA LEINWARD: And the last thing I would mention: last week we did announce a policy that is in addition to our community standards and relates to misinformation. What that policy says is that we now have the authority to remove content that is false and that is reported to be contributing to imminent violence and physical harm, in places like Myanmar and Sri Lanka. There were posts that didn't violate our bullying policies, but they were false and they were contributing to violence on the ground. Having considered what our responsibility is as a platform, we made the determination that we could draw a line around imminent violence and that content. We will not be making those determinations ourselves. We are relying on partnerships with Civil Society Organizations on the ground to make determinations as to both falsity and imminent violence.
>> COURTNEY RADSCH: That's a perfect segue to Angie.
>> ANGIE DROBNIC HOLAN: I'm Angie Drobnic Holan. I'm the editor of PolitiFact. I have been with the site since its launch in 2007. We are a news organization that focuses on political fact checking. When we launched, our main focus was on fact checking candidates, elections, advocacy groups, super PACs. As time has gone on, more of our portfolio has been Fake News, which I define as hoaxes, conspiracies, fabricated content that seems to be intentionally false. There does seem to be some intention behind the false report; it is not someone flubbing their facts or getting things wrong. Because we have been around so long, we have about 13,000 fact checks in our database. A small percentage of these, well, maybe not small, but a minority, are Fake News Internet stuff. Most of our fact checks are people like Barack Obama, Donald Trump, Chuck Schumer, political ads. We have also been meeting over the last five years with international fact checkers.
There has been a global fact checking movement that comes not only out of journalism but also out of Civil Society activism. The common mission is to establish a set of baseline facts around which public discussion can occur. Because when you have disinformation in a society, it makes any sort of public policy discussion very difficult: you first have to hash out what the baseline facts are before you can get to policy solutions and proposals, that sort of thing.
With this international network we formed the International Fact-Checking Network, and we came up with a code of principles. PolitiFact was one of the founding members, along with fact checkers in South Africa, the United Kingdom, Argentina and India. We came up with a common set of principles that we felt we could sign on to, involving nonpartisanship, transparency of sourcing and finances, and a commitment to fact checking all sides of public debate. Some fact checkers operating in authoritarian regimes can't meet all the standards for security reasons. They are still part of our group, but they don't sign on to this code of principles.
Shortly after we launched the code of principles, Facebook contacted us and asked if the fact checkers who were signatories would work with Facebook to check the news. Facebook makes it digital for us: we see false reports that are flagged by users, we fact check those reports, and then we link our reports back into the database. Facebook demotes the particular content that we fact check, and the data also allows Facebook to teach its platform to detect Fake News before it is even flagged. I have to say, we have been fact checking false news reports for ten years now; we started with chain emails, and this program is particularly sophisticated. I'm very optimistic that this type of machine learning can be used in other parts of the Internet. Because PolitiFact is a news organization in the United States, we are believers in the First Amendment. We are not interested in suppressing any sort of diversity of ideas, but our mission is to get accurate, verified information in front of everyday people so that they can govern themselves in a democracy.
>> COURTNEY RADSCH: Great. You might wonder why someone from the Committee to Protect Journalists is moderating a panel about Fake News. The reason is that at CPJ we have seen how Fake News has a very real impact on journalists around the world. CPJ's mission is to protect the right of journalists to report the news and to ensure the free flow of information. Our most recent prison census showed that three times as many countries were imprisoning journalists on Fake News charges. And just last week we reported on Egypt imprisoning even more journalists on Fake News charges. Fake News is being used as an excuse to censor media and journalistic content online.
We are also looking at the responsibility of platforms: as private companies, they have an interest in upholding rights. That's one of the reasons we cofounded the Global Network Initiative, which Facebook is part of, where we can have discussions around this. And maybe just to start off the conversation: we have really seen the weaponization of Fake News in the United States, but I think one of the key issues for a conversation at the Internet Governance Forum in the USA is how the steps that we're taking in the United States to address Fake News can have ramifications around the world, both in terms of initiatives like the global fact checking network, which is great (we just had a conference in Rome, if I recall correctly), but also in terms of the implications for the platforms and the types of standards they develop, the tools they develop, et cetera.
So just to kick off the conversation, and I will be coming to you as soon as we get into the conversation, please raise your hands, and we will have time at the end for additional comments and perspectives and questions that we may not address here. But first, what I am hearing here is a lot of public/private partnerships: Facebook is partnering with PolitiFact and with this international fact checking network; in Indonesia there is a fact checking effort that the Indonesian Government is somehow involved in; Vietnam has requested to have more say in what content is allowed online. How do you work on these partnerships? How do you develop them? How do you decide how to have these partnerships to deal with Fake News without getting co-opted by authorities or entities that would rather stanch criticism of them, for example? All of you.
>> So I can speak to fact checking and maybe to the trusted partners we are looking into for purposes of our new policy that focuses on imminent violence. On fact checking, as we were discussing, we look to the International Fact-Checking Network because our policy applies globally, so the standards for partners in this space apply globally, and that certification, and more importantly the principles that underlie the certification, are very important in terms of who we partner with and how we make those determinations. That's kind of the gold standard for what we look at in a fact checking partner. We were discussing before the panel that, of course, there is an international dimension: we have scaled the fact checking program to 17 countries thus far and we are trying to expand further. But every country has a different set of circumstances, political, social, economic, that we have to grapple with and think about before we move into that country, including the protection of fact checkers and journalists on the ground and what the implications may be for a fact checking partnership there.
In terms of trusted partners, we are still in the process of implementing the policy, and it will be a gradual process because each country we are looking to expand to will have a different set of circumstances to contend with. But we are initially focused on Myanmar, Sri Lanka, India, Cameroon and the Central African Republic, and we are looking at Civil Society Organizations that are independent, that we have worked with before and have a good relationship with, and that have no history of spreading disinformation, things of that nature. Those kinds of criteria.
>> COURTNEY RADSCH: Speaking of credibility, credibility has signals. And I know that, Jeff, as part of the initiative you are working on, you are looking at signals as a way to combat misinformation. What are some of those signals?
>> JEFF JARVIS: There are various sets of signals. There are initiatives now, the Trust Project in the U.S. and Reporters Without Borders in Europe, that are working to get news organizations to subscribe to certain standards, like labeling opinion versus news. Those are self reported signals. There are also behavioral signals: if a news site fails 50% of fact checks, that says a lot about them. There is endorsement: if you belong to an organization with high standards, that's a signal. There is a review signal, which is very important; my project will be hiring a company called Trust Metrics to look at thousands of sites against certain criteria. There is diversity as a signal: make sure that whatever pool you end up with has diverse viewpoints. And there is public opinion; it was falsely reported that Facebook is looking only at user-rated trusted news as a first effort to try to identify news. If that's all you use, that wouldn't be very good, but it is a good signal.
The Reuters Institute for the Study of Journalism at Oxford does very good work here. And then finally there are alerts about misinformation campaigns from organizations like Data & Society and people who are embedded with the bad actors. This is content that is false in a way designed to manipulate the system for trust, as with a lot of the Russian misinformation. And that gets to a whole different level of trying to figure this stuff out.
That goes to the elephant in the room, which is (inaudible), where just now today I think Facebook took down four videos and YouTube is taking down videos. So there is also a question of where the line is. To be sympathetic to Facebook's difficult position, its standard was imminent violence, and in my own opinion it does not go far enough. But if you really try to define harm, it is extremely difficult. There are a whole bunch of possible standards. Truth is a horrible standard; you don't want to become the decider of truth. Brand safety is a business standard, and so on and so on. And then there is plain manipulation by actors who are trying to game the systems, and the platforms need to get to the point where they have the right and responsibility to counteract that manipulation. Try to write that standard; it is a really hard line to write. All we are doing is gathering quality signals to enable the platforms to use them, along with whatever data they have, to make better judgments. But that's just one small step on a very complex, nuanced path.
>> COURTNEY RADSCH: Okay. Go ahead. Go ahead.
>> Do I need a microphone? You have to wait for the microphone because of the stream.
>> (Off microphone).
>> Not coming through.
>> COURTNEY RADSCH: While we're figuring out the technical issues, Jeff, why don't you come back
>> All right. It is not really that important of a question.
>> Here we go. Does this work? Okay. Great. So I have been thinking a lot about how your traditional newspaper has some signals embedded in it. We got used to the fact that there is a masthead, publishers and editors, who wrote a story and the date when they wrote it. We have certain assumptions about the sourcing; they try to indicate when it is a direct quote, et cetera, et cetera. Then when you see news or information on the Internet, it has been removed from all that context, and it feels, from some of the things that you have all said, like we are kind of reinventing the wheel a little bit. I can understand that that traditional signalling, user interface, whatever you want to call it, may lack some things like diversity. But it also seems weird that we are reinventing the wheel, especially when so many of us have those traditional practices embedded in our heads. And part of what has been unfortunate on the Internet and social media is that the trust we have grown to have in the mainstream news media has transferred to those other sites.
We assume that if it has a certain font and a certain look, it follows certain protocols, and so the mainstream news media has come down in credibility and those other sites have come up more than they should. I am wondering if there is some way to restore that difference. Instead of focusing on the individual news item and whether it is false, or giving people a tally based on how many false items they put up, couldn't we get back to some of that transparency and context, what should a publisher provide, and then whitelist some of those entities? I hope that wasn't too convoluted.
>> COURTNEY RADSCH: Do you want to address that?
>> JEFF JARVIS: You should look up the Trust Project. That's pretty much what they have done. One point is that I think we are getting better in the news industry, but we still need to get better yet. Some places label opinion versus fact even through the typesetting, justified right versus ragged right. People will figure this out. So we can do a lot better job of labeling what we have. No. 2: ask somebody at Google what they would want from news, and they would say that we endorse each other. If we linked to each other, it would be a tremendous help to the platforms; we have a very bad culture of linking in the news media. Third, in the news business we push Facebook and Google and such to promote our brands, but people don't know the brands. A study out two weeks ago said that when the same news item was associated with a brand, trust went down.
>> COURTNEY RADSCH: And related to that was a Pew Research Center study that showed that the majority of people who said they get their news online could only name the platform but not the brand. I think your question actually points to a key issue that we are not necessarily going to delve into here, which is information literacy and ensuring that people who are getting news online understand the signals. And Danielle, I'm going to go to you in a second.
>> DANIELLE COFFEY: That comment you made, I thought that was great: all that metadata and information that we have ingrained in what we do in the analog world could be transitioned to the digital world. I would take that one step further and say that all these signals could be used in the algorithm, and here's my pitch to you, Jes: use them in the algorithm to uplift credible, trustworthy, quality content, premium news content from people who hire reporters. It is not a hard thing to define what news is; we are already doing it for other reasons. Why not use that, and then use the algorithm to uplift that quality content? It would be a win-win because you are sustaining a quality news industry. So it is something that we would highly support if it were considered. And I love that comment.
>> COURTNEY RADSCH: Let me ask the panel to look at scenario 2 and how signals could be used to weed out untrustworthy, poor quality content. Scenario 2 is this idea that only mainstream views are able to get out there. If it is only mainstream media, what happens to left-wing, right-wing, nonmainstream points of view, media outlets, citizen journalists? A lot of what we are talking about here is signals. How do those apply globally, right? What does that do to an independent news outlet that's struggling to survive online in Egypt? If we are talking about signals and about linking back, I don't think you are really going to get, say, the New York Times to link to a citizen journalist who is reporting on the ground. So how do we prevent a scenario 2 where only a handful of news organizations are able to survive on the platforms? And I wanted to bring in a recent article from Vice about shadow banning of conservatives. There have been accusations against multiple platforms about the visibility of conservative voices
>> JEFF JARVIS: That's an example of what we are trying to fight here. The President of the United States, somewhere near us, Tweeted that Twitter was shadow banning conservatives. Thousands of people were affected by a bug, and Vice was wrong.
>> COURTNEY RADSCH: But we do know that Twitter has removed conservative journalists who were propagating either
>> No. (Off microphone).
>> JEFF JARVIS: Yeah. Be careful here.
>> COURTNEY RADSCH: So then it is not Fake News, right? It is a report that is then debunked. How do you deal with this? Let's bring this to PolitiFact; this is what you do. You get an initial report, especially a journalistic report, and then maybe you get more information, the companies open up their data, and it turns out not to be true. How do you handle that?
>> ANGIE DROBNIC HOLAN: The confusion we are having at this particular moment in time goes to the fact that the platforms have so much power to curate content. People on Facebook think they are getting information from their family and friends, so there is a lot of trust, which goes to the situation the audience member was talking about: it is very easy to put together a website that looks professional and legit. I am not particularly worried about extreme views being pushed to the margins, this kind of Dystopian scenario where people who have unpopular opinions are not allowed to speak. What worries me more is what has disrupted all the legacy news organizations: the business model issues.
I see some problems with the audience itself, where they see that they can get breaking news so quickly and easily and cheaply just by looking at the phone that they don't even know where the information comes from. And the news organizations that put money and resources into reporting aren't being supported the way they were pre Internet. So there is a huge business model problem, and I think that also goes to the signals problem; there is a connection there. Because before the Internet, people who didn't have a certain sort of cultural cachet behind them wouldn't have the resources to publish their views.
Now on the Internet anybody can publish at any time, and I don't see that going away. Even groups that have been banned have left Twitter and formed alternative social networks, where they are talking on these alternative social sites. The argument to me is not about censorship. It is more like: my voice is here on the Internet, but it is not loud enough, and I'm going to use my political influence to address that situation.
>> JEFF JARVIS: Yeah, a member of Congress complained that Gateway Pundit was not getting enough traffic from Facebook.
>> COURTNEY RADSCH: In Azerbaijan, one of the only independent television broadcasters, which broadcasts solely on YouTube, was taken offline. In Egypt, one of the independent reporters had several of his social media accounts closed down. So, you know, there has been a lot of scrutiny about claims here, but we are seeing a real impact. Is it Brian?
>> I want to second what this lady with the beautiful necklace said. It is not real journalists who are complaining. They are masquerading as journalists and spreading this bullshit and all the worst kinds of content that plays to the human id. These platforms have every right, by statute and under the First Amendment, to take that stuff down. That's what Section 230 was intended to do. And Republicans have managed to somersault and completely reverse their positions, having spent six decades saying we should never have a Fairness Doctrine, that Government should never be in a position to decide what media is fair, fair to conservatives. Now, as Jeff said, it is: how dare they do anything to deprioritize content from sites like Gateway Pundit that spread rumors and falsehoods and other forms of unadulterated bullshit. I testified about this before the House Judiciary Committee, and not one Republican on the entire Committee was willing to acknowledge that none of this is anything the Government can legally touch under the First Amendment. They had to hear it from Democrats; Ted Lieu said it beautifully: it was a stupid hearing, and the platforms have every right to go after this problem. People are masquerading as journalists and taking advantage of that. The conservative journalists claiming harm are not journalists, and these sites have no obligation to treat them as such.
>> Can I take this one? As you were testifying, our CEO David and I were behind you in every picture during that hearing. But during that discussion they actually made a good point. They said, okay, if this was your message that was being suppressed and you didn't know why, you would feel the same way. I actually thought that was a good point, because I think that is the discomfort: it is the conservatives screaming this time, but whoever's message is being taken down, they don't know when or why, and that lack of transparency in the algorithm just makes people uncomfortable. I think it is a natural thing. The platforms are trying to be as transparent as possible, but when somebody else has control of your speech, that is where the tension comes in.
>> (Off microphone).
>> (Off microphone). They are being censored. The platforms could do more. There is a level of human stupidity where you can only do so much and people will still feel persecuted. And when their ad is blocked or they don't do as well as they think they should, it will lead them to think they are being persecuted.
>> I want to take your point and go down the row. Given this environment, does fact checking have any impact? Given what we are talking about with confirmation bias and political views, and what the studies are showing, does fact checking matter?
>> It does have an impact. It is not a cure-all or a complete solution, but it makes the real-world, on-the-ground situation incrementally better, and I think it needs to spread.
>> I agree. I think fact checking is making a difference; it is making a difference for us. As I mentioned, we have scaled the program to 17 countries. We are also trying to be more transparent: we notify users when they post something that has been fact checked. There is more work to be done, and we are always looking to improve, but as a general matter I am very supportive of the program.
>> Yeah, that's in our DNA. That's what we do. We just wish it were rewarded a little bit more, given that it is such a big part of what we do. We are a great partner in the fact checking.
>> But to press you on that: you are saying, and I agree, that getting the facts right is the DNA of journalism, but we are seeing the challenges of competing in the current digital space. So is it having an impact? Or are you seeing that your constituents have to adapt to this new model, which could mean headlines trying to compete with Fake News?
>> Two things on that real quick. The marketplace of ideas is premised on the idea that the more information you have out there, the more it will correct the misinformation, and organizations like (inaudible) are trying to do that, to weed out the bad. From our perspective, weeding it out has got to be difficult for the platforms, because you are financially incentivized to have whatever is more attractive and gets more clicks; traffic is probably how it works. You would have to significantly tweak your algorithm to not want traffic. And that
(Talking at the same time).
>> COURTNEY RADSCH: But just focusing on whether fact checking has an impact in this environment.
>> Oh, yeah, of course.
>> JEFF JARVIS: Of course, everyone does fact checking. It is important; I say that as a journalism professor. I recommend to you a talk danah boyd, the Data & Society founder, gave, in which she said, one, that we taught our youth to question everything, and, two, that we are really in a war of alternative facts. We are in a position where we need to look at the question of civility in society: people are not open to other viewpoints, or even to facts, if they are being uncivil to each other. We have to do something about that. And then there is this: I am liberal and I am in media. I think we in liberal media didn't admit that, and didn't recognize that a large part of the country was underserved, and we left a void. So it is about that, too. We have got to have higher quality news from a conservative viewpoint; we need more diversity in that, too. Fact checking is magnificent, but it presumes that we have solved higher level problems.
>> Just two things to follow up. At PolitiFact we see ourselves as independent; I am continually carving out that independent space. And one last thought: we are going through a huge economic model change. The platforms, I think unintentionally, received the role that newspapers had traditionally. They are gatekeepers, and they are reaping the revenues that the newspapers used to receive. They have got that role whether they want it or not. And now I feel like we are starting to have more momentum on these problems.
>> In my opinion, and maybe I am playing devil's advocate, I question whether it is really having any impact. Facebook tried something where it would mark a story as disputed, and what did it result in? More sharing. And we heard you have 13,000 fact checks in your database, but YouTube gets, what is it, 400 million videos uploaded every day or every week; I'm sorry, I don't have the stats in front of me. And Facebook has how many billion pieces of content? So I think you could also see a perspective where it is having a minimal impact.
>> So just to pick up on that point about the disputed flags: we did see that was having some unintended consequences. More information is better than less. There is more work to be done on the research side, but I do think giving people more context is having a positive effect.
>> COURTNEY RADSCH: What are users doing with those articles?
>> I actually don't have the data to speak to that, but I do know there are positive effects from seeing that context in connection with those related articles in the News Feed, and that is something we rely on our fact checkers for.
>> JEFF JARVIS: Just to give some context, remember that news agencies haven't always been truthful themselves; there was an era when a single news outlet could control the information in a community and adjust it as it saw fit. I do have two questions that have come up a bit, partly with respect to the earlier reference to Vice. First, what is the role of news media at large in resisting the rush for scoops and breaking news, and taking some time to verify information before publishing it? And second, on the point about the volume of content: how does PolitiFact figure out which issues and articles to take on and analyze, as opposed to all the others, when you don't have the resources for everything?
>> COURTNEY RADSCH: Since those are new questions, I am going to wait until the end, and we will learn much more about journalistic practices then. I want to follow up, because all of you raised the issue of the economic incentives. My last question was whether fact checking has an impact; y'all said yes, and you pointed to the economic incentives. So do you think we can address this issue of Fake News without addressing the economic incentives? And if not, what should we be doing about those?
>> JEFF JARVIS: So I want to set a broad frame; this is my life. We blame the platforms for seeking attention, but the truth is that in media we built that business model. With Donald Trump, you had Les Moonves and Jeff Zucker saying he would be good for their business. An attention-based, volume-based business will lead you that way, and with high competition you get more speed, too. I'm not going to get rid of advertising; I'm not going to get rid of impressions, because that is what pays the journalists. But I push publishers hard to go into other business models as well, around commerce, membership, events, services, and lots of other areas where you have a relationship with the public, a deeper relationship, not just fly-bys.
>> I would respectfully disagree, and I think this answers your question, too. What sets us apart from other entities that report information and look for attention-grabbing content is what I said at the beginning: our commitment to accuracy, to verifiable facts, to journalism. Whether you like what we are putting out there or not, we made that commitment, and we follow through on it by taking down misinformation. We correct ourselves. That is where we as journalists, going to your question, and the newspaper industry set ourselves apart: we make those corrections if we find that, in the interest of seeking attention, we were wrong.
>> JEFF JARVIS: I applaud all of that. However, let's at least be honest with ourselves about what the business model drives us to do. We put our own ads on, and we don't use journalistic resources to the best end because of our business model. The business model forces us to do more pages and more pages. That's where we are now. We have to find ways to release ourselves from that prison.
>> COURTNEY RADSCH: Danielle, isn't your group talking to Facebook or other platforms about more revenue sharing on some of these issues?
>> DANIELLE COFFEY: Oh, thank you.
>> DANIELLE COFFEY: That question was not planted, but I will take it. What we are pushing for in Congress is a safe harbor from the antitrust laws so that we can collectively negotiate with the platforms and, frankly, get a better deal on economic terms. Right now even having that conversation has chilling effects, because of the publishers' case: when publishers tried to negotiate with Apple to get a better deal, because of what was going on with Amazon, they ended up with an adverse ruling from the DOJ for having that conversation. And it would not just get us a better deal; being paid for the quality of the content we put out there would help clean up the ecosystem and potentially solve a lot of the problems we're talking about.
>> I think the PolitiFact story speaks to some of this. PolitiFact started inside the newspaper, as a division of it, and we got bigger and bigger, and we were spun off within the Tampa Bay Times. Now we are under the Poynter Institute, a non-profit newsroom, and we have those diverse revenue streams Jeff is talking about: online ads, a membership program, service contracts, and some grants. I definitely get the sense from my fellow newspaper and online journalists that we are feeling our way toward a different future, and it feels like we are mid-process right now.
>> COURTNEY RADSCH: From an (inaudible) perspective, I know you are supporting a lot of news organizations and news initiatives financially and through partnerships. Where does the issue of revenue sharing, and the ability of news organizations to compete in this space, fit?
>> I am not the best person to speak on that, but I believe we very much started this effort with, and continue to focus on, disrupting the financial incentives for distributing misinformation and false news. That very much motivates why we decided to demote sensational headlines and why we focused on links in our fact checking program. What you very often see is a clickbait link: you click on it and all those advertisements pop up. You think you are looking at Meghan Markle, and then you are in that environment, and that's terrible.
>> COURTNEY RADSCH: The founder of Craigslist got up at a journalism conference earlier this year and said, I broke journalism in America because I devastated the advertising industry, and that is why he is now investing in and supporting news.
>> Hi. Danielle, I wanted to push back a little on what you said about journalists and newspapers correcting themselves and updating articles to make sure they stay true to fact. From my personal experience, even recently with a Washington Post article about the Koch brothers, corrections are made, but they are often not that clear; if people have already taken the story, or rewritten it, as Jeff says, those updates are not passed along, and the correction sits at the end of the article. Compare that to how YouTube, for example, flags things: as soon as you click on a Russia Today or even a BBC video, it says this is a state-sponsored media outlet. Before you have seen the content, you know who it is coming from; flagging an issue inside a story is slightly different from that. I feel that, while obviously a lot of journalists have their ethics, journalism and news writing is a business, and those business incentives do not like corrections: putting "correction" in the title or subtitle undermines people's willingness to click, because it looks less trustworthy. So if antitrust is reformed so newspapers can negotiate with social media platforms, I would worry, because of those business interests, about what takes priority: journalistic ethics, or the ability to make people click on a news story even though it might need a lot of corrections. That would concern me as a user. I feel a platform is a better place to impartially moderate that than newspapers deciding based on what they think is profitable for them.
>> Do you want to respond?
>> On the first one, about corrections, that is definitely food for thought. On the second, about the ability to collectively negotiate and get a better deal: think of the investigations that go into a long story, or covering a town hall meeting that nobody else is going to attend, or covering the Friday night game in the local community. I don't think journalism at the local level is going to be able to sustain itself if we don't reinvest in it.
>> It is interesting: for journalists and news outlets that provide reporting, having to issue a correction is hard. I think one of the things we maybe don't talk enough about in this debate is norms. How could norms be deployed to help combat the issue of Fake News online? One of the most embarrassing things I have ever done online was inadvertently sharing an article from a satirical news site, not realizing it was satirical, having my friends tell me, and realizing with horror that I had shared Fake News. And this was before Fake News was a term. Do you think there is a role for norms? How can we deploy them in addition to community standards, regulation, whatever it is?
>> I absolutely think there is a role for norms. Part of where we are currently is that we are establishing new norms for the Internet. I don't think of myself as particularly old, but I remember before the Internet: at my first job we did write our stories on computers, but then we would take a disk and bring it to our editors, and when the Internet came out we would go to the Internet machine. We feel like this technology has always been with us, but it is actually brand spanking new. On the scale of the invention of the printing press, it is that big of a change in society. We definitely are in this moment figuring out new norms, and we might all be dead by the time the new norms are established. It is that kind of moment of transition.
>> JEFF JARVIS: (Inaudible) and it took 150 years after the invention of the printing press to invent the newspaper. This is very early. We are indeed negotiating norms. If I go back to my days as a reporter, if I made a mistake I wanted to hide under my desk. A new norm of openly owning your mistakes changes how we operate. And technologically, I would wish for that moment when you share something and say, oh, shit, I got that wrong, that there was a structure at Twitter that would let you blast out: so sorry, I made a mistake. We can encourage those behaviors through how we design the system.
>> COURTNEY RADSCH: Okay. Do we have anyone from the audience? Do we want to address this issue of journalistic quality? You had two questions earlier, but do you feel they were addressed?
>> (Off microphone).
>> ANGIE DROBNIC HOLAN: Right. We get this question all the time: how do we choose which facts to check? At PolitiFact we follow the day's headlines and look for statements in the story of the day that people might wonder about: is that true? We get a lot of reader reaction, because of the way our ratings work, with scorecards for individuals, and there are often claims of unfairness and bias. We have to remind readers we are not social scientists; we pick statements to check based on the attention they are getting. In the Facebook tool there is a measure of popularity, of which hoaxes are gaining more traction than others, and we can see if something seems about to spike. We look for things that a significant part of the audience has heard of, or may be about to hear of.
>> On this issue, I want to link these two questions: how you choose, and the YouTube labeling you mentioned. There was an article, based on research looking at how those labels were applied in about 30 countries, examining what they look like inside and outside the U.S. What it found was that a bunch of public and state-owned news organizations on YouTube did not have that label, while others did. What concerned us about that labeling is that even media scholars cannot always find out who owns a news organization. What are the public criteria being used to label something state-owned, where does that information come from, and what signal does the label send? So, do you have a follow up question?
>> Just to build off the mechanism you use to identify which areas and articles you are going to investigate further. Jeff talked about the importance of the marketplace of ideas. If I go to almost any major news organization, Donald Trump is going to be out there; the top stories are going to involve President Trump. And I think this gives an impression of bias, because all of you are spending all your time focusing on President Trump when, in fact, he happens to be in the most popular stories. So is there a way to disabuse people of the notion that you are leaning one way or the other?
>> It is a great question, because it goes to popularity metrics. But that is not how we make decisions; we use our news judgment, and we consciously try to fact check a diversity of political figures. Democrats are out of power right now, but we try to keep up a steady stream of fact checks of Democrats. This is where human editorial judgment becomes so important, and I think it is always going to have a place on the Internet: humans making a decision, using ethical values to balance a news report. So it is not just what's most clickable. It is not just cat videos.
>> COURTNEY RADSCH: This is a great segue into politics. You are saying take the long view: we have new communications technologies, there has always been some version of Fake News, and we should see what happens in its early days. In the political sphere, and again I want to emphasize that what happens in the U.S. has global repercussions, we are seeing that democracy and electoral processes are being impacted by the Internet and by so-called Fake News. I raised, for example, the Philippines, where Duterte, when he ran, had Facebook doing some analysis for him. Say more about how Facebook has sent in advisors to work with campaigns. How is that related to the Fake News problem, which seems particularly prevalent during elections? We just heard, for example, that Facebook sent its first fact checker program outside of the U.S. and Europe to Mexico ahead of their elections, because there was a huge issue of Fake News during those elections. Can you talk more about that?
>> Mexico is a great example. As we think about scaling our fact checking program, we do consider whether there are upcoming elections, because that tends to be a situation where we have seen spikes in disinformation on Facebook, with potentially significant consequences. So in Mexico we partnered with fact checking organizations. Again, it is a delicate relationship; we defer to our fact checkers' judgment. We provide certain rating guidance, but we defer to what they choose to fact check, as we heard earlier. Prioritizing countries that are facing upcoming elections, making sure there are fact checkers on the ground and that they know they have our support, has gone a long way. We saw a lot of success in Mexico.
>> COURTNEY RADSCH: How is that linked to the support that Facebook is giving to individual candidates?
>> I can't speak to that. I don't know who to point you to on that. So I need to
>> COURTNEY RADSCH: Transparency. Even you guys need to understand how the organization works.
>> I don't know if I would say that. I would say that the teams are separate, and I would have to get back to you.
>> JEFF JARVIS: Long before this, the platforms would embed people with advertisers to put in creative help and improve their campaigns; it is part of how they sell advertising. If we had gone back four, five, six years and said, gee, Facebook and Twitter and Google are going to help political campaigns, across campaigns, any campaign, we would have said: oh, good idea. Now, however, the notion that Facebook was embedded in Trump's campaign sounds dastardly. But the times have changed so much that we have to change our minds about how this is done. It is a rule-set question: should the platforms give any aid to any political campaign or not? That is a valid discussion to have, but you have got to be willing to see that times have changed.
>> When we take the Philippines as an example: Duterte has deployed Fake News and trolling against a media outlet that uses Facebook, and specifically against its editor-in-chief, to delegitimize the news, and he must be using some of the techniques he got hold of during the consultation for his politics.
>> JEFF JARVIS: Here is the hard part. I can see Facebook saying, you know what, we are not going to help any political campaign anywhere, because that creates bias. But then you get down to a race somewhere where the candidates could be far better at talking to citizens on social media, and you have cut off an opportunity there. It is really hard to design these rule sets based on what I would call fringe cases, but the fringe cases are in charge of nations like the Philippines.
>> COURTNEY RADSCH: Yeah. We have about 15 minutes left. And I have a million other things I want to ask them, but I also want to open it up here to anyone who has already asked a question or made a comment or if there is anything else from the audience.
>> JEFF JARVIS: Or has the floor.
>> COURTNEY RADSCH: I ask you to keep it a little bit short.
>> Hi. This is Stefan. I want to come back to the question of fact checking and its effectiveness. How do you deal with the bullshit asymmetry principle, attributed to Alberto Brandolini, which states that it takes exponentially more effort to debunk bullshit than to create new bullshit? (Off microphone). How do you deal with that on a platform like Facebook? Can algorithms and AI help? You are always addressing it after the fact, which comes back to fact checking and corrections after the fact, which nobody ever reads.
>> COURTNEY RADSCH: There was a study in Mexico that they mentioned, which found fact checks travel 100 times slower than a news article.
>> I hate to sound pessimistic. So I am hoping that you can answer.
>> COURTNEY RADSCH: Sounds like that one was for Jes.
>> JESSICA LEINWAND: Yeah, I think it is for everyone, but time and latency are certainly an issue, particularly when dealing with virality. Explaining truth, disproving something, exactly as you said, takes time; being a credible source takes time and effort, and there are standards to adhere to. That is a reality we are grappling with, and we are trying to think about ways to complement our fact checking program. We are not there yet, and we are exploring different ways technology may help. But as was discussed this morning, when dealing with these issues of content, everything is so contextual that it is really difficult to rely on machine learning and algorithms. We are trying, though. We are trying to use, for example, the fact checking ratings we get to inform some of our machine learning and scale it, so that it is more effective and also broader. But again, it is a work in progress.
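[Editor's note] The scaling idea described here, reusing scarce human fact-checker verdicts to catch repeats automatically, can be made concrete. The sketch below is not Facebook's actual system; it is a minimal, hypothetical illustration of one common approach: flagging near-duplicate wordings of already-debunked claims by word overlap. All claims, names, and thresholds are invented for illustration.

```python
# A minimal sketch (not any platform's real pipeline) of scaling fact checks:
# once humans rate a claim false, machine matching can flag reworded copies.

def tokens(text: str) -> set[str]:
    """Lowercase word set for a crude bag-of-words comparison."""
    return set(text.lower().split())

def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two token sets: 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Claims already rated by human fact checkers (the slow, scarce resource).
# These examples are invented placeholders.
RATED_FALSE = [
    "miracle cure heals all diseases overnight",
    "celebrity secretly endorses unproven supplement",
]

def flag_for_review(post: str, threshold: float = 0.5) -> bool:
    """Return True if a new post closely matches an already-debunked claim."""
    post_tokens = tokens(post)
    return any(jaccard(post_tokens, tokens(claim)) >= threshold
               for claim in RATED_FALSE)

# A lightly reworded copy of a debunked claim is caught automatically,
# while an unrelated post passes through untouched.
print(flag_for_review("BREAKING miracle cure heals all diseases overnight"))  # True
print(flag_for_review("local team wins the friday night game"))              # False
```

Real systems use far richer signals (embeddings, image matching, user reports), but the design point is the same one raised on the panel: the human verdict is made once, and the machine spreads its effect.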
>> There was a study that got a lot of press recently, saying, oh, false news spreads faster on the Internet. When you look at the study, what it showed was that if something was fact checked as false, more people shared it than if something was fact checked and found to be true. To me that is kind of a no-duh finding, because I find a lot of people on the Internet are eager to promote contested information. The notion that fact checking doesn't matter, or that lies will always be faster: I don't see that the evidence really supports it. And I think it is kind of cynical; it is deployed in a lot of arguments, like the term Fake News itself, that are really about political power rather than accuracy or inaccuracy.
One other thing I wanted to mention: a lot of the conversations today have focused on Facebook, but Google is very much active in the fact checking space. There is a project called ClaimReview, which provides markup that fact checking organizations can use to highlight fact checking content. It is a really interesting project: it is open, not proprietary. Facebook recently announced it would start using ClaimReview data, adding it to its misinformation efforts. So I do think there are more coding projects going on around accuracy than we can discuss today.
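[Editor's note] For context, ClaimReview builds on the schema.org vocabulary: a fact checker publishes a small block of structured data alongside its article so that search engines and platforms can match the fact check to the claim. Below is a minimal, hypothetical sketch of that markup assembled in Python; the outlet, URLs, speaker, and rating are invented placeholders, not a real fact check.

```python
import json

# A minimal sketch of schema.org ClaimReview structured data.
# All names and URLs below are invented for illustration.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/checks/12345",  # page with the fact check
    "claimReviewed": "An example claim being checked",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Speaker"},
    },
    "author": {"@type": "Organization", "name": "Example Fact Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,          # position on the outlet's own scale
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False",  # the human-readable verdict
    },
}

# Publishers typically embed this as a JSON-LD <script> block in the
# fact-check page so crawlers can surface the verdict next to the claim.
print(json.dumps(claim_review, indent=2))
```

Because the markup is an open vocabulary rather than a proprietary API, any fact checker can publish it and any platform can consume it, which is the interoperability point being made above.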
>> Sure. Jes from the Charles Koch Institute. Very much appreciate the conversation. I wanted to step back a little and look at the market. If you look at the information provided via Google or Facebook or other major tech platforms, most of it is not news. My understanding is that most of the room, myself included, are really nonrepresentative Facebook and Google users: we see many more news stories in our feeds than others do. There was a remarkably transparent piece a while ago in which a publisher revealed that its traffic earlier this year basically fell off a cliff, mostly because of the shift in recommendations, and maybe because the way people are spending their time is on things that are not news.
And so, where I am going with this: when we hear proposals for publishers to be able to collude, or other sorts of mandatory subsidies, it seems like there are lots of other possible ways users might spend their time online. I get that there is a valid debate between Fake News and valid news, and about the ways people consume news and ought to engage with it, but how much attention should we pay to all the other things people might be doing with their time online?
>> I am going to take that. I think what I am hearing you say is that it is not as big a deal because people don't want news as much as we might think, that it is not as in demand. We are finding that, because of the digital ecosystem, news, at least by our metrics, is more in demand than ever; it is an exponential increase beyond any previously imaginable audience. So it is more about the revenue and monetization that we want to, as you put it, collude about.
To the point about whether it is what users on the platform want: Facebook themselves, when we met with them about being a good digital partner, said that 56% of users want more local news. That is data shared by your folks in New York. If users want more of it, then what they are asking is for the platform to adjust the algorithm, or whatever the mechanism might be, so they do get more of it. That is what the evidence is actually showing.
>> I will answer this question. More than a year ago I had someone from Facebook come talk to a bunch of product development people in New York, and the first question we asked was: what's news? The answer was revelatory. As he described it, there was social content and public content, and, forgive me if I'm wrong, the public content was one big bucket with very little differentiation in it; Vladimir's Russian blog sat in that same bucket, and that got them in trouble. Now Facebook is bringing back the question of quality news. But that raises the questions: what's news, what's quality, and so on. It is very difficult.
In the end, I think what will happen, and this is why I am trying to help with this, is that we have to help them decide what's news, and if what remains is better quality, that is going to be better. Facebook did announce that news is going from 5% to 4% of the news feed. That is for a typical user; we in this room see a lot more than 4% news, I'm sure. But to your point, most people, for most of their lives, are sharing cats. If that is all they do, maybe that's okay. I think we tend to overestimate the proportion of news in what the population as a whole sees.
Another thing: whether on Twitter or Facebook, I have trolls too, but there is this notion we have in the media of, oh my God, this is so awful. Look at your own feeds; really, they are kind of okay. They are not 98% crap. I think we have got to adjust our positions around what people are actually seeing and doing.
>> COURTNEY RADSCH: It is also interesting to look at the impact on the news media, which ideally are more fact-based, providing more factual content and fact checking as well. If I remember correctly, there were 11 countries where Facebook initially rolled out a new approach, changing the algorithm for news. What we heard were reports of local news organizations devastated in terms of the number of clicks they were getting, and therefore the revenue, and the salaries they could pay their staff. So it is an interesting challenge: all of this testing of the algorithm is not just affecting what users see; it is affecting an industry that relies, in large part, on the platform.
So we have about five minutes left. And not seeing any other okay.
>> We have one more.
>> COURTNEY RADSCH: Ask the question. Excellent.
>> Sorry. I want to set aside the hypocrisy of the American right for a moment. We have talked a lot about the first order question of how the platforms should recognize truth. My question is the second order one: how should we, or the Government, look at how the platforms are doing that? In concrete terms, could you give us some examples of things that could be done to assure people that it is being done fairly? To make sure I'm clear, I'm thinking of one example: it was Facebook, or maybe it was Twitter, noting that a particular piece of content doesn't get taken down if any one of the five fact checking services they work with thinks it is truthful. In other words, no single one of them can veto; they all have to agree. So I'm looking for examples like that, and for how we should think about responding to allegations that the platforms are serving a political agenda.
>> COURTNEY RADSCH: Take that one?
>> Okay. It sounds like you have something to think about; maybe you are not quite ready.
>> Yeah. It is a question I can take a stab at, just illustrating by example. One way that I think Facebook promoted trust around some of these issues was by publishing more detailed Community Standards in April, really showing the guidelines our reviewers use to enforce against content that is determined to violate the standards. Once people saw the level of thought that is put into it, and had a chance to grapple with what the rules are, they could look at a piece of content alongside the rules that govern it and have a better understanding of where we draw lines and why. We also made our rationale for the line drawing a little more transparent. I have seen a difference based on that decision. So I guess what I am saying is that transparency is to our benefit.
>> Can I ask a question? The Infowars case went on for a while, where through a few discussions it was said the content didn't violate the standards. Do you know which standards were found to be violated in that case?
>> I don't know the specific standards for each of the videos, and I don't want to speak out of turn. I believe it was potentially bullying and harassment, but I'm not sure.
>> Even when the standards that are written down are clearer than that, (inaudible) it is still hard to implement them.
>> COURTNEY RADSCH: Yeah, and I couldn't agree with that more, because I'm regularly in touch with the platforms about journalists and news organizations around the world that have their content taken down in a range of countries. You go to Facebook or Google or Twitter, and they are not able to give you information about why it was taken down because you are not the affected party, even when that party has asked you to intervene on their behalf. And never in the history of my career have I met a journalist or a video station that has gotten a remedy through the automated process. So when we talk about transparency, it is not just, hey, here are the rules; transparency has to run throughout the process.
I think we need to wrap up. So I want to give every panelist one last chance to leave us with a key takeaway, so that the folks in this room become more engaged in this issue of Internet Governance and will remember this conversation.
>> This is something I say to a lot of the audiences I talk to: the issue of accurate information on the Internet is not going to be solved by one group. It is going to require the platforms. It is going to require journalists, Governments in some cases, and average citizens. Everybody has a part to play in making the Internet a place where people can find accurate, credible information, not just lies.
>> I would just say that I totally agree. These issues are incredibly complex, and each decision we make, we are seeing, has tradeoffs. So engaging in discussions like these, engaging with our stakeholders, and getting feedback is vital to the work that we do. So thank you for the discussion.
>> Going back to what I said in the beginning, we strive to provide real news and adhere to a code of conduct.
>> JEFF JARVIS: Sounds like (inaudible). I want to underscore what Angie said earlier: I believe these are very early days. We are just figuring this thing out. And I have watched a lot of what has happened in Europe, where I think well-intended regulation has had bad, unintended consequences again and again and again. My basic message is: chill, we'll figure it out.
>> COURTNEY RADSCH: Remember that Fake News has real consequences; it lands people in jail because of what has been published online. And we have seen the proliferation of Fake News around the world. It matters what we do, and what the U.S. does is going to have global repercussions. I invite you to the next session, the Plenary on how the GDPR is having international consequences. Thank you so much. I appreciate your participation. And thank you to the panel and everyone online.
This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.