Recap: Miami, FL Forum

Knight Commission on Trust, Media and Democracy
Miami, FL – Monday, February 19, 2018

Richard Adler

The Knight Commission on Trust, Media and Democracy held a workshop at the Marriott Marquis in Miami, Florida on the afternoon of Monday, February 19, 2018.  The first part of the afternoon was devoted to two panels of experts on media, technology, and minority media, followed by Commission deliberations.  This is our rapporteur Richard Adler's summary of the session. We encourage you to read the summary in its entirety.

Tony Marx opened the session by noting that he is more concerned than he has ever been for the future of democracy.  In the wake of the shooting at Marjory Stoneman Douglas High School in Parkland, Florida, students are saying that politics is not working and that they are not willing to wait for adults to figure out what to do to protect them.

The work of the Commission is especially important in this context.  He restated the four questions that he believes need to be answered (and reminded Commissioners to submit their questions if they have not already done so):

  1. How do we help people discern fact from fiction?
  2. How do we know what the source of information is; who is paying for ads?
  3. How do we encourage each other to get out of our bubbles? How can we mobilize technology to enable us to have conversations across political or ideological lines?
  4. How do we make sure local news continues to exist?

Panel I: Media and Trust: What are the best practices for media organizations to follow to engender trust?

Jay Rosen, Professor of Journalism, NYU, author of PressThink blog.
Based on his long career in journalism, he believes today is the darkest time for a free press in more than 30 years. News organizations need to accept that trust isn't given automatically but needs to be produced, like news itself. There is a set of practices that can restore trust in journalism by making an opaque practice transparent. This is not how journalism is practiced today, but he is convinced that these practices will become standard procedure in five or ten or twenty-five years. Key principles:

  1. “Here’s where we’re coming from.”  The mainstream standard for news is professional objectivity – “the view from nowhere.”  But there is something about the claim of objectivity that invites mistrust. The effort to increase diversity in the newsroom contradicts the concept of objectivity: the assumption behind the drive for diversity is that reporters from different backgrounds will bring different perspectives to the news. Yet, when they go to work, journalists quickly learn to remove their perspective.
  2. Maintain very high standards of verification. A commitment to verification is not new; it is the core of journalism. But rigorous fact-checking (“bullet proofing”) is essential to increasing trust. This implies a high standard for correcting mistakes.
  3. Provide a clear statement of journalistic priorities. News organizations should state publicly what they are devoting resources to and why—a way to be explicit about an organization's agenda.
  4. Establish a continuous improvement regime for listening to the public. Listening is so important that it has to be subject to continuous improvement. News organizations need to know what their readers are interested in, what they think of their work.  (An example: Hearken, which starts by asking readers for their interests.)
  5. Strive for excellence in seeking and responding to criticism. Related to listening. Be able to sort fair from unfair criticism. Example: ProPublica invites criticism of its stories, publishes it with a point-by-point rebuttal.
  6. Show your work. Don't believe us; see for yourself: here is our evidence. This should also include explaining how the journalism is done, how the organization works. This has not been a priority for journalists.
  7. Help us investigate. Join with us in investigating – provide data, look through documents. Example: Steven Brill's Time cover story on health care costs started with people sending him their hospital bills for him to investigate.
  8. Share what it costs. Let people understand how expensive it is to do serious, high quality investigative journalism.
  9. Practice constructive/solutions journalism. It is not enough to show problems; news stories need to explain what can be done about them. Example: what you can do to reduce your carbon footprint.
  10. Keep track of old stories. News is “one story after another.” Readers get frustrated when they hear about problems but never hear about them again. Need to track a moving story over time.

Journalism today is an opaque process; it needs to be made transparent. Not everyone will want to know how journalism is actually done, but what is important to building trust is that the information is available.

Journalism “took a wrong turn” starting with Theodore White’s backstage accounts of presidential campaigns, Newsweek’s Conventional Wisdom feature, and daily and weekly talk shows: journalists began to focus political coverage on the inside game of professionals, consultants, etc.  The results were disastrous: “savvy journalism” invites readers to identify with the political insiders and treats voters as objects to be manipulated rather than as engaged citizens. It promotes the idea that you can cover politics without a point of view (“news from nowhere”).

Raney Aronson-Rath: This year, Frontline experimented with transparency on a program about Vladimir Putin: they published all of the transcripts, about 70 minutes of additional material. They found that viewers spent 12-18 minutes reviewing this material. It required a “huge amount of work”: transparency is expensive.

Sara Lomax-Reese, President and CEO, WURD Radio.
This may be a moment of national crisis, but in minority communities the trust crisis is not new.  She has spent 25 years creating content for the African-American community. In 2010, she started running WURD as an owner-operator. It is the only African-American talk station in Pennsylvania, and one of just three in the U.S.

Ownership matters for trust in minority communities. The 1996 Telecom Act, by eliminating ownership restrictions, eviscerated many small radio operators: iHeartMedia (formerly Clear Channel) grew from 40 to more than 1,000 stations. Philadelphia is 25% African-American, but WURD is the only talk radio station serving this community.

People are looking for authentic connections; they want to feel part of a community. WURD's goal has been to create trust. They serve as a bridge to connect their community to resources and provide inspiration: they show the diversity, complexity, and humanity of the community.

To build trust:

  1. History matters: What is your relationship to the community? Be honest about it.
  2. Messengers matter: Need real diversity in the newsroom. It is not enough to hire minorities; they must be empowered to speak honestly.
  3. Context matters: They can say things that resonate differently than they would if said in mass media. The Philadelphia Inquirer’s violence project told stories of people impacted by violence, including a story on fathers and sons who were both incarcerated. Even though it was well researched and well written, she was uncomfortable reading it; she was concerned that it reinforced stereotypes.
  4. You have to genuinely love your readers/listeners: You need to believe your work will make a difference in their lives. Cultivating real trust takes patience, consistency, investment, and a deep understanding of the historical context in which you are operating.

When asked if this model could scale, she responded that she believed it could, but it is high-touch, which makes it challenging to replicate. The key is to build on people who have already gained trust within a specific community.

WURD also functions as a “release valve” for anger, especially for people who have been voiceless and have had no outlet. They don't screen their calls, so hosts need to know how to manage callers' emotions without shutting them down. They recognize that some topics are “red meat,” but they cover a variety of topics. She has experimented with partnering with others (WHYY, Philadelphia Magazine) to link their communities.  The station actually has many white listeners who are curious about what is happening in the African-American community.

Mindy Marques, Executive Editor and Vice-President, Miami Herald
She oversees two newsrooms, the Miami Herald and the Spanish-language El Nuevo Herald.  She is always thinking about their audiences. Their readers get upset about national news, but not local news. Even when they are unhappy with national coverage, they value local coverage.

People are familiar with reading print newspapers, but are less familiar with online news. What is missing are clear “markers” – e.g., factual stories vs. opinion. Standards for print developed over a long time; now we need similar standards for online news.  Even in print, people don’t always understand columnists’ freedom to express their opinions.

Not all topics are equally popular with readers: they need to use the most popular content to subsidize less popular but important content.

Online offers opportunities for journalists to be more transparent, but it comes with its own dangers: one of her reporters was using Twitter to seek information on victims of the Parkland shooting, but her messages were altered to make it appear she was looking for photos of dead bodies. This created a controversy that was hard to counter.  Even the actual tweets generated some negative reactions: showing how the sausage is made may not make all readers more comfortable.

This may be a golden age for national news coverage, but local and regional news is being decimated. They share the cost of a bureau in Tallahassee, but the five reporters there are half the number of a few years ago.

McClatchy is attempting to refocus newsrooms on their core mission and their unique capabilities. They need to know much more about their readers and need better ways to get feedback.

Tim Marema, Editor, Daily Yonder
The Daily Yonder started in 2001 under sponsorship of the nonprofit Center for Rural Strategies to provide coverage of rural America. They are showing how local news can have national significance.

Trust is on the decline not just for journalism, but across institutions.  Journalism is in the middle, more an effect of declining trust than a cause.

The good news is that readers still trust local news more than national news. Much of local news is practical, not political (“How can I get my potholes fixed?”).  There are still more weekly papers with fewer than 25,000 subscribers than dailies with more than 50,000 subscribers. Some organizations (Clear Channel, Sinclair) have figured out how to combine local with national content.  But most national media have lost touch with small communities.

Panel II: Conversations with the Commission: Algorithms, platforms, and trust.

Tim O’Reilly, Founder and CEO, O’Reilly Media
What do people in Silicon Valley mean when they say, “they just build algorithms”? Is this arrogance or naiveté? No.

Facebook cannot manually manage two billion users who make seven billion posts every day. Google cannot handle billions of queries without automation.  The scale of the challenge means that they have to use automated processes – algorithms, which are just rules for sorting content (like a bank that can automatically sort coins of different denominations).

For example, an algorithm can review content for specific characteristics and flag items that could be problematic for human review. Its function is not to determine truth or falsity, but to spot content that is suspicious.  The goal of algorithms is to minimize bad results and to reduce the need for manual intervention. Increasingly, algorithms are being generated automatically by other algorithms, which keep getting better by learning from their mistakes.

Much of what humans are doing can be done by algorithms. There are lots of markers that can be used to identify bots or fake news sources (e.g., the Denver Guardian, which sounds authentic but isn't).  We need to find algorithmic markers of trust, focus more on the needs of users than of advertisers, and develop business models that rely more on user support than on advertising.
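
To make the idea concrete, here is a minimal, purely illustrative sketch (in Python) of the kind of rule-based triage described above: automated checks route suspicious items to a human reviewer rather than deciding truth themselves. Every signal, threshold, and name in it is hypothetical and invented for illustration; it does not describe any platform's actual rules.

```python
# Illustrative sketch only: rule-based triage that flags suspicious posts for
# human review. All signals and thresholds are hypothetical.

from dataclasses import dataclass
from typing import List


@dataclass
class Post:
    text: str
    source_domain: str
    account_age_days: int     # hypothetical signal: very new accounts are riskier
    shares_per_hour: float    # hypothetical signal: abnormal virality


# Example of a look-alike source of the kind mentioned above (the "Denver Guardian").
SUSPECT_DOMAINS = {"denverguardian.com"}


def flag_for_review(post: Post) -> List[str]:
    """Return the reasons a post should be routed to a human reviewer (empty list = no flag)."""
    reasons = []
    if post.source_domain in SUSPECT_DOMAINS:
        reasons.append("known look-alike source")
    if post.account_age_days < 7 and post.shares_per_hour > 500:
        reasons.append("new account with abnormal spread")
    if post.text.isupper():
        reasons.append("stylistic spam marker (all caps)")
    return reasons


if __name__ == "__main__":
    sample = Post(text="SHOCKING NEWS YOU WON'T BELIEVE",
                  source_domain="denverguardian.com",
                  account_age_days=2, shares_per_hour=1200.0)
    print(flag_for_review(sample))
    # -> ['known look-alike source', 'new account with abnormal spread',
    #     'stylistic spam marker (all caps)']
```

The point of the sketch is the division of labor O'Reilly describes: the rules only surface candidates, and the judgment about truth or falsity stays with people.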

Craig Silverman, Media Editor, BuzzFeed
To explore the problem of bad information, here is a case study of Native American content on Facebook (which also applies to other platforms). There is a massive number of groups with lots of Native American content. Much of it is plagiarized from authentic Native American sites; some of it is simply false click-bait.  The pages display many signals of authenticity: many fans, many (false) U.S. addresses, an emoji that resembles a verification check mark.  In fact, these sites are being created in places like Kosovo and Vietnam purely as a way to make money. They target Americans because they are more affluent and can generate more income. Unfortunately, these purveyors of fake content are highly skilled at gaming the system and beating the algorithms; their bad content is overwhelming authentic content.

Some lessons from this example:

  • The core element of human existence is sense-making. But the amount of change in the information ecosystem is destroying our ability to make sense of what is true and what is not.
  • Much of the overseas involvement in American social media is politically and economically driven. We have an amazingly democratic system, but it is easy to exploit: very big, very algorithmically driven. Content coming from abroad is a form of globalization.
  • Any marker of authenticity can be faked or fabricated: you can buy fans or set up a credible-looking website. New trust systems need to be hard to fabricate.
  • The old mantra was “trust but verify.” Maybe the new mantra should be “verify, THEN trust.” But what toll is taken if the default attitude is suspicion? How many people are able or willing to adopt such a skeptical attitude toward what they encounter online?

We are just beginning to explore the power of algorithms: BuzzFeed has created algorithms to analyze scoring in figure skating and match fixing in tennis, and a team at ProPublica is building algorithms to investigate algorithms.  This kind of algorithmic journalism is happening, but not yet on a big scale; most newsrooms don't have the technical capabilities to do it.

Maria Elena Salinas, Independent journalist
She worked as a correspondent and an anchor for Univision for many years and covered nine elections. People in power have always targeted the media, but the attacks on the media are now at a new level.  The powerful used to be afraid of the media: they would hide from it or shut it down.  The new way is to discredit the media.  How did “fake news” go from slogan to a real threat?

People have always been skeptical of politicians, but skepticism of media is more troubling.  How did it happen? Do media bear responsibility? Part of the answer is blurring the line between news and opinion. She once asked Lou Dobbs how he could be so critical of immigrants. He pointed out that the banner under his name said “news commentary.”

No sector of society has been more impacted by Trump than the Hispanic community.  Since the day of the election, the media have been accused of lying and of inciting fear and violence for reporting on what is happening to immigrants. We have turned into a society in which we believe the people we want to believe.

Jimmy Wales, Co-Founder, Wikipedia
Wikipedia operates with no algorithms: everything is done by a volunteer community of humans. It is the fifth most popular website; 1.3 billion devices visit Wikipedia each month.

Fake news has had zero impact on Wikipedia.  Wikipedia isn’t perfect but it is beloved and trusted. Users know that they can get information that is thought about, debated. Core value: maintaining a neutral point of view.

How to build a community around news?  Journalists see themselves as separate from their audience.  Pure citizen journalism isn’t the solution. Citizens can’t do what professional journalists do – e.g., drop everything to pursue a story, know how to interview people.

The business model for news is broken – advertising model is part of the problem.  Good news: paid circulation for The New York Times, other publications is soaring.

His latest experiment is WikiTribune, a hybrid model with journalists and readers working together. The goal is to reduce the cost of doing quality journalism. The focus is on broad current affairs with supporters across the English-speaking world, but he is hoping to launch a local pilot in one place.

He needs to find a sustainable business model. WikiTribune was launched through crowdfunding and a grant from Google. The goal is to get regular monthly subscribers, not donations.  He likes Jeff Bezos, but is suspicious of the model of media subsidized by billionaires; he hopes that Bezos can make The Washington Post profitable.

Discussion 
Jimmy Wales stated that he was impressed that BuzzFeed is now doing investigative reporting, which is expensive. Why are they doing it? Craig Silverman explained that BuzzFeed started purely with entertainment but noticed that when big news events happened, they couldn't deal with them and became marginalized. If they wanted to deal with the stuff people care about and share, they had to deal with news. And they found that top-tier advertisers want to be associated with quality and that there is a halo effect from covering real news.

Humans vs. algorithms
Raney Aronson asked whether real investigative journalism – “corruption hunting” – could be done by an algorithm.  Tim O’Reilly replied that tech should be an ally of journalists. Much more can be done with data journalism. Can create “systems of accountability.”

Ethan Zuckerman: To produce quality content, you either need a lot of dedicated volunteers or sophisticated algorithms. He is much less sanguine than Tim O'Reilly about the potential of algorithms, or than Jimmy Wales about human contributions. Algorithms are imperfect, and a Wikipedia-style approach is expensive in the time it requires from people. He is still deeply uncomfortable with the prospect that platforms will become the arbiters of truth. If distrust is costly, who will pay?

Craig Silverman: he is a fan of the hybrid human/tech model. Over the past six months, platforms have drastically changed their message: they now acknowledge they are tilting the playing field, increasing human involvement in screening, etc.  Even with great tech, human oversight is vital.  It is risky to put too much responsibility on readers – they are likely to retreat into what they are comfortable with.

Tim O’Reilly: Is putting trust in platforms more risky than trusting three TV networks to provide all the news? It is possible to build a better media ecosystem, and we should aim to do so. But it is an ongoing battle. We are just starting to try!  Big tobacco and big oil only began to change after 50 years of denial about problems they created. We are just beginning to see the platforms attempt to change, to become more socially responsible.

John Thornton: he started working in non-profit journalism 10 years ago. He is tired of hearing that its funding model is a bridge to something else. Reality is that newsgathering has declined every year for 10 years. He doesn’t believe “we’re about to find something else.” He sees no evidence of a better model lurking out there. We are the ones who need to figure it out.

The paradox of trust
Craig Silverman:  The paradox of trust – the more you admit mistakes, the more you are worthy of trust. But now good-faith mistakes are being weaponized. InfoWars criticized the AP for reporting that the Parkland, Florida shooter was a member of a white supremacist group (a claim the AP quickly corrected), but InfoWars never acknowledged its own misinformation. Nevertheless, the press can't abandon making corrections.  “Need to merchandise corrections” – make them an asset.

Charlie Sykes: There is now an asymmetry in trust – news organizations correct mistakes, but we have an administration that never admits a mistake. Making a correction is seen as an admission of being “fake news.”

Maria Elena Salinas: Univision was highly trusted, even more than the Catholic Church among Latinos, and that trust was personality driven.  With this election, there was real fear about the results.  Their audience said, “You lied to us that Hillary would win.” Trust in Univision has gone downhill since then, and mistrust in media has deepened. She worries that when Trump is gone, we will still have a divided country.

Regulating algorithms
Nuala O’Connor: She is a devoted supporter of the First Amendment and Section 230.  But what about ethics of algorithms?  Inherent bias can be embedded in architecture. Are platforms algorithm editors or are they media companies?

Tim O’Reilly: Google is operating within the existing legal regime. But what is the ideal regulatory regime?  What would a regime look like that recognizes the editorial decisions that platforms are making?  Curation is different from authoring content.

Jimmy Wales: It’s OK that Google doesn’t make specific editorial decisions, but it’s not OK for them to say, “we don’t have any editorial policy.”  Example:  Wikipedia’s content was being massively cloned, and Google searches were sometimes pointing to these copies rather than to Wikipedia.  Google recognized the problem and created an algorithm that could identify the original source.
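
As a purely illustrative sketch of how duplicated content can be traced back to its original source (this is not a description of the algorithm Google actually built, which was not discussed in detail), one simple approach is to fingerprint page text, group near-duplicates, and prefer the copy that was seen earliest. The data structures, threshold, and example URLs below are hypothetical.

```python
# Illustrative sketch only: group near-duplicate pages by text similarity and
# treat the earliest-seen copy in each group as the likely original.

from datetime import date
from typing import List, Set, Tuple


def shingles(text: str, k: int = 3) -> Set[Tuple[str, ...]]:
    """Word k-grams used as a crude fingerprint of a page's text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}


def jaccard(a: Set, b: Set) -> float:
    """Overlap between two fingerprints (1.0 means the texts are effectively identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0


def likely_original(pages: List[dict], threshold: float = 0.8) -> dict:
    """Among pages whose text closely matches the first page, pick the earliest-seen one."""
    fingerprints = [(page, shingles(page["text"])) for page in pages]
    _, base_fp = fingerprints[0]
    duplicates = [page for page, fp in fingerprints if jaccard(base_fp, fp) >= threshold]
    return min(duplicates, key=lambda page: page["first_seen"])


if __name__ == "__main__":
    pages = [
        {"url": "mirror.example/radio", "first_seen": date(2018, 1, 5),
         "text": "The encyclopedia entry describes the history of radio broadcasting in detail."},
        {"url": "en.wikipedia.org/wiki/Radio", "first_seen": date(2016, 3, 1),
         "text": "The encyclopedia entry describes the history of radio broadcasting in detail."},
    ]
    print(likely_original(pages)["url"])  # -> en.wikipedia.org/wiki/Radio
```

A real system would have to work at web scale and resist the kind of gaming Craig Silverman described, but the sketch captures the policy point: a platform can systematically prefer the original source without making case-by-case editorial calls.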

Anthea Watson Strong: she works at Facebook and used to work at Google. Her job is to build algorithms related to local news. Algorithms are a check on the power of platforms. Given the scale of the platforms, humans can't make decisions one at a time. Journalists need to be more involved in the conversation about algorithms; more dialog is needed between platforms and journalists.

Jimmy Wales: Facebook’s situation is different than Google’s:  fake news is a big problem for Google, whose mission is “to provide quality content.”  But fake news isn’t necessarily a problem on Facebook if it is being shared by a friend for a good reason.  Facebook can’t interfere too much with choices made by friends without implying that “a lot of people have stupid, naïve friends.”

Tim O’Reilly: Some incredibly creative work is going on at Facebook on building markers for truth based on people’s networks. Still early in the process. Don’t kneecap the systems – don’t impose old rules on them. We will discover new rules.

Sean Gourley: We are already quite good at distinguishing humans from machines, good at detecting bots. But it isn’t being done because in an advertising-driven model, it has a real economic cost. Facebook revised its estimate of fraudulent users from 1% to 4%, which is the equivalent of losing $2.4 billion in revenue.

There are both positive and negative trends in technology. Negative trends include that opinion formation and propagation are highly vulnerable to hacking; artificial language generation can create many fake stories, which can be quickly A/B tested for effectiveness; and there is a massive misalignment of economics of platforms and social good.

Tim O’Reilly: The key to solving these problems is economic: follow the money.