News Literacy Project’s Peter Adams on how to fight fake news

ST. LOUIS — In a time when fake news has become normal, media literacy is a necessary skill for citizens.

As the featured speaker for Media Literacy Week, hosted by Gateway Media Literacy Partners, Peter Adams, senior vice president of the News Literacy Project, said that not preparing students to navigate the complex information landscape is civically disempowering.

“Information is the basis for young people’s civic agency and civic empowerment,” Adams said. “It’s not optional.”

In his presentation, Adams said opportunities for media literacy learning pop up multiple times every day.

By using current, popular fake news stories, teachers can help students learn how to spot fake news and how to evaluate both user-generated content and standards-based news.

“We fall into traps if we approach everything with one approach,” Adams said.

Standards-based content vs user-generated content

Standards-based content, like journalism, is grounded in key questions and in journalistic practices and standards. User-generated content can be created by anyone and follows no such standards.

Adams said a feature used in user-generated content is the misinformation strategy called false context.

False context is when an authentic photo has been placed in a new context, Adams said.

Adams’ example was a tweet from former Washington Post journalist Jose Antonio Vargas, which read, “this is what happens when the government believes people are illegal,” alongside a photo of a child crying in a cage.

“Except that boy is part of a protest where kids were in a cage as a demonstration,” Adams said. “That image went viral out of context.”

Adams said as teachers use this kind of example, students can begin to ask questions such as: “What can this tell us about the nature of human bias?”

“Vargas himself was detained as a youth. It struck a nerve with him and he shared it quickly,” Adams said.

Other user-generated content can be created and spread with a fake news generator, which lets people choose a URL closely resembling that of an authentic news site, Adams said.
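Lookalike domains of this kind can often be flagged automatically. The sketch below is an illustration of the idea, not a tool Adams presented: it uses Python's standard-library `difflib` to compare a domain against a small, hypothetical list of legitimate news domains, flagging near-matches that are not exact matches.

```python
from difflib import SequenceMatcher

# Hypothetical list of legitimate news domains, for illustration only.
KNOWN_DOMAINS = ["abcnews.go.com", "washingtonpost.com", "nytimes.com"]

def lookalike_of(domain, known=KNOWN_DOMAINS, threshold=0.75):
    """Return the known domain this one closely resembles, or None.

    An exact match is the legitimate site itself; a near-match above
    the similarity threshold is a possible impostor domain.
    """
    for real in known:
        ratio = SequenceMatcher(None, domain.lower(), real).ratio()
        if domain.lower() != real and ratio >= threshold:
            return real
    return None

print(lookalike_of("abcnews.com.co"))  # resembles abcnews.go.com
print(lookalike_of("nytimes.com"))     # exact match, so None
```

The similarity threshold here is an arbitrary choice; a real system would tune it and also check domain registration records rather than string similarity alone.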

People can also use a fake Trump tweet generator to create fabricated tweets under President Trump’s official Twitter handle, Adams said.

Manipulated content is another misinformation strategy used in user-generated content.

Adams showed example images in which influential people had been edited into photos. In other examples, text on objects such as shirts and signs had been altered to say something entirely different.

One example was an image of Aziz Ansari holding up a sign promoting voting by sending a text message to a specific phone number. Adams said this example was used for voter suppression because it falsely implies that people can have their vote counted through a text message.

Where does satire fit in a fake news environment?

“People say satire should be obvious and some things obviously are,” Adams said.

However, there is content created by partisan trolls that is made inflammatory through manipulated content and false context. The trolls then watch as people have authentically angry reactions in the comment feed, Adams said.

A study authored by Leo Stewart, Ahmer Arif and Kate Starbird of the University of Washington mapped fake news that was created by Russian trolls in 2016 and then spread by real Twitter users. All of the content used #BlackLivesMatter but was manipulated in different ways to appeal to right- and left-leaning Twitter users.

The study’s illustration shows two large blue masses made up of small blue dots, each dot signifying a single retweet. At the bottom corners, on opposite sides of the blue masses, orange lines congregate into orange masses.

“You can almost see the Russian trolls working in opposite directions. Their goal is to polarize and deepen divisions as much as possible,” Adams said.

The future of fake news technology lies in computer algorithms that use video footage of a public figure to recreate that figure in fabricated footage, Adams said.

New algorithms can also learn how to create convincing human faces that belong to no one.

How to combat fake news

It’s not possible to shield people from misinformation and fake news, but Adams said educating students to properly evaluate content and detect fake news can reduce the effect of fake news and misinformation on a society.

To prevent the rapid spread of false information, teachers can show students verification practices, including reverse image searches and hashtag searches, that they can use on user-generated content to see what is being manipulated or represented falsely, Adams said.
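Reverse image search itself depends on large search-engine indexes, but the core idea of recognizing a recirculated photo can be sketched with standard-library hashing: identical file bytes produce identical digests, so a re-uploaded copy of a known image can be matched exactly. This is an illustrative sketch (the index contents and function names are hypothetical), and unlike a real reverse image search it cannot match images that have been cropped, resized or re-encoded.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """SHA-256 digest of the raw file bytes: a fingerprint for exact copies."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical index mapping fingerprints of previously seen photos
# to the context in which they first appeared.
seen = {
    fingerprint(b"<bytes of 2014 protest photo>"): "2014 protest demonstration",
}

def check_context(image_bytes: bytes) -> str:
    """Report whether an image is already known from an earlier context."""
    digest = fingerprint(image_bytes)
    if digest in seen:
        return "Seen before: " + seen[digest] + " -- check the original context"
    return "Not in index -- try a reverse image search"

print(check_context(b"<bytes of 2014 protest photo>"))
```

Running the check on bytes already in the index reports the earlier context, which is exactly the false-context question students are being taught to ask.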

“Information is the basis for a robust democracy, so to the extent that we’re news literate, consuming credible information and reading a variety of sources we are going to have a better understanding of social problems,” he said.

A better understanding of social problems leads people to demand better policy solutions and to believe less of the manipulated information and ever-present propaganda.
