By Briana Bierschbach, Catharine Richert
MPR News/90.1 FM
At first glance, the photo seems plausible.
U.S. Rep. Keith Ellison is shown standing near U.S. Sens. Amy Klobuchar and Tina Smith. They are all Minnesota Democrats on the ballot this year, so it’s not out of the realm of possibility that they might be seen together. The image, with text scrawled across it, shows them awkwardly posed in front of a homeless tent encampment in Minneapolis, campaigning at a “town hall”-style event.
But the problem is, that event never happened. The photo is fake.
It’s a poor Photoshop attempt by an outside group, Right Now MN, to use misleading and fake information to influence the outcome of the election. In particular, the group is trying to tie Smith and Klobuchar to Ellison, who is facing an allegation of domestic abuse.
Using misinformation and propaganda is not a new tactic in campaigns, but it went to a whole new level with social media during the 2016 presidential campaign.
Those tactics are back in the 2018 midterms, so here’s a look at who’s behind this kind of misinformation, what’s being done to tamp down these efforts and how you can be a skeptic when fake content pops up in your feed.
What groups are trying to spread misinformation?
Here’s what we know: Elliott Olson, who leads a market research firm in Minnesota, started Right Now MN and also founded Right Now USA. Louis Fors Hill, a descendant of railroad magnate James J. Hill, is bankrolling the group, which has spent more than $200,000 on website services provided by a company called 1854 Inc.
That company appears to have been created exclusively to support Right Now MN and Right Now USA’s websites. The group has also spent a fair amount on the services of an attorney named Richard Morgan, who has supported Republicans in the past. And that’s where the paper trail ends.
And of course, there are plenty of groups outside of Minnesota trying to influence elections. A report issued by the FBI, CIA and National Security Agency in January 2017 concluded that Russian operatives reporting to President Vladimir Putin interfered in the 2016 race to help Donald Trump, and there are plenty of independent groups and actors as well.
So is this 2016 all over again?
The big difference from two years ago is that back then, groups targeted a single race: the presidential election. This year, groups are trying to play in the battle to control the U.S. House and Senate, and states with many competitive races are particularly vulnerable.
That includes Minnesota, which has four competitive congressional seats on the ballot, two U.S. Senate races and an open race for governor.
How are they sharing this information?
Fake content is almost exclusively shared and spread online, usually through social media platforms. According to the Brookings Institution, Facebook estimates 126 million of its users saw posts spread by Russian sources in 2016, while Twitter found 2,752 accounts established by Russian groups. Those groups tweeted roughly 1.4 million times two years ago.
What does fake content look like?
The Ellison ad is a good example of a fake meme: an image with text across it, often making references to something topical or in popular culture.
Visuals spread across the internet much more quickly than text and are more likely to go viral. But plenty of fake news articles and headlines also spread across platforms like Facebook in the last election.
How can I be a more skeptical voter and spot this kind of fake campaign material?
To dig into the origins of the fake Ellison meme, MPR News did what’s called a “reverse Google image search.” It’s easy: Just right-click on the image, save the file and upload it to Google’s image search.
That will turn up any photos that were used to create the new image.
In the case of the Ellison meme, the photo in the background was originally an MPR News photo from coverage of the homeless encampment. There were no politicians in the original photo. (MPR’s legal team has sent a cease-and-desist letter to the group.)
Voter misinformation can also appear in the form of tweets or even headlines. Take a step back when you read something that sounds off. Try a Google search for a suspicious-sounding headline to see if it pops up anywhere else. If it doesn’t, the article is probably fake.
And take a close look at the profiles of people posting suspicious-sounding tweets or other social media posts. Who are they? Are they journalists or other officials who would have access to such information? How many followers do they have? Are they verified users?
If not, be skeptical of anything they post. Such accounts are likely bots: fake accounts set up to disseminate false information.