Combating Cyber Warfare

The article below comes from a blog publication.

For Russia especially, viral content has become a powerful weapon. In September, Twitter discovered 201 Russian-linked accounts dedicated to spreading fake outrage, while Facebook found about 500 accounts doing the same. These accounts pretended to be gun rights advocates and Black Lives Matter activists, taking up both sides of debates, with the primary goal of making noise. Altogether, the fake accounts on Facebook had been seen more than 10 million times, and that was just for sponsored content.

If fake news is meant to misinform people, fake fights are designed to divide and distract. By spreading outrage, Russian trolls are able to bury legitimate news while driving people further apart. Manufactured conflict helps a country spread its propaganda effectively, a key strategy detailed in Russia's military doctrine approved in 2014.

“Cybersecurity is no longer about protecting our data from theft,” Rep. Don Beyer, a Democrat from Virginia, said during a hearing on cybersecurity last week.

“It’s also about defending our democracy from disinformation campaigns that combine cyber assaults with influence operations.”

Russian magazine RBC investigated a Russian trolling operation and found that it reached 30 million people a week on Facebook at the height of the 2016 US presidential election (the article, in Russian, has not been translated; an English summary is available here).

Here’s how Russian trolls used social media to effectively wreak havoc in the US.

Talk a-bot it

Going viral isn’t as simple as flipping a switch, but for Russian troll factories, with access to an army of bots on social media, it might as well be.

Ben Nimmo, a defense and international security analyst with the Atlantic Council’s Digital Forensic Research Lab, described the manufactured viral content as a three-step process.

“The goal of a propagandist is to spread your message, and the best way to do that is to get people to do it for you,” Nimmo said. “You can’t tell a million people what to do. You need to get 10 people, and they spread it en masse.”

The campaign’s goal is to get the topic onto the trending hashtags list, which would mean the fake outrage has hit the mainstream.

Nimmo has been following the spread of fake news and of trolls using bots to push propaganda. Across these campaigns, he’s identified an attack with three stages:

  • Shepherd accounts are run by highly active and influential people and kick off a trending topic. A shepherd account like @TEN_GOP, a Russian-backed account disguising itself as a conservative group in Tennessee, directs outrage toward a specific issue the trolls want to proliferate. @TEN_GOP had 115,000 followers and interacted with high-profile players such as former NSA adviser Michael Flynn and political consultant Roger Stone.
  • Sheepdog accounts follow and are also run by humans. They retweet the stories and add aggressive comments, to give the appearance that it’s a legitimate viral moment and not a fabricated trend.
  • Sheep accounts are bots that come in once the propaganda has settled, solely to puff up engagement with artificial retweets and likes. The thousands of unchecked retweets trick onlookers into believing the fabricated argument is a real issue, and soon enough, it becomes one.
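The three-stage pattern above can be sketched as a toy simulation. Everything here — the function name, the parameter values, the trending threshold, and the "too many bots gets you flagged" cutoff — is an invented illustration of the shepherd/sheepdog/sheep structure, not a description of any real platform's trending algorithm.

```python
def run_campaign(n_sheepdogs, n_sheep, retweets_per_sheep, trend_threshold):
    """Toy model of the shepherd -> sheepdog -> sheep amplification pattern.

    All numbers and thresholds are illustrative assumptions.
    """
    # Stage 1: a shepherd account seeds the hashtag with one post.
    engagements = 1

    # Stage 2: human-run sheepdog accounts retweet and add aggressive
    # comments, making the push look like organic outrage.
    engagements += n_sheepdogs * 2  # one retweet + one comment each

    # Stage 3: sheep bots pile on with automated retweets and likes.
    engagements += n_sheep * retweets_per_sheep

    # The campaign "succeeds" if engagement crosses a trending threshold
    # without the bot share being so lopsided that it gets spotted --
    # the thin line Nimmo describes below.
    bot_share = (n_sheep * retweets_per_sheep) / engagements
    trending = engagements >= trend_threshold
    flagged = bot_share > 0.95  # "if you make it too many, you get spotted"
    return engagements, trending, flagged

total, trending, flagged = run_campaign(
    n_sheepdogs=200, n_sheep=2000, retweets_per_sheep=3, trend_threshold=5000)
print(total, trending, flagged)  # → 6401 True False
```

With enough human sheepdogs in the mix, the bot share stays just under the (hypothetical) detection cutoff while total engagement clears the trending bar — the balancing act the next paragraphs describe.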

The attacks are not always successful, and Twitter has gotten better at spotting bot campaigns. There’s always a thin line that trolls have to walk to make sure their campaign goes viral without getting caught.

“If you make it too many, you’re going to get spotted. If you make it too few, you won’t go viral,” Nimmo said.

In a Sept. 28 blog post, Twitter said it has had measures in place to prevent bots from gaming the Trending Topics list since 2014. It found an average of 130,000 shepherd accounts a day following the process Nimmo described. Over the last year, its automated system caught 3.2 million suspicious accounts per week, a Twitter spokeswoman said.

But while bots are relatively easy to spot, campaigns controlled by humans are much harder to detect.
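To see why, consider a crude behavioral filter of the kind that catches "sheep" bots. The thresholds and fields below are invented for illustration; real platforms use far richer signals (network structure, timing, content). The point is that a human-run sheepdog account posts at a plausible rate and slips right through.

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int            # account age
    tweets: int              # lifetime tweet count
    retweet_fraction: float  # share of activity that is retweets

def looks_like_bot(a: Account) -> bool:
    """Flag very young, hyperactive, retweet-heavy accounts.

    Thresholds are illustrative assumptions, not any platform's real rules.
    """
    tweets_per_day = a.tweets / max(a.age_days, 1)
    return tweets_per_day > 100 and a.retweet_fraction > 0.9

# A "sheep" bot trips both thresholds...
sheep = Account(age_days=3, tweets=1200, retweet_fraction=0.99)
# ...while a human-run "sheepdog" posts at a believable rate and evades them,
# as does an ordinary human user.
sheepdog = Account(age_days=200, tweets=5000, retweet_fraction=0.7)
human = Account(age_days=900, tweets=4000, retweet_fraction=0.4)

print(looks_like_bot(sheep), looks_like_bot(sheepdog), looks_like_bot(human))
# → True False False
```

Mechanical signals separate bots from people; they can't separate a paid troll from a genuinely outraged citizen, which is exactly what makes the human tiers of these campaigns so durable.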

As I wrote in my other article on cybersecurity, you have to be vigilant or risk becoming a bot for a foreign country looking to use your web assets for its nefarious acts.

E. Bishop III, The Money Connection.Com

“Following the Money and Criminal Cyber Activity”
