“It’s so tough to tell a fake news story from a real one,” says David Fink, a journalist who has been covering fake news for years.

“It doesn’t just look like it’s real.

It actually makes you feel like you’re reading fake news.”

“I’m going to call the cops on this one.”

The Trump administration’s ongoing investigation into Russia’s interference in the 2016 election is drawing national attention to the extent to which social media platforms have become a battleground for disinformation campaigns.

While the Trump administration has sought to deny the president’s claim that the Russia probe is a “witch hunt” against him, the social media platform platforms are now under increasing pressure to provide more transparency and to stop propagating disinformation.

The stakes are high for companies like Facebook and Twitter, which have grown into a critical mass of news consumers.

The sites have amassed an enormous amount of data about how people interact with one another and how their data is used.

But in a country that’s used the internet to disseminate information in ways that are far less easily traced than those of the West, social media companies are facing a growing public demand for a fuller picture of how people are interacting.

“I’m gonna call the police on this a lot,” Fink told The Hill.

“Because the stakes are really high.”

Facebook has been the center of intense scrutiny for the last year, with a raft of investigations, a congressional probe, a special counsel, and an investigation into its business practices.

A House panel last month released a bipartisan report that found the social network had used artificial intelligence to boost news feeds in an effort to boost the number of likes and shares it received.

Fink was the lead investigator into the Russian election interference probe.

“There is no evidence of Russian government involvement in the election.

There is absolutely no evidence that this election was hacked,” Finks’ report said.

“The only evidence is the social-media accounts that were used.”

In his report, Fink wrote that social media’s algorithms and the tools they use to identify stories and posts, as well as the “unprecedented and pervasive” use of bots to amplify content, all suggest a degree of coordination between Russia and Trump’s campaign.

But Fink also wrote that there was evidence that the sites were deliberately manipulating their algorithms to push out stories that were more favorable to Trump.

Facebook, for its part, has faced criticism for its silence.

“Facebook has long had a reputation as a platform that amplifies content for political ends,” a spokesperson told The Washington Post.

“In 2017, this pattern of bias was even more evident with the election cycle.”

Facebook also acknowledged the issue of bias when it reported in June that the company had received “nearly 300,000 complaints” about stories it had published, including about the president.

The company said it had already suspended about 4,300 accounts and removed nearly 2,300 posts.

The Washington Post reported that Facebook is currently conducting a review of its algorithm.

“We’ve made some changes to the way we track and filter the stories we publish, and we’re taking steps to strengthen our systems to identify and address fake news before it spreads to the public,” the spokesperson said in a statement.

But Fink said he worries that the companies are not taking the issue seriously enough.

“I think the issue is not that Facebook’s algorithms are not fair, but that it’s just not taking them seriously,” he said.

The Washington Examiner, which has long been critical of the social networks, also reported in May that Facebook was “actively looking to improve its algorithms to spot fake news” and that Facebook had already “taken a number of steps” to address the problem.

In the wake of the Russia investigation, Facebook has been particularly active in fighting fake news, pushing out a series of new features aimed at curbing the spread of misinformation.

Last month, it announced that users could now “report fake news,” a move that has since been used by news outlets across the political spectrum.

But there’s little evidence that Facebook has taken action to combat fake news directly.

The social network has also been slow to remove links to fake news content, while it has recently taken steps to remove articles that are widely shared across social media.

“Facebook has an important role to play in countering misinformation and disinformation,” the company said in an April statement.

“But we have also made some improvements over the past year to make our reporting more transparent and to remove misinformation and false news that is not true.”

But Finkle said that the Trump-Russia probe is more than just an issue of fake news.

He said that Facebook and other social media sites could be “making money off fake news without actually doing any good.”

“The way they’re working now, they’re taking a percentage of every click on their sites, and if they have a few million dollars of revenue, that’s the end goal,” he told The Huffington Post.”What we