Ash Bhat and Rohan Phadte
Ash Bhat and Rohan Phadte, students at UC Berkeley, have developed a bot to filter fake news stories on Facebook, the first of its kind.

BERKELEY, Calif. (Diya TV) — Fake news has become a serious problem for Facebook in the past year. The rampant spread of misinformation on the site has become such an issue that the social media giant has repeatedly struggled to combat it.

After initially denying his company and its platform had become a harbor for fake news stories, Facebook chief executive Mark Zuckerberg publicly vowed in April to address the problem head-on. Zuckerberg’s solution? To introduce third-party fact-checkers to vet information and stories appearing on the site, allowing them to create labels to warn users of potential fake news stories.

So far, the label hasn’t served as much of a solution.

Meanwhile, two students from UC Berkeley have decided to take matters into their own hands.

Ash Bhat, 20, and Rohan Phadte, 19, both computer science majors, developed a Facebook Messenger bot that, when fed a link, will immediately tell you whether or not an article is fake news. Named NewsBot, the tool took the duo only a matter of weeks to build, and it is currently the only tool of its kind.

In addition to assessing an article's validity, the bot offers a barometer showing whether the article is deemed biased toward the left or the right.

To train the algorithm the bot runs on, they fed it more than 10,000 articles from around the web. The first and largest batch was sourced from sites that skew far toward either end of the political spectrum: thousands of articles from Breitbart taught the algorithm to recognize right-leaning content, while articles from BlueDotDaily taught it to recognize the left.
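The article does not describe NewsBot's actual model, but the labeling approach above — treating articles from strongly partisan outlets as ground truth for each lean — is the core of any supervised text classifier. A minimal illustrative sketch, using invented placeholder snippets rather than real article text and a simple bag-of-words score instead of whatever model the students used:

```python
from collections import Counter

def tokenize(text):
    # Crude tokenizer: lowercase and split on whitespace.
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs.
    Returns per-label word frequency counts learned from the data."""
    counts = {}
    for text, label in examples:
        counts.setdefault(label, Counter()).update(tokenize(text))
    return counts

def classify(counts, text):
    """Score each label by how often its training set used
    the words in this text, and return the best-scoring label."""
    scores = {
        label: sum(c[word] for word in tokenize(text))
        for label, c in counts.items()
    }
    return max(scores, key=scores.get)

# Invented toy training data standing in for partisan articles.
training = [
    ("border security tough on crime", "right"),
    ("climate action healthcare for all", "left"),
]
model = train(training)
print(classify(model, "new healthcare climate bill"))  # → left
```

A real system would use far more data and a proper statistical model, but the principle is the same: the partisan sources supply the labels, and the model learns which vocabulary correlates with each side.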

While the two consider it a useful tool, its development is not yet complete. “We’re still making updates and changes almost every day,” Bhat said.

“We want to get people more informed and make decisions based on their views, as opposed to just being politically biased by what they read,” he added.

“Facebook should be proactive and make it visible that they’re fighting fake news. Right now you can mark an article as fake news from a small drop-down at the top, but if you’re a user just scrolling, the feed hasn’t really changed in any way. Facebook owes it to news organizations to bring trust back,” he said.