WhatsApp tries to snuff out fake news ahead of Indian election

WhatsApp has launched a new fact-checking service for its Indian users.

Facebook and WhatsApp have been slow to react to the fake news crises that have spread across both networks. Recently, however, it has been encouraging to see WhatsApp ramp up operations to try to smother fake news. Limiting the number of people users can forward messages to slows the wholesale spread of untrustworthy articles, and labeling messages with the number of times they’ve been forwarded gives users more information about the messages they’re receiving.

A report by Reuters, however, shows that the WhatsApp team is, quite rightly, going even further in a bid to stamp out the spread of dangerous fake news across the network.


The new WhatsApp helpline will help the company gain a better understanding of the fake news stories being spread via the app.

Reuters reports that WhatsApp has set up a new fact-checking line in India. The news comes at a crucial time, with Indians heading into national elections later this month. With a population of well over a billion people, India’s national elections are the biggest democratic exercise on the planet, so it is easy to see why WhatsApp appears to be beefing up its fake-news-fighting operation in the build-up to such an important event. WhatsApp is estimated to have well over 200 million users in India.

This new move sees WhatsApp teaming up with a local startup called Proto. According to a WhatsApp statement, users will be able to share messages with Proto so that the startup can verify their authenticity. With a user base of 200 million people in the grip of a fake news crisis, this is quite the remit for a local startup. In fact, since the WhatsApp statement announcing the partnership earlier this week, it has come out that the task is indeed too large for the Proto team to carry out.

A subsequently released FAQ page on the Proto website explains that not all users will receive a response if they send messages to the startup. The page says, “The Checkpoint tipline is primarily used to gather data for research and is not a helpline that will be able to provide a response to every user. The information provided by users helps us understand potential misinformation in a particular message, and when possible, we will send back a message to users.”

The Proto partnership is more about learning about fake news than preventing its spread in the immediate-to-short term.

Screenshots of the new WhatsApp Checkpoint partnership

The FAQ page goes on to say, “We would like to verify every rumor, but we know that will not be possible given the diversity of information we will receive and the limitations of any verification research.” When both Reuters and Buzzfeed ran tests on the service, neither received a response from the Proto team: Reuters had heard nothing after two hours of waiting, and Buzzfeed staff reported no responses to their messages even after a 24-hour period.

This new effort from WhatsApp has to be applauded, but again it seems like too little too late. If this method of collecting data proves effective, the WhatsApp team will be able to learn a lot about the types of fake news stories people are sending via the messaging app. Unfortunately, this project will do little to nothing to prevent the spread of fake news during the campaigning season for the Indian national elections. As one TED research fellow who specializes in misinformation told Buzzfeed, “This should have happened years ago.”

If you receive a message you believe to be fake, you can send it to the new fake news tipline by clicking here or sending it via WhatsApp to +91-9643-000-888. You might not receive an answer to your particular message, but you will help the long-term battle against the dangerous spread of misinformation.

This Chrome extension wants to help you trust the news again

Nobias will help you determine if a news story is trustworthy and whether it has a political leaning.

We’re living in an age when it is harder than ever to know whether you can believe the news story you’re reading. This is about so much more than the “Fake News” the president talks of any time a critical story comes out about him or members of his administration. This is about stories that come to us via social media and other communication platforms like WhatsApp. There are now thousands of stories shared on a daily basis that present false facts or push some sort of political agenda.


With political chaos spreading around the world, it is clear that these polarizing fake news stories and one-sided opinion pieces are causing real problems. It is now more important than ever to know that the news you’re basing your opinions on is objective fact rather than subjective opinion.


It wasn’t too long ago that Microsoft introduced NewsGuard technology into the Microsoft Edge browser for mobile. NewsGuard automatically flags fake and untrustworthy sites so that you know when a story you’re reading shouldn’t be trusted. Now, thanks to Nobias, Google Chrome users have something similar.

Nobias is a free extension for Google Chrome. According to the Nobias homepage, the extension fits perfectly with the company’s mission of “promoting responsible/inclusive technology to protect consumers from deceptive or misleading content on the internet.”

There is also an education element. The Nobias site says, “We hope to help people understand the landscape of media bias and to give them the power over the algorithms that shape what they read and see online.” The extension aims to show users a clear message about the story they’re currently reading while also showing how current technology influences the news stories they see most often.

The Nobias extension uses a machine learning algorithm to look out for certain keywords and phrases. It can then show whether a story has been written with a political slant to the left or the right. The extension also gives each publisher and author a credibility rating so that readers know whether they can be trusted. Publications get a score from one to five, with one being the highest, and writers are ranked on the credibility of their employer as well as whether they’ve won any prestigious journalism awards. For more information on how Nobias determines political bias, click here.
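Nobias has not published its actual model or lexicon, but to give a rough sense of how keyword-based slant detection works, here is a minimal toy sketch. Every keyword list, weight, and function name below is invented for illustration and bears no relation to what Nobias actually ships:

```python
# Toy sketch of keyword-based political-slant scoring.
# All keyword lists and weights are made up for illustration;
# Nobias's real model is proprietary and far more sophisticated.

LEFT_MARKERS = {"progressive": 1.0, "social justice": 0.9, "climate crisis": 0.8}
RIGHT_MARKERS = {"patriot": 1.0, "traditional values": 0.9, "border security": 0.8}

def slant_score(text: str) -> float:
    """Return a score in [-1, 1]: negative leans left, positive leans right."""
    text = text.lower()
    left = sum(w for kw, w in LEFT_MARKERS.items() if kw in text)
    right = sum(w for kw, w in RIGHT_MARKERS.items() if kw in text)
    total = left + right
    # No markers found: treat the text as neutral.
    return 0.0 if total == 0 else (right - left) / total

print(slant_score("A progressive push for social justice"))  # -1.0 (leans left)
print(slant_score("Neutral report on the weather"))          # 0.0
```

A real system would use a trained classifier over many features rather than a hand-written lexicon, but the basic idea, scoring text against indicative phrases and mapping the result onto a left-right scale, is the same.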


Nobias is currently available for all Chrome users and can be added for free here. The team behind it is working on bringing the extension to other browsers too, including Safari, Firefox, and Opera. With the Democratic primary already heating up and Trump up for re-election next year, now could be a good time to start protecting yourself against the dangers of fake news.

Facebook under pressure to remove anti-vax groups

Why won’t Facebook take action against anti-vax groups? Here are some theories.


Adding to the growing list of Facebook controversies, the company has been accused of promoting anti-vax groups. Anti-vaxxers are people who ignore the scientifically proven benefits of vaccines by consuming or spreading misinformation, intentionally or not. Through Facebook, anti-vaxxers have found their greatest opportunities to thrive and grow.

On Facebook, users can create groups around pretty much any topic they want. The admins and moderators of these groups can choose to make the group open (allowing anyone to join), closed (allowing only approved members to join), or secret (the group can only be found when someone inside the group invites you, and you still must get approved).

In closed and secret groups, aspiring members usually have to fill out a questionnaire before they’re allowed to join. If the moderators don’t like someone’s answers, they can deny them entry to the group. The vast majority of anti-vax groups on Facebook are closed, meaning that if moderators suspect someone intends to enter the group to debate members or spread pro-vaccine content, they’ll simply shut the door in their face. This turns anti-vax groups into echo chambers, where the anti-vax narrative goes unquestioned and is fed with further misinformation.


It should also be noted that many admins of these groups promote alternative medicine that they sell, encouraging users to buy their snake oil instead of using vaccines. This obvious manipulation, for financial gain or otherwise, takes advantage of people who don’t have the facts.

The World Health Organization named “vaccine hesitancy” one of its top 10 health concerns for 2019. Facebook does not seem to care, as its algorithms promote anti-vax groups and pages when the topic of vaccines is searched (this is presumably due to anti-vax groups having more activity, likes, and posts than pro-vaccine groups).


Additionally, Facebook’s targeted ad system sends anti-vax ads to users it identifies as anti-vax, further confirming their biases and keeping them tightly locked in an echo chamber. Facebook has long been criticized for allowing these misleading ads to flourish, as anti-vax ads join other harmful targeted ads that promote things like white supremacy and anti-Semitism. For a company that has vowed since 2016 to remove fake news, bigotry, and intentional misinformation, Facebook has done a remarkably awful job. Controversial content generates a lot of buzz on Facebook, and the company doesn’t appear to care whether that content includes fake information, as it brings more activity to the site, which brings more advertising dollars.

In a statement released last week, Facebook said it is “exploring additional measures to best combat the problem.” Measures mentioned in the statement include “reducing or removing this type of content from recommendations, including Groups You Should Join, and demoting it in search results, while also ensuring that higher quality and more authoritative information is available.” Whether or not these are hollow words remains to be seen, but based on Facebook’s track record with promoting blatantly fake news and misinformation, it’s hard to believe they’ll follow through.