Facebook estimates that duplicate accounts may have represented about 10% of its monthly active users (MAUs) globally for the quarter ended December 31, 2017 (Q4 2017). In other words, roughly 213 million of the 2.13 billion MAUs Facebook reported at the end of Q4 2017 may have been duplicate accounts. False accounts, meanwhile, accounted for an estimated 3–4% of MAUs, or up to about 85 million accounts.
It’s worth noting that Facebook says that the percentage of duplicate accounts is significantly higher in developing countries such as India, Indonesia and the Philippines, whereas false accounts originate more frequently from countries such as Indonesia, Turkey and Vietnam.
What is a duplicate or false account?
A duplicate account is one that a user maintains in addition to his or her principal account. Facebook divides false accounts into two categories: (1) user-misclassified accounts, where users have created personal profiles for a business, organization, or non-human entity such as a pet (under Facebook’s terms of service, such entities are permitted on the platform using a Page rather than a personal profile); and (2) undesirable accounts, which are user profiles that Facebook determines are intended for purposes that violate its terms of service, such as spamming.
How does Facebook identify duplicate and false accounts?
- Duplicate accounts are identified using data signals such as IP addresses and user names.
- False accounts are identified by looking “for names that appear to be fake or other behavior that appears inauthentic to the reviewers.”
Facebook notes that at the scale the platform currently operates, identifying duplicate accounts in particular is very difficult.
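Facebook does not disclose its exact detection method, but the signal-based approach described above can be sketched as grouping accounts that share overlapping signals. The following is a toy illustration only; the record fields, normalization rule, and data are hypothetical assumptions, not Facebook’s actual schema or logic:

```python
from collections import defaultdict

# Hypothetical account records; field names are illustrative assumptions.
accounts = [
    {"id": 1, "name": "Jane Doe",  "ip": "203.0.113.5"},
    {"id": 2, "name": "jane.doe",  "ip": "203.0.113.5"},
    {"id": 3, "name": "Sam Smith", "ip": "198.51.100.7"},
]

def normalize(name):
    """Crude name normalization: lowercase and strip non-letter characters."""
    return "".join(ch for ch in name.lower() if ch.isalpha())

def possible_duplicates(accounts):
    """Flag groups of accounts sharing both an IP address and a normalized name."""
    groups = defaultdict(list)
    for acct in accounts:
        groups[(acct["ip"], normalize(acct["name"]))].append(acct["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

print(possible_duplicates(accounts))  # → [[1, 2]]
```

In practice this kind of matching is far harder at Facebook’s scale: shared IPs (households, carrier NATs) and similar names produce many false positives, which is consistent with Facebook’s caveat above.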
During Facebook’s Q3 2017 earnings call, CEO Mark Zuckerberg said that the company was building new artificial intelligence (AI) tools to take down “bad content and bad actors,” with a focus on fake accounts.
Countering Fake News
During the Q4 2017 earnings call, Zuckerberg said that countering fake news was another key area of focus for Facebook. He mentioned that Facebook “now has around 14,000 people working across community ops, online ops, and our security efforts” to identify such content, and added that the company has “made progress demoting false news in News Feed, which typically reduces an article’s traffic by 80% and destroys the economic incentives that most spammers and troll farms have to generate these false articles in the first place.”
Download: Annual Report