Meta Removes 63,000 Scam Accounts Linked to “Yahoo Boys”

Meta Platforms Inc., the parent company of Facebook, Instagram, and WhatsApp, has taken down 63,000 accounts connected to the notorious “Yahoo Boys” scam group, according to its Q1 2024 Adversarial Threat Report. These accounts were involved in financial sextortion scams and distributing blackmail scripts.

A smaller network of 2,500 accounts, linked to about 20 individuals, primarily targeted adult men in the United States using fake identities. Meta identified and disabled these accounts using a combination of technical signals and in-depth investigations, which also helped improve its automated detection systems.

“Financial sextortion is a borderless crime, fueled in recent years by the increased activity of Yahoo Boys, loosely organized cybercriminals operating largely out of Nigeria that specialize in different types of scams,” Meta stated. The company also removed Facebook accounts, pages, and groups run by Yahoo Boys, who were attempting to organize, recruit, and train new scammers.

During the investigation, Meta found that while most scam attempts were unsuccessful, some targeted minors; Meta reported those attempts to the National Center for Missing and Exploited Children. The company also shared information with other tech companies via the Tech Coalition's Lantern program to help curb these scams across platforms.

In addition to the removed accounts, Meta took down around 7,200 assets in Nigeria, including 1,300 Facebook accounts, 200 pages, and 5,700 groups offering scam-related resources. These assets provided scripts, guides for scams, and links to photo collections for creating fake accounts.

Meta's systems continue to block these groups' attempts to return, and the company keeps refining its detection capabilities. It also works closely with law enforcement, supporting investigations and prosecutions and alerting authorities to imminent threats.

Beyond account removal, Meta supports initiatives like Project Boost, which trains law enforcement worldwide in processing and acting on reports from the National Center for Missing and Exploited Children. Meta has conducted several training sessions, including in Nigeria and Côte d'Ivoire.

To protect users, especially teens, Meta has implemented stricter messaging settings for users under 16 (under 18 in certain countries) and displays safety notices to encourage cautious online behavior.

In related news, Meta was fined $220 million by Nigeria's Federal Competition and Consumer Protection Commission (FCCPC) for multiple violations of data protection laws linked to WhatsApp. The investigation found that Meta's privacy policies infringed on users' rights, citing unauthorized data sharing and discriminatory practices. Meta plans to appeal the decision, saying it disagrees with both the findings and the penalty. The FCCPC says its aim is to ensure fair treatment of Nigerian users and compliance with local regulations.
