Meta: Pro-Chinese influence operation was the largest in history
Meta said on Tuesday that it had taken down what company officials describe as the largest ever “cross-platform covert influence operation in the world,” a campaign featuring thousands of accounts that pushed pro-Chinese messages across Facebook, Instagram and other online platforms.
The operation targeted audiences in Taiwan, the United States, Australia, the United Kingdom, Japan and the global Chinese-speaking population with messages on more than 50 platforms, ranging from the platform formerly known as Twitter to YouTube, TikTok, Medium and Meta-owned platforms.
Like other pro-Chinese information operations exposed in recent years, it failed to generate what Meta called “substantial engagement among authentic communities” in the audiences it targeted. Executives at Meta said the operation built on and shared technical infrastructure with previous pro-Chinese propaganda efforts, but said they could not definitively attribute it to Chinese state agencies.
“This operation is large, prolific and persistent,” Ben Nimmo, Meta’s global threat intelligence lead, told reporters during a Monday briefing. “We expect them to keep on trying.”
The Chinese embassy in Washington, D.C., did not respond to a request for comment.
The operation disrupted by Meta — which the firm described in its quarterly Adversarial Threat Report published on Tuesday — featured posts and online comments containing pro-Chinese government themes, anti-U.S. messaging and attacks on critics of the Chinese government, including journalists and researchers. The content appeared in a variety of languages, though mostly Chinese and English.
The campaign’s operators repeatedly shared identical content across several internet platforms, and their mistakes hint at the level of organization behind the effort. “Occasionally, fake accounts would post a comment together with what appears to have been a serial number, suggesting that it may have been copy-pasted from a numbered list,” Meta’s researchers wrote.
In total, the operation comprised 7,704 Facebook accounts, 954 Pages, 15 Groups and 15 Instagram accounts. Roughly 560,000 accounts followed one or more of the Pages, but fewer than 10 accounts joined any of the Groups and only about 870 accounts followed one or more of the Instagram accounts, according to the researchers.
Meta found indications that the operation relied on common technical infrastructure and that the people contracted to run it “appear to have been centrally provisioned with internet access and content directions.” The operators posted at times consistent with the Chinese workday, and Meta uncovered links to “individuals associated with Chinese law enforcement” in the course of its investigation.
Company officials declined to elaborate on those links to Chinese law enforcement during a call with reporters ahead of the report’s release.
Meta researchers described the operation revealed on Tuesday as an extension of a pro-Chinese influence operation known as “Spamouflage” that dates to 2019.
“While this network’s activity on our platform mainly consisted of spammy sharing of links, in addition to memes and text posts, our investigation identified notable distinctive errors, behavioral patterns and operational structure that allowed us to connect it to a number of more complex and long-running large clusters of activity across the internet,” the researchers said.
In 2022, Google said it disrupted more than 50,000 “instances” of activity linked to the Spamouflage operation, which it calls “Dragonbridge.” Last year, Mandiant researchers said the Dragonbridge operation attempted to influence the 2022 U.S. midterm elections and spark protests in the United States against an Australian mining company’s American expansion plans.
The campaign has evolved since first emerging in 2019 and shows signs of emulating a suspected Russian influence operation known as “Secondary Infektion.” Both operations posted content to smaller platforms before attempting to share those links on larger platforms, and until the Spamouflage activity detailed on Tuesday, no campaign tracked by Meta had involved more platforms than Secondary Infektion.
Both operations featured messages in a range of languages and relied on obscure sites where false claims would first be posted, then amplified on a larger platform, and then commented on and shared by fake personas on yet more platforms. Spamouflage posted material to the same obscure forums that Secondary Infektion used and that have not been used by other influence operations, Meta said.
There’s no indication of any coordination between the operators of the two influence operations; it is more likely that Spamouflage is simply imitating its Russian predecessor. “Emulation is a good way to describe it,” Nimmo said.
Elsewhere, Meta researchers also said on Tuesday that Doppelganger, a pro-Russian disinformation operation that emerged in the wake of Russia’s invasion of Ukraine, continues to operate, including by running a network of sites spoofing legitimate news organizations. That operation is “the largest and most aggressively persistent covert influence operation from Russia that we’ve seen since 2017,” Meta said in its report.