Shared Intel Q&A: Foreign adversaries now using ‘troll factories’ to destroy trust in U.S. elections

By Byron V. Acohido

Foreign adversaries proactively interfering in U.S. presidential elections is nothing new.

It’s well documented how Russian intelligence operatives meddled in the 2016 U.S. presidential election. Ever since, technologists and regulators have been monitoring and developing measures to address election meddling by foreign adversaries, which now happens routinely.

They’re at it again. Russian actors “manufactured and amplified” a recent viral video that falsely showed a person tearing up ballots in Pennsylvania, the FBI and two other federal agencies recently disclosed. The FBI and officials from the Office of the Director of National Intelligence and the Cybersecurity and Infrastructure Security Agency said the U.S. intelligence community made the assessment based on available information and past activities from other Russian influence actors, including videos and disinformation efforts.

Now comes fresh evidence highlighting the nuances of social-media-fueled disinformation in the run-up to the 2024 U.S. presidential election.


Analysts at Los Angeles-based Resecurity have been monitoring a rising wave of troll factories, fake accounts and strategic disinformation clearly aimed at swaying public opinion. This time around, the overall thrust is not so much to champion Donald Trump or smear Kamala Harris as it is to broadly and deeply erode trust in time-honored democratic elections, says Shawn Loveland, Resecurity’s Chief Operating Officer (COO).

Toward this end, fake social media accounts impersonating both Trump and Harris, as well as prominent U.S. institutions, have been springing up and spilling forth outrageous falsehoods, especially on Telegram, the anonymous messaging platform.

Telegram, it turns out, is a social media venue favored by disinformation spreaders. This popular cloud-based messaging app is known for its security features, flexibility and use across global audiences. Telegram’s minimal moderation makes it a haven for privacy-conscious users but also a perfect tool for spreading lies and conspiracy theories.

Last Watchdog engaged Loveland to drill down on what Resecurity’s analysts have been closely tracking. He recounted their observations about how, now more than ever, social media apps have come to serve as “echo chambers,” in which users become isolated within bubbles of half-truths and conspiracy theories that reinforce their biases.

Foreign adversaries are well aware of how echo chambers can be leveraged to manipulate large groups. They’ve seized upon this phenomenon to strategically sway public sentiment in support of their geopolitical aims. Disinformation spread through social media has been part and parcel of election interference around the globe, not just in the U.S., for quite some time now.

Election interference has become impactful enough, Loveland told me, to warrant stricter regulatory guard rails and wider use of advanced detection and deterrence technologies. Greater public awareness would help, of course. Here’s the gist of our exchange about all of this, edited for clarity and length.

LW: Can you frame how the social media ‘echo chamber’ phenomenon evolved?

Loveland: With the decline of traditional media consumption, many voters turn to social media for news and election updates. This shift drives more people to create accounts, particularly as they seek to engage with political content and discussions relevant to the elections.


Foreign adversaries exploit this shift, running influence campaigns to manipulate public opinion. To do that, they leverage accounts with monikers reflecting election sentiments and the names of political opponents to mislead voters. Such activity has been identified not only on social media networks headquartered in the U.S., but also in foreign jurisdictions and alternative digital media channels.

These actors often operate in less moderated environments, leveraging foreign social media platforms and resources that are also read by domestic audiences and whose content can easily be redistributed via mobile messaging and email.

LW: Can you characterize why this is intensifying?

Loveland: Social media can create echo chambers where users are exposed primarily to information that reinforces their existing beliefs. This phenomenon can polarize public opinion, as individuals become less likely to encounter opposing viewpoints.

Such environments can intensify partisan divides and influence voter behavior by solidifying and reinforcing biases. We identified several associated groups promoting the “echo” narrative regardless of each group’s stated profile; for instance, a group ostensibly supporting the Democratic Party contained content of an opposite and discrediting nature.

LW: Can you drill down a bit on recent iterations?

Loveland: We’ve identified several clusters of accounts bearing the hallmarks of a ‘troll factory,’ promoting negative content against U.S. and EU leadership via VK, Russia’s version of Facebook. These posts are written in various languages, including French, Finnish, German, Dutch and Italian. The content is mixed with geopolitical narratives of an antisemitic nature, which appears to violate the network’s own Terms and Conditions.

The accounts remain active and post constant updates, which points to an organized effort to produce this content and keep it available online. In September, the U.S. Department of Justice seized 32 domains tied to a Russian influence campaign, part of a $10 million scheme to create and distribute content with hidden Russian government messaging to U.S. audiences.

LW: Quite a high degree of coordination on the part of the adversaries.

Loveland: These operations are usually well-coordinated, with teams assigned to different tasks such as content creation, social media engagement, and monitoring public reactions. This strategic approach allows them to adapt quickly to changing circumstances and public sentiment. The content is often designed to evoke anger or fear, which can lead to increased sharing and engagement.

Troll factories often create numerous fake social media profiles to amplify their messages and engage with real users. This helps them appear more credible and increases their reach. Workers in these factories produce a variety of content crafted to provoke reactions, spread false narratives, or sow discord among different groups. They typically focus on specific demographics or political groups to maximize their impact. They may even use data analytics to identify vulnerable populations and tailor their messages accordingly.

LW: How difficult has it become to identify and deter these highly coordinated campaigns?

Loveland: Unfortunately, it is not always so obvious. Troll factories tend to push similar messages across multiple accounts. If you notice a coordinated effort to spread the same narrative or hashtags, it may indicate a troll operation. Accounts with a high number of followers but few follow-backs can indicate a bot or troll account, as they often seek to amplify their reach without engaging genuinely.
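
To make that first signal concrete, here is a minimal Python sketch of the kind of check an analyst might run: it flags hashtag combinations pushed by many distinct accounts within a tight time window. The data model, window and threshold are illustrative assumptions made for this article, not Resecurity’s methodology.

```python
from collections import defaultdict
from datetime import timedelta

def find_coordinated_pushes(posts, window=timedelta(hours=1), min_accounts=20):
    """Flag hashtag combinations pushed by many distinct accounts in a tight window.

    `posts` is an iterable of (account_id, timestamp, frozenset_of_hashtags)
    tuples -- a hypothetical shape chosen for this sketch. Organic trends
    rarely show this exact-match, tight-window pattern across otherwise
    unrelated accounts; coordinated campaigns often do.
    """
    by_tags = defaultdict(list)  # hashtag combination -> [(timestamp, account)]
    for account, ts, tags in posts:
        if tags:
            by_tags[tags].append((ts, account))

    flagged = []
    for tags, events in by_tags.items():
        events.sort()  # chronological order
        for i, (start, _) in enumerate(events):
            # Count distinct accounts posting this exact combination
            # within `window` of the i-th post.
            accounts = {acct for ts, acct in events[i:] if ts - start <= window}
            if len(accounts) >= min_accounts:
                flagged.append((sorted(tags), start, len(accounts)))
                break
    return flagged
```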

If the content shared by an account is mostly reposted or lacks originality, it may be part of a troll factory’s strategy to disseminate information without creating authentic engagement. Trolls often target divisive issues to provoke reactions. If an account consistently posts about hot-button topics without a nuanced perspective, it could be a sign of trolling activity.

There are various tools and algorithms designed to detect bot-like behavior and troll accounts. These can analyze patterns in posting frequency, engagement rates, and content similarity to identify potential trolls.
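
As a rough illustration of how such tools might blend those signals, here is a minimal scoring sketch. The field names and cutoffs are invented for this example; real detection systems rely on many more features and trained models rather than hand-set thresholds.

```python
def troll_likelihood(account):
    """Blend simple behavioral heuristics into a rough 0-to-1 score.

    `account` is a hypothetical dict of aggregate stats; every threshold
    below is an illustrative guess, not a calibrated value from any
    real platform or vendor.
    """
    signals = []

    # 1. Lopsided reach: many followers but few follow-backs, i.e.
    #    amplification without genuine two-way engagement.
    followers = account.get("followers", 0)
    following = account.get("following", 0)
    signals.append(min(1.0, followers / max(following, 1) / 100))

    # 2. Inhuman tempo: sustained high posting frequency suggests automation.
    signals.append(min(1.0, account.get("posts_per_day", 0.0) / 100))

    # 3. Low originality: share of posts that are reposts or near-duplicates.
    total = account.get("total_posts", 0)
    if total:
        signals.append(account.get("duplicate_posts", 0) / total)

    return sum(signals) / len(signals)

# A high-volume account that mostly reposts, with a wildly lopsided
# follower-to-following ratio, scores near the top of the range.
print(troll_likelihood({
    "followers": 80_000, "following": 15,
    "posts_per_day": 120.0,
    "total_posts": 3_000, "duplicate_posts": 2_700,
}))
```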

LW: Technologically speaking, is it possible to detect and shut down these accounts in an effective way?

Loveland: With GenAI, troll factories have become much more advanced. Unfortunately, adversaries continue to evolve their tools, tactics and procedures (TTPs): using mobile residential proxies, content-generation algorithms and deepfakes to impersonate real personas, with hostile states even financing media distribution operations inside the United States.

LW: Strategically, why are foreign adversaries trying so hard to sow doubt about democratic elections?

Loveland: One of the foreign adversaries’ critical goals is to sow social polarization and distrust in electoral integrity. This is a crucial component of these campaigns. Often, these campaigns both promote and denigrate each candidate, as the intent is not to favor one over the other. The aim is to undermine trust in the election process and to stoke animosity among the losing candidate’s supporters toward the winner and the winner’s supporters.

LW: No one can put the genie back in the bottle. What should we expect to come next, with respect to deepfakes and AI-driven misinformation, over the next two to five years?

Loveland: Foreign adversaries understand that the immediate goals in election interference cannot be easily achieved, as the U.S. Intelligence Community is working hard to counter this threat proactively. That’s why one of the main long-term goals for foreign adversaries is to create polarization in society and distrust in the electoral system in general, which may impact future generations of voters.

LW: Anything else you’d like to add?

Loveland: Our research highlights the difference between the right of any U.S. person to express an opinion, including satire on political topics, which the First Amendment protects, and the malicious activity of foreign actors, funded by foreign governments, who plant discrediting content and leverage manipulated media to undermine elections and disenfranchise voters.

For example, we’ve identified content cast as political satire that is also antisemitic and supports geopolitical narratives beneficial to foreign states seeking to discredit U.S. foreign policy and elections. All of these postings were made by bots, not real people. The proliferation of deepfakes and similar content planted by foreign actors poses challenges to the functioning of democracies. Such communications can deprive the public of the accurate information it needs to make informed decisions in elections.


Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.


(LW provides consulting services to the vendors we cover.)

November 1st, 2024

