Election campaign, Russian style. Political innuendo and Clausewitzian frictions. Algorithms and editors.
At a glance.
- Voter suppression, Russian style (and Western complicity in the matter).
- Domestic political spin.
- Algorithms designed to reward engagement don’t place much importance on truth.
- Separate editorial decisions from engagement?
Election campaign and voter suppression, Russian style.
Russia held Duma elections last weekend, and President Putin's United Russia party retained its comfortable majority. The effective leader of the opposition to President Putin, Alexei Navalny, is in jail on a variety of charges ranging from fraud to extremism (outside observers generally consider the charges trumped up), and the Russian government has no interest in seeing the opposition maintain an online presence either.
Setting a precedent that WIRED called “disturbing,” Apple and Google acceded to the Kremlin’s request to remove opposition “voting apps” prepared by Navalny’s Smart Voting project from their stores. The app in question was a voting guide, not a voting mechanism. “Created by associates of imprisoned opposition leader Alexei Navalny, it offered recommendations in each of Russia’s 225 constituencies for candidates with the best chance of defeating the dominant United Russia party in each race.”
Radio Free Europe reported that Telegram did the same, blocking chat bots that Smart Voting had used to endorse candidates. Telegram said it was following Russia’s “electoral silence” laws, which it portrayed as similar to laws in other countries that restrict various forms of campaigning while voting is underway. (Here in Maryland, for example, it is illegal to buttonhole people in line at polling stations. If you want to talk to them or hand them a flyer, you have to do it outside the parking lot or at a comparable distance.) But, according to Radio Free Europe, the founder of Telegram pointedly said that developer teams like his had little choice but to follow the lead of Apple and Google.
The Atlantic Council sums it up as follows:
“The Russian government reacted to this voter guide as if it faced a serious threat to national security, a reaction that has sparked international controversy. The furious (and ultimately successful) efforts to suppress this voter guide not only demonstrate the determination of the Russian government to exercise extensive control over both Russian election results and the information Russian citizens can access online, but also how the underlying dynamics of Russia’s censorship program can become an international problem, forcing companies based outside its borders to participate in internal repression.”
Investigating misinformation in US elections.
That Russian intelligence services engaged in attempts to manipulate US elections (or, perhaps, to sow mistrust along American social fault lines to the point that Russia’s principal adversary would see its civil society significantly weakened) is not seriously in question. The details of those influence operations remain controversial. And it should come as no surprise that domestic political operators opportunistically profited from the antics of Cozy Bear and Fancy Bear.
Special Counsel John Durham, charged with investigating potential FBI misconduct in the 2016 election, secured the indictment of Michael Sussmann, a former federal prosecutor then working at the Democratic Party-linked law firm Perkins Coie, who presented information to the FBI alleging links between then-candidate Trump and a Russian bank, Alfa Bank. The indictment alleges that Mr. Sussmann lied to the FBI when he “falsely stated that he was not acting on behalf of any client,” which led the Bureau to understand that he was passing on the allegations as a concerned citizen and not as an advocate for any client. The indictment says Mr. Sussmann billed the Clinton campaign for the time he spent researching the matter. He now faces a federal charge of making a false statement. Mr. Sussmann’s lawyer said he was confident his client would be vindicated.
One of the attractions of running an influence campaign whose negative goal is to increase the adversary’s friction is the readiness with which the adversary will spontaneously cooperate in its own confusion.
ABC News describes Los Angeles County’s efforts to control misinformation and disinformation during the recent vote on whether to recall California Governor Gavin Newsom. This recall attempt failed and the governor remained in office. It will be interesting to see what lessons can be learned from the Los Angeles County experience. Their approach seems to have been traditional: reactive rumor control, adapted to social media.
The fault is not in our stars, but in ourselves (or at least in our algorithms).
The MIT Technology Review reports that Facebook’s engagement-maximizing algorithms automatically pushed inflammatory, often bogus, troll-farm content into US users’ news feeds during the 2020 election season, reaching as many as 140 million people a month. An internal Facebook study concluded: “Instead of users choosing to receive content from these actors, it is our platform that is choosing to give [these troll farms] an enormous reach.” The social network has sought to put “safeguards” in place to keep content from straying too far from some approximation of truth and normality, and it has continued its work against coordinated inauthenticity, but its own algorithms worked against it.
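To see why engagement-maximizing ranking rewards troll-farm content regardless of accuracy, consider a deliberately simplified sketch. This is not Facebook's actual system; the posts, weights, and fields below are invented for illustration. The point is structural: the scoring function has terms for predicted clicks, shares, and comments, and no term at all for truth.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float
    predicted_shares: float
    predicted_comments: float
    is_true: bool  # known to us for the example; invisible to the ranker

def engagement_score(post: Post) -> float:
    # Weights are invented for illustration. What matters is what is
    # absent: no term rewards accuracy, so an inflammatory falsehood
    # that drives shares and comments outranks a sober, accurate post.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_shares
            + 2.0 * post.predicted_comments)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Rank purely by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured election explainer", 5.0, 0.5, 1.0, True),
    Post("Outrage bait from a troll farm", 4.0, 6.0, 8.0, False),
])
# The false but highly shareable post lands at the top of the feed.
```

Any real ranking system is vastly more complicated, but "safeguards" bolted onto a pipeline like this are fighting the objective function itself, which is the dynamic the internal study describes.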
“This is not normal. This is not healthy,” wrote the report’s lead author, Jeff Allen. “We have allowed inauthentic actors to accumulate huge followings for largely unknown purposes … The fact that actors with possible ties to the IRA have access to huge audience numbers in the same demographic groups targeted by the IRA poses an enormous risk to the US 2020 elections.” The troll farms, mostly Russian, followed the targeting lead of the St. Petersburg-based Internet Research Agency (IRA), chiefly working “Christians, Blacks, and Native Americans.”
Social networks, news and disinformation.
A Pew Research Center study finds that 48% of American adults “often” or “sometimes” get their news from social media. That sounds high, but it is actually about a five-point drop from last year’s results. Many have called for government regulation of social media in an attempt to curb the misinformation that clutters those platforms (and to hell with the First Amendment, which was just an 18th-century document anyway, etc.). WIRED suggests a different approach: separating editorial decisions from business decisions, which its essayist plausibly argues was the self-regulatory move most responsible for transforming an openly partisan press into one with a general reputation for trustworthy objectivity. Such standards have not yet caught on online, in part because engagement, and how it is driven, is not fully understood. But if optimization algorithms could be kept away from shaping or presenting content, it could be a step in a positive direction. Every William Randolph Hearst deserves an Ambrose Bierce.