#Russian disinformation campaign
[Dave Whamond]
* * * *
The Associated Press ran a story yesterday underscoring a facet of the presidential election that’s familiar to regular readers of the Editorial Board. While Donald Trump is popular with the Republican base, and while his nomination is all but assured, his weakness is increasingly on display.
The former president has galvanized support in Iowa, New Hampshire and South Carolina. Followers are “overwhelmingly white,” over 50 and without college degrees. But this, the AP said, is very different from the US electorate. “He’d have to appeal to a far more diverse group and possibly win over supporters of former UN Ambassador Nikki Haley.”
Not only must he expand the base, the base itself has gotten smaller. The AP: “A large portion of Trump’s opposition within the Republican primaries is comprised of voters who abandoned him before this year. … At least 2 in 10 of the voters in South Carolina’s Republican primary and the Iowa caucuses said they won’t back Trump in November, while approximately 3 in 10 in New Hampshire felt that way” (my italics).
But the more he accepts Russian aid, the more Russian aid will change the Republican Party, making it unrecognizable to actual principled conservatives and “that female independent suburban voter,” who are now leaving the party and making their way toward the Democrats.
Finally, there are the suburbs, “where the plurality of general election voters live,” the AP said. Voters there were not “particularly welcoming to him in this year’s GOP contests. He split the suburban vote with his opponents in Iowa and New Hampshire and won the suburbs in South Carolina by a smaller margin than in the state as a whole.”
This double trouble – the need for expanding the base even as it’s shrinking – has been understood among Republicans who are good at politics. (Trump is bad at politics.) This is why there’s a “growing chorus” of allies urging him to focus less on “grievances” and more on issues, or “hitting” Biden, to bring back GOP voters who’ve strayed.
But either he isn’t listening or he can’t help himself. (Actually, it’s both). Recently, Trump sat with Fox host Bret Baier, who asked: “What do you say to that female independent suburban voter to win her back?” Trump: “First of all, I won in 2020 by a lot. Let’s get that straight.”
While that clearly works with most Republicans, it doesn’t work with all, because, to these holdouts, it sounds like he’s “talking about being a victim,” as Haley said last week. “At no point has he ever talked about the American people. All he’s doing is talking about himself. And that’s the problem — it’s not about him. It’s about the American people.”
But even if Trump were to stick with campaign issues, like the “crisis at the border,” there’s a problem. He doesn’t care about them strongly enough to remain focused on them. He cares about defeating his enemies, to be sure, especially Joe Biden, whose administration is “victimizing” him. But actual problems? No. He doesn’t care. Problems are for exploiting to gain power to liquidate enemies, not for solving.
Republicans who are good at politics, and who are urging him to “pivot” to the general, also know who their party’s presumptive nominee is. He can’t adjust, won’t adjust. He’s running the same campaign in 2024 that he ran in 2016, only it’s a thousand times more dictatorial. Even when coaxed oh-so-gently into adjusting, Trump reverts to form. During a recent interview, when a question came up about government secrets found at Mar-a-Lago, Fox’s Sean Hannity suggested that he wouldn’t really commit crimes, would he? In essence, Trump said, “Sure, I would.”
All the above is why Trump needs the Russians. His base is shrinking. He’s alienating “that female independent suburban voter.” He can’t adjust, won’t adjust. He doesn’t care about problems enough to solve them. All he can do – the one thing he’s good at – is hold steady while the Russians, in coordination with the Republicans, attack Joe Biden, smear him, wear his people down, “flood the zone with shit,” as adviser Steve Bannon once said, in a nihilistic act of “voter suppression.”
Indeed, as MSNBC’s Rachel Maddow pointed out recently, 2024 will be the third election cycle in a row in which Vladimir Putin’s saboteurs are once again attacking American sovereignty and interfering with our democratic decision-making. Meanwhile, GOP leaders will either stand by and watch it happen (as Senate Majority Leader Mitch McConnell did in 2016) or participate (as the House Republicans are now by pursuing an impeachment inquiry against the president, or, as Joe Conason put it, by pursuing “a bogus case invented in Moscow”).
[John Stoehr]
#The Editorial Board#John Stoehr#election 2024#Dave Whamond#editorial cartoon#Russian disinformation campaign#TFG
You know, it's interesting to me that I saw an article as I was scrolling through my dash this morning that (supposedly) blames the U.S. for being deeply involved in a genocide in Sudan. You might think from such a description that we'd be talking about U.S. military aid or boots on the ground or the CIA or something like that, and not just the Trump administration tanking our diplomatic efforts and Biden's administration not making the best decisions to right the ship. You might also think that such an article would not include a section like this:
David Satterfield, who replaced Feltman as US special envoy to the Horn of Africa and who has since resigned, said that Washington did not have anything but bad choices in Sudan, and therefore had to strike deals with the Sudanese military. According to Satterfield, “If there is ever an opportunity to return to a path towards restoration of a civilian-led government, you’re going to have to talk to the military then as well.”
You also might not think that such an article would outright reference Russian involvement in Sudan, which it does.
Russia believes that its strong presence in Sudan will augment its status in Africa and the Middle East, which is considered an American redoubt. Since 2014, and with Moscow’s aspirations to exploit African mineral riches, the Kremlin has strengthened its ties to Sudan in order to ameliorate western sanctions following its invasion of Crimea, sanctions that became even harsher after its invasion of Ukraine in February 2022. In 2017, former Sudanese President al-Bashir visited Russia and met with Russian President Vladimir Putin. The two countries agreed to establish a holding company run by the paramilitary Wagner Group to mine gold ore. Russia also signed a 25-year lease in December 2020 to build a military base at Port Sudan on the Red Sea that can receive nuclear-powered ships. It was also interesting that Hemedti headed an official delegation to Moscow on the eve of the Russian invasion of Ukraine.
And while we're talking about Russian involvement in Sudan, which is why I'm here in the first place, it's really really really interesting to me that this article was phrased as proof that the U.S. was heavily involved in genocide in Sudan, despite the fact that the Russian Wagner group (accused of war crimes in Ukraine) has been providing missiles and military training to the Sudanese paramilitary group RSF while smuggling gold out of Sudan to fund their own activities in Ukraine. Fun fact about the Wagner group: They're also heavily involved in social media misinformation campaigns.
Wasn't there a Russian misinformation campaign on tumblr leading up to the Presidential elections in 2016?
And despite the "mysterious" death of Yevgeny Prigozhin in a plane crash (a short time after his aborted march on Moscow), Russia is still working on bringing the Wagner organization back under their control. Because, you know, they still have that whole invasion of Ukraine they're working on. An invasion of Ukraine that would sure be a whole lot easier for them if they could convince Americans to stop providing military support to Ukraine. They're already doing pretty nicely with the Republican party, but the Democrats (and the American left in general) have been harder to get on-side.
It does kind of feel like tying American military involvement in other countries to active genocide would be a great way to discourage people on the American left from supporting continued involvement in Ukraine, wouldn't it?
We're slightly less than a year away from the next American presidential election. There is no reason to believe that the Russian propaganda machine, which has already been operating at full blast since Russia's full-scale invasion of Ukraine in February of 2022, is going to slow down. This quote from the linked article is particularly chilling:
A particular challenge is that people tend to spread falsehoods “farther, faster, deeper, and more broadly than the truth”; this is particularly the case for false political news (Vosoughi, Roy and Aral, 2018[7]). For example, one study found that tweets containing false information were 70% more likely to be retweeted than accurate tweets (Brown, 2020[8]). Another study found that false information on Facebook attracts six times more engagement than factual posts (Edelson, 2021[9]). In addition, feedback loops between the platforms and traditional media can serve to further amplify disinformation, magnifying the risk that disinformation can be used to deliberately influence public conversations, as well as confuse and discourage the public.
I think it's important to remember, especially now, that we are capable of spreading misinformation. The article about U.S. involvement in Sudan wasn't placed there by an algorithm. This is fucking tumblr. That was one of my mutuals. Because they're concerned about American military intervention and they're against genocide and it sounded bad and they were upset and they didn't think to read the article. Because they didn't spend the time of Prigozhin's march on Moscow mainlining information on the Wagner group the way that I did, so they didn't go "Hey, Sudan? Wait a minute --" the way I did. Because misinformation that isn't targeted at your group is designed to be easy to spot, so you'll think that the misinformation that is targeted at your group will also be easy to spot, and it fucking isn't.
Because this culture of "If you care, you'll share" has gotten people to click that reblog button without thinking twice about it.
Don't keep falling for it. You don't have to spend an hour digging up sources and pulling out quotes for a ten-note post the way that I did. I'm like this as a human. It's fine if you're not. But if you're not even going to click the link to read the article and actually read it critically (or if there's no sources at all except a twitter screenshot, which I've also seen quite a bit of), then don't reblog it. Save it as a draft for when you have time to do the research, or just don't do anything with it at all. You're not obligated.
And if you have the relevant background to spot the disinfo, I mean -- again, look, you're not obligated to take that hour and search those sources. Even I don't do this all the time. It's hard, it's frustrating, and it will not spread the way the disinfo does. I'm gonna see that genocide post like five times at least on my dash, and I'm probably going to see it at least once from someone who has at least liked this post (if not reblogged it as well). But if you can. If you have the energy and the time. Try to put a little info out there. It might help someone.
That's all. Be good. Be skeptical.
#disinformation#misinformation#russia#wagner group#sudan#ukraine#like i don't have mad love for the us military industrial complex either#but that doesn't mean i'm going to blindly support russian misinfo campaigns designed at distracting from their own atrocities#we can suck and they can suck we are both capable of sucking#but the us military is under the control of biden and russia's is under the control of putin and they're honestly not comparable right now#like it's not even close#so try not to fall for it#we all fail at that sometimes but try
I'm going to be real, I don't have any doubts that there are trans people on the Tumblr/Automattic staff who sincerely want to improve circumstances for trans users on this site, but I'm really not going to care about apologetic gestures until there is material improvement. I won't trust vague assurances of improving anti-harassment measures until I see it. It's not like any of these patterns of abusive moderation against the most vulnerable demographics are new.
#Sorry I understand corpo speak too and the comments on that post are VERY generous in their interpretation of it lol#They would probably have to say something either way given the scale of this particular wave of outrage and AGAIN I fully believe#there's sincerity involved and it IS very unusual to so transparently condemn a CEOs behavior but#Yeah this is crisis management bottom line and I'll believe in management as a whole's commitment to ideals when I see it#Also the lack of acknowledgment that this particular wave is disproportionately affecting transfem users is at least a little weird#And again there's such a long fucking history of this#I don't remember a statement like this being made after the outcry about the Ferguson tag being blocked and waves of black users being#banned for no reason (or because they were labeled as part of a 'Russian disinformation campaign' with zero evidence)
National Unity Day
Monterey, California, 4 November 2024. Photo: The Russian Reader

“I’m worried about the left’s demonization of America’s origins and the future of Western civilization, as many conservatives feel that the basic tenets of society as we’ve known it are under attack.”

Source: Scott Jennings, “Opinion: Why I’m voting for Donald Trump,” Los Angeles Times, 1 November 2024

Carolina Performing Arts,…
#2024 US presidential election#Kharkiv Human Rights Protection Group#National Unity Day (November 4)#New York Times#News from Ukraine Bulletin#Omar (opera)#PS Lab (Public Sociology Laboratory)#Rhiannon Giddens#Russian disinformation campaigns#Russian election interference#Russian invasion of Ukraine#WIRED
do you unironically believe russian psyops are a real thing. be serious.
And to be clear: Every government engages in psyops. Psyops is literally a term that comes from our own government's actions.
Things going on behind the curtains...
#ukraine#successful attacks on the russian rear#russians resorting to massive disinformation campaign
Ukraine Donation Guide Master Post
(Ver. 2 updated Aug 13th, 2024) I will be reformatting this and adding more in the future when I have time.
Also a quick note: all of the groups I have found through Twitter have been around long enough to be vetted by each other and the brigades they work with. In fact, a lot of these groups collaborate with each other too. Those in the fight for Ukraine have been diligent in calling out grifters. Word spreads around quickly if an organization doesn't show up with what they promised. They also use their social media (often Twitter) as a means of transparency for their work.
Remember: When considering whether to donate, always use your best judgment; if a group's listed work does not meet your standards, donate to those you trust instead.
Multi-Purpose
United 24 has various fundraisers dedicated to defense and drones, medical aid, rebuilding Ukraine, humanitarian demining, and science and education. You can pick which one you want to contribute to under their various projects.
Liberty Ukraine uses funds for humanitarian aid, medical supplies, protective gear and equipment, and rehabilitation therapy. You can choose which campaign of theirs to donate to.
Come Back Alive is a charitable foundation that supports Ukraine's military with competent assistance while also focusing on security and defense. They also have projects that use sports to help veterans rehabilitate. You can choose which campaign to donate to.
Serhiy Prytula Charity Foundation works to help both civilians and Ukraine's army. You can choose to donate to an active project or any of their general campaigns. Civilian aid campaigns cover temporary housing, supporting crisis and emergency responses, schools, demining, and healthcare. Military aid campaigns cover drones, optics units, communications equipment, and support of air defense teams.
Food Aid
World Central Kitchen works with local partners wherever they are providing food aid. They make sure meals and meal kits are what the local population eats. Even though there is no separate fundraising campaign for Ukraine (that I can see), they still do great work.
Animal Rescue
Hachiko Foundation works to help displaced pets and strays in frontline areas. They help with veterinary care, outdoor shelters, setting up feeding stations, and rehoming animals.
Medical Aid
Hospitallers (Website) is a volunteer organization of paramedics that was founded in 2014. They evacuate the wounded, provide medical aid on the frontlines, assist in rehabilitation, and transfer of the deceased to burial sites. They are also supported by Ukraine Charity. Visit Hospitallers' website to see how many they have evacuated, different methods you can donate, and more information about them.
Other
Saint Javelin (Twitter; Website) is a great place to get apparel, gear, and other cool loot to show your support for Ukraine. They don't take donations, but instead raise funds through their shop with a portion of their sales going towards humanitarian aid and critical items needed by the defenders (generators, pick-up trucks, medical supplies etc). Part of their shop has items made in Ukraine to support Ukrainian businesses. Overall, their products are high-quality. I include them due to their impactful presence in the Twitter community I follow and how they make Ukraine visible in an alternative way. Consider buying someone a gift from their shop.
The Kyiv Independent (Twitter; Website) is a great English-language resource for news about Ukraine. I include them because I think supporting good journalism is incredibly important, especially now when the information space is fraught with Russian propaganda, misinformation, and disinformation. My followers have probably noticed I've pulled a lot of quotes from their stories in an effort to amplify Ukrainian voices and experiences. Look on their website for more information on different ways to support them, such as their Patreon.
---
If you're on Twitter, there are a number of groups and people that fundraise for Ukraine and for specific units fighting on the frontlines. If there is no official website, a PayPal for donations is listed in their profiles. When considering whether to donate, always use your best judgment; if a group's listed work does not meet your standards, donate to those you trust instead.
@/Teoyaomiquu almost always has a fundraiser for Liberty Ukraine with a specified purpose. At the time of writing this, he is currently raising funds for engineering equipment such as excavators. One such excavator is already in Kursk. Follow him to stay up to date with what he's fundraising for.
Dzyga's Paw (Twitter: @/dzygaspaw) is a smaller group that has recently raised funds for Starlinks, drones, batteries, and EcoFlow generators. You can look at the fundraising campaigns they currently have on their website.
@/DefactoHumanity represents and founded Planet of the People with their website U(a)nited for Freedom. She frequently posts updates about their fundraisers and what their partners need. They are known for providing frontline medical aid supplies, protective equipment and other military aid, technical equipment (Starlinks, drones, scopes, etc.), and infrastructure equipment (generators, vehicles, power stations, etc.). They even have a merch store of the battalions they partner with if that's your jam. Here is their link tree if you wish to explore more. And in case you're curious, there is an article about the founder here.
@/wilendhornets (Website) specialize in making high quality drones that have gotten a lot of praise from Ukraine's army. They have attracted a lot of media attention too. Check out their website for the list of articles that have been written about them. Their Twitter is very active with strike footage.
Ants Kitchen Hub (@/ants_kyiv) is a volunteer kitchen that makes dry rations for the Ukrainian army. They are more active on their other social media. To learn more about them, check out their link tree.
@/frontlinekit (Front Line Kitchen) is represented by Richard Woodruff. Originally they made shelf stable food for the Ukrainian army, but now their fundraising has branched out to other campaigns such as raising funds for medical supplies and drones. They are a well known group that many battalions have come to for help.
@/bekamaciorowski (Rebekah Maciorowski) is a combat medic and nurse who helps provide medical care to soldiers and civilians at the frontlines. She raises funds for medical supplies and other equipment, but also helps train soldiers in first aid. More of her social media that features her work can be found in her link tree.
@/UkraineAidOps (Website) is another organization battalions frequently go to for help. They fundraise for all sorts of equipment from medical supplies to drones. If you're interested, they also have a shop with patches from different brigades and flags signed by soldiers. Their shop also includes a separate section called the Victory Gallery where artifacts from the war are turned into art. This includes shells that are painted on, scrap metal from downed enemy planes are turned into keychains, and pieces of a rocket are turned into lamps.
Chris Garrett is the co-founder of Prevail. His organization deals with humanitarian demining as well as training for trauma care, training of bomb disposal, and education to the public. Prevail works with local agencies in Ukraine as well as the army.
Project Konstantin (Twitter; Website; Linktree) is still going strong after the death of their founder, British paramedic Peter Fouché. His digital ghost can be found here. They collaborate with the military, thus giving them an insight into what is dearly needed. They often raise funds for starlinks, personalized first aid kits (IFAKs), generators, portable power stations, and other nonlethal military equipment. I regret forgetting them the first time this post went around. Visit their website to see everything they have done and more. It has more information on what and how they do it than this post can cover.
One Team One Fight (Twitter; Website; Linktree) has some of the original members that worked for Ukraine Aid Ops. They formed their own group after differences with the previous one, and are still helping Ukraine. They are very visible on various social media showing what they have accomplished in their deliveries to various brigades. They're another group that seeks to bring starlinks, drones, medical supplies and protective gear to the battalions that come to them for help. Check out their website for more information on their current fundraisers, their achievements, and received recognition.
NAFO 69th Sniffing Brigade (Twitter; Website) is another small group that focuses its funds on delivering drones, generators, and vehicles, and on saving the occasional furry companion. They are very diligent in their updates for their fundraising campaigns. Check out their website for more information and the articles written about them.
Postmaster General Boomer (Twitter; Website) focuses on humanitarian aid, animal aid, and logistics. Boomer is the beloved pet of one of the founders and the secret boss/mascot. They have many transparency reports and are diligent in reporting the various "tours" they do in getting supplies where they are needed to go. They are based in Germany but have built up many connections during their existence. They have also worked closely with Ukraine Aid Ops.
--
I am sure I have forgotten some, so please reply or comment with any more I should add to this master post. I will edit and update as I see and evaluate more.
Last updated: Aug. 13th, 2024
Version updates listed below
August 13th, 2024 Added:
Hospitallers
Saint Javelin
The Kyiv Independent
Project Konstantin
1 Team 1 Fight
NAFO 69th Sniffing Brigade
Post Master General Boomer
Honestly, between the Republicans' deliberate Russian-disinformation smear campaigns, and the Online Leftists' deliberate just-flat-out-stupid smear campaigns, it makes me want to get a big Biden 2024 flag and wave it overhead like a Trump Cultist and yell about how he's the greatest president ever. Because I want to be nuanced and mature and reasonable about his record, what he's done well and what remains to be improved, and otherwise treat it like a sensible adult human historian person, but my god, y'all make it hard.
#hilary for ts#politics for ts#you can tell that biden is being effective and doing good things#because the republicans and the russians want so badly to stop him (and are working together to try it)#and the online leftists keep changing what they need to whine and lie about#oy fucking vey
Why americans always think is the russians intervening and not their own country ? Like you know the cia is known to intervene in elections across the world right.
Also the main social medias are all own by right wing billionaires like elon musk.
But nahh it must be the evil russians!
Because Russian bots are a real thing, maybe?????????????? Because Russia pushed Jill Stein to split the vote???????????? Because we know damn good and well that the Russian government does, in fact, do this shit?
#answered#politics#uspol#us politics#american politics#election 2024#2024 elections#voting#russian interference#voter interference#election interference#bots#psyops#jill stein
The fact is that the Secret Service did not do its job and has no answers, and the FBI has no answers; only the local police and the people who were there have answers. We have seen the video; we know what happened. Let's say it, let's just say it out loud. The Democrats could not take out Trump in 2016, so they lied and cheated and stole to derail his campaign and administration. They've gone after him personally since then. Now they know that once he gets into office in 2025 he will personally helm investigations into all of this and into people like Barack Obama, Joe Biden, the former director of the FBI, and the 51 intelligence officers who swore that Hunter's laptop was Russian disinformation. Trump's going to go after everyone, and we as Americans will be cheering him on.
#trump#trump 2024#president trump#ivanka#repost#america first#americans first#america#donald trump#democrats
This is a WIRED article from 2018 about Russia's propaganda/disinformation campaign on Tumblr.
The thing is, while Tumblr has addressed it in the past, it is very clear that Russia still uses trolls on this site. The list of Russia-linked blogs that they provided back then isn't available anymore (I can't access it). The reason I'm posting this now is because I see a lot of people posting screenshots of pro-Russian users, and I want you to know that there's a high chance that the users you are interacting with might be trolls spreading Russian propaganda. With the invasion of Ukraine, these blogs must have intensified their activities, so you can imagine the state of disinformation on this site.
Here's a useful article on how to spot Russian bots and shills.
In early September, the U.S. Department of Justice unveiled a series of sweeping investigations and indictments into Russian information projects aimed at disrupting the 2024 U.S. presidential election. One of these projects, which secretly funded right-wing influencers to promote former President Donald Trump’s campaign, is an escalation from prior Russian information operations, such as their email hacks during the 2016 election.
But another Russian team, described in a planning document published by the Department of Justice, approached disrupting the election a little bit less directly.
The Russian plan describes the “Good Ol’ USA Project” as a “guerrilla media” campaign intended to target “sentiments that should be exploited in the course of an information campaign in/for the United States.” Written by Ilya Gambashidze, a figure already facing sanctions for his disinformation work aimed at smearing Ukraine, the document suggests focusing influence efforts on the “community of American gamers, users of Reddit and image boards, such as 4Chan,” since they are the “backbone of the right-wing trends” online in the United States.
The inclusion of gamers in this campaign points at emerging dynamics in a global struggle over human rights online—one that policymakers need to pay closer attention to.
According to the Entertainment Software Association, a trade group, around 65 percent of Americans—or 212 million people—regularly play video games. Globally, video games generate more than $280 billion in revenue, far larger than traditional culture industries such as film or book publishing. While a trickle of stories about other attempts to push Russian propaganda in video games have attracted some scrutiny from journalists, the question remains: Apart from scale, what is it about gamers that Russia thinks will make them receptive to its messaging?
For starters, video game culture has already become an important venue for extremist right-wing groups to share and normalize their ideas. Far-right groups modify video games to be more explicitly racist and violent than their designers intended. Even gaming spaces designed for children, such as Roblox, which allows players to create their own game worlds and storylines, have attracted thousands of people (many of them young teenagers) to use the game’s freewheeling mechanics to play-act fascist violence.
The prevalence of hate groups has shaped video games into a place where culture and politics are debated, often contentiously, with predictable fault lines emerging along U.S. partisan boundaries. While the industry itself has made considerable progress in improving representation and reducing acts of horrific sexual violence, it has received pushback from far-right figures who are angry at the so-called “wokes” for supposedly “ruining games.”
For a decade, repeated efforts to “reclaim” gaming from an imagined enemy composed of women, Black people, and LGBT+ folks have bubbled up from the darkest corners of the internet, often in places such as Reddit (where this Russian campaign aimed its influence activity). These movements have spilled over into more mainstream political movements that can shape election outcomes. Consider how Gamergate, a 2014 campaign to terrorize women working in the industry into invisibility, metastasized into an online troll army working to get former U.S. President Donald Trump elected in 2016.
These far-right efforts are ongoing, even without Russian help. Last year, a group of gamers who were angry at inclusive representation in games launched a harassment campaign—colloquially called Gamergate 2.0—against a story consulting company.
Earlier this year, when Ubisoft began promoting the latest installment of its popular Assassin’s Creed franchise, this time set in feudal Japan, the trailer prominently featured Yasuke, an African man who served as a samurai in 16th-century Japan. Despite being based on a real historical figure, this movement (egged on by X owner and billionaire Elon Musk) raged at the decision, as if acknowledging Black people in the past was somehow bad. In their quest to sow division within the United States, Russian information operations analysts do not need to look very far to find political allies in gaming communities.
It helps that Russia enjoys greater social legitimacy in gaming than it does in, say, news journalism. You can see this legitimacy reflected in the language that gamers use as they play. Around the same time as Gamergate, a vulgar Russian phrase began popping up in the chats that players use to communicate with each other in non-Russian game streams, primarily in the multiplayer first-person shooter Counter-Strike: Global Offensive. The game has around 4 million Russian players, and as it grew in popularity in the mid-2010s, the Russian obscenity cyka blyad became common invective during frustrating moments of play. Its widespread adoption among non-Russian-speaking gamers struck many as odd.
Cyka blyad rose in prominence alongside Russia’s descent into becoming an international pariah, which has limited the spaces where Russian gamers can play games online. In 2014, Russia passed a law requiring websites that store the personal information of Russian citizens to do so on servers inside the country. This was compounded in 2022, when companies ranging from Activision Blizzard to Nintendo protested the invasion of Ukraine by either suspending sales or shutting down Russia-specific services. Despite its residents representing around 10 percent of Counter-Strike’s player base, there are no host servers for the game anywhere in Russia.
So, when Russian players log on to find players for a match, they use servers based in Europe or sometimes North America. These servers place them into direct contact with players on the other side of international conflicts—something that many players within the European Union found deeply frustrating after Russia’s illegal annexation of Crimea in 2014, as their games became places where people would argue about the annexation.
But another reason why Russian slang began infiltrating non-Russian gaming spaces is that after years of censorship and exclusion from both Russian and Western governments, games are one of the only spaces left for direct exchanges between ordinary Russian and Western people. Russians lack access to many Western social media platforms—such as Instagram (blocked by the Russian government in 2022, though earlier this year some Russian users regained access)—and were locked out of Western game stores, even as they kept access to many online games. As a result, matches in a game such as Counter-Strike or Fortnite became one of the only places where these informal cultural exchanges could take place.
This narrowing of exchange spaces highlights how video games can become useful conduits for propaganda, and it demonstrates that video games are becoming an important, if underappreciated, site for ideological disputes over politics, speech, identity, and expression.
Other countries have begun to use video games for strategic communication. The U.S. government operates semiprofessional esports teams through the Defense Department, whose remit includes convincing young people to become interested in enlistment. China launched a military-produced first-person shooter game to boost recruitment and to humanize the image of the People’s Liberation Army abroad.
The Chinese developers of the hugely popular game Black Myth: Wukong instructed gaming influencers who were given early access to avoid discussing “feminist propaganda” while reviewing the game, apparently to adhere to government censorship rules. And Russian propaganda about the country’s war with Ukraine has begun appearing in games that allow user-generated content, such as Minecraft and the aforementioned Roblox, as the Kremlin seeks to persuade Westerners to end their support for Ukrainian freedom. In response, the U.S. State Department has begun developing its own games intended to train players to resist Russian disinformation.
This isn’t an abstract challenge. Scholars have drawn linkages between Russia’s propaganda efforts and President Vladimir Putin’s decision to launch the full-scale invasion of Ukraine in 2022 as both bots and human agents aggressively pushed narratives about the need to “pacify” an ostensibly violent Ukraine by invading. The invasion was further justified by the myth of Novorossiya, or a pan-Russian identity that views Ukrainians as misguided Russians who need to be forcibly reclaimed.
These efforts to spread propaganda through gaming are rarely successful. Few people wanted to, say, join Hezbollah after the Lebanese militant group launched its own game, Special Force, in 2003. The terrorist group al Qaeda has used video games for recruitment since 2006, but there is little evidence that any meaningful number of people have been recruited because of it. Scholars have fretted for years over the “militarization” of video games as the Pentagon gets more and more involved in the industry, yet U.S. military recruitment is in long-term decline, and public confidence in the nation’s military is at a two-decade low. If games-based propaganda works, we do not yet know where or how it does.
The revelations about Russian video game propaganda hint at some intriguing innovation in how strategic messages might be spread through nontraditional channels, but they also point to the areas where traditional channels for propaganda have closed down. Despite efforts by Republicans in Congress to falsely accuse agencies such as the Global Engagement Center of partisan bias when addressing foreign misinformation, the U.S. government takes the challenge seriously and, as a 2022 report on Russian propaganda channels detailed, is working to thwart many of Russia’s best efforts to target Americans with propaganda—as when it sanctioned several Russian oligarchs who had been financing U.S.-directed misinformation.
But even beyond government counterprogramming, there are plenty of obstacles to Russia’s efforts within the world of gaming. For example, Ukraine’s video games industry is respected in the United States and Europe. The developer 4A, which was based in Kyiv before the invasion, produces popular games such as the Metro franchise. That company, however, had to fly its employees abroad to keep them safe from the indiscriminate Russian barrages against the Ukrainian capital and other cities. This sent shockwaves through the industry, as it made some of the consequences of the invasion seem more viscerally real even to people who do not follow politics closely.
As a result of American and European sanctions, Russians now find it more difficult to legally purchase software and services such as online gaming. (Some Russian game companies have since relocated to more neutral countries, such as Cyprus, to continue operating globally.) Wargaming, the Belarusian company that makes World of Tanks, also fled to Cyprus, which has become an informal hub of Russian game companies.
Looking forward, there are real questions about what video games are going to become in the information war between Russia and the West. Russian censors have proposed deploying neural networks to search for banned content in games, but it is unclear whether those systems might disrupt gaming for everyone else.
Long before the invasion of Ukraine, Moscow forced Activision to censor the infamous airport sequence in the rerelease of Call of Duty: Modern Warfare 2 (in which the player, undercover among ultranationalist terrorists, can take part in the slaughter of hundreds of screaming civilians at a fictionalized Moscow airport), and it has not been shy about using government coercion to erase LGBT+ people from gaming (paralleling its embrace of the U.S. far right). Policymakers should look at ensuring that global communication platforms—and that is what video games are—remain open to free speech and safeguard other basic human rights.
While the latest Russian effort to target games seems to have been thwarted by the U.S. Justice Department, there will undoubtedly be more programs looking to repeat and extend the success of Gamergate in empowering the far right, perhaps this time by enabling it to obstruct effective governance in the United States. The Good Ol’ USA Project also targeted its influence operations toward websites such as Reddit and 4Chan, both of which are as central to the sustainment of far-right movements as gaming. Emerging platforms—which range from popular Chinese games to channels such as the online chat service Discord, which is difficult to monitor at scale and routinely hosts leaks of sensitive military documents—present new opportunities where Russian influence could be targeted.
These strange spaces are the frontier where a global battle for speech is being fought.
Jay Kuo at Think Big Picture:
For years, critics of Vladimir Putin have been warning that the Russians have taken over parts of the Republican Party. They raised the alarm as Republicans defended the Russian leader, parroted clear Kremlin talking points, and became mules for disinformation campaigns. In recent weeks, that criticism has shifted to include not just Republicans who have left the party, including former representatives Liz Cheney and Adam Kinzinger, but current GOP members. Recently, the two powerful Republican chairs of the House Intelligence Committee and the House Foreign Affairs Committee warned openly about how Russian propaganda has seeped into their party and even made its way into speeches on the House floor. Other members are now even openly questioning whether some of their fellow officials have been compromised and are being extorted. Rep. Tim Burchett (R-TN) suggested in a recent interview that Russian spies may possess compromising tapes of some of his colleagues. It’s unclear where he’s getting his information or how accurate it is.
And then there’s this: According to a report by Politico, a number of European politicians were recently paid by Moscow—through Russians posing as a “media” outlet called “Voice of Europe”—to interfere in the upcoming EU elections. The Kremlin-backed operation used money to influence officials to take pro-Russian stances. Authorities have conducted some money seizures and launched an investigation into which members of the European Parliament may have accepted cash bribes. This in turn raises an important question for our own politics: Are the Russians doing the same with U.S. politicians, directly or indirectly? This piece walks through the three types of compromise—disinformation, extortion, and bribery—to give a sense of what we know and what we don’t really know, and, importantly, where we should be on our guard. As this summary will show, from the 2016 election till now, there’s enough Russian smoke to assume there is a fire, one that compromises not only the integrity of our own system of elections, but the safety and security of the free world.
Duped.
Over the past year, we have witnessed two distinct kinds of Russian propaganda in action. Both use our own elected officials and intelligence processes to amplify and even weaponize disinformation. The first kind originates online through Russian-backed internet channels. Information operatives begin spreading false rumors, for example about Ukraine, that then get repeated within right-wing silos before reaching willing purveyors within the halls of Congress. A chief culprit in Congress is Georgia’s Rep. Marjorie Taylor Greene. Among the Russian-originated false narratives she has amplified is the patently false claim that Ukraine is waging a war against Christianity while Russia is protecting it. On Steve Bannon’s War Room podcast, Greene even claimed, without evidence, that Ukraine is “executing priests.”
Where would Greene have gotten this wild, concocted notion? We don’t have to look far. Russian talking points have included this gaslighting narrative for some time. The twist, of course, is that, according to the International Religious Freedom or Belief Alliance, it is the Russian army that has been torturing and executing priests and other religious figures, including 30 Ukrainian clergy killed and 26 held captive by Russian forces. The Russians have also targeted Baptists, whom they see as U.S. propagandists, according to an in-depth Time magazine piece on the violence and death directed toward evangelicals. The Congressional propaganda mouthpieces for Russia aren’t limited to the U.S. House. Over in the Senate, Ohio Senator J.D. Vance was also recently accused of spreading Kremlin-backed disinformation about Ukraine, this time over spurious allegations that Ukrainian President Volodymyr Zelenskyy siphoned U.S. aid to purchase himself two luxury yachts.
[...]
The accusation that Russians are presently extorting and blackmailing U.S. politicians into supporting Russia’s agenda has some broad appeal. It would help explain some mysteries, including why someone like Sen. Lindsey Graham (R-SC) is suddenly no longer as supportive of Ukraine as before and constantly kisses the ring of Donald Trump these days—after presciently saying in 2016 that the GOP would destroy itself if it nominated him.
The problem has been that these accusations aren’t supported by much evidence. That means that political extortion by the Russians is either not a very prevalent practice, or it’s so effective that no one dares expose it. Either way, we’re left without much to go on. The Russian word kompromat came into common parlance around the time that Buzzfeed published a salacious story about another intelligence report back in early 2017. In that instance, the author, a former British intelligence officer named Christopher Steele, was concerned Russia had compromising data on the soon-to-be president, Donald Trump.
That report never wound up being substantiated, and its sources and funding came into question as well. But intelligence agencies are in general agreement that obtaining kompromat is standard practice by Russia, and someone like Trump could have been an easy mark considering the company that he kept (e.g. Jeffrey Epstein and Ghislaine Maxwell) and the projects he was involved with (e.g. the Miss Universe contest). Lately, the notion of kompromat emerged once again, this time not from Democratic-paid outfits but from within the GOP itself. Rep. Tim Burchett (R-TN) is one of the more “colorful” characters within the GOP, primarily known lately for being one of the eight members who voted to oust former Speaker Kevin McCarthy and even for getting into public jostling and shouting matches with McCarthy.
The Republican Party (or at least its pro-MAGA faction) is compromised by Russian kompromat.
#Trump Russia Scandal#GOP Russia#Russia#Donald Trump#Marjorie Taylor Greene#J.D. Vance#Volodymyr Zelensky#Tim Burchett#War Room#Stephen Bannon#Mike Turner#Michael McCaul#Christopher Steele
You're being targeted by disinformation networks that are vastly more effective than you realize. And they're making you more hateful and depressed.
(This essay was originally by u/walkandtalkk and posted to r/GenZ on Reddit two months ago, and I've crossposted here on Tumblr for convenience because it's relevant and well-written.)
TL;DR: You know that Russia and other governments try to manipulate people online. But you almost certainly don't know just how effectively orchestrated influence networks are using social media platforms to make you -- individually -- angry, depressed, and hateful toward each other. Those networks' goal is simple: to cause Americans and other Westerners -- especially young ones -- to give up on social cohesion and to give up on learning the truth, so that Western countries lack the will to stand up to authoritarians and extremists.
And you probably don't realize how well it's working on you.
This is a long post, but I wrote it because this problem is real, and it's much scarier than you think.
How Russian networks fuel racial and gender wars to make Americans fight one another
In September 2018, a video went viral after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, with millions of views and wide social media outrage. Reddit users wrote that it had turned them against feminism.
There was one problem: The video was staged. And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.
As an MIT study found in 2019, Russia's online influence networks reached 140 million Americans every month -- the majority of U.S. social media users.
Russia began using troll farms a decade ago to incite gender and racial divisions in the United States
In 2013, Yevgeny Prigozhin, a confidante of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government's first coordinated facility to disrupt U.S. society and politics through social media.
Here's what Prigozhin had to say about the IRA's efforts to disrupt the 2022 U.S. midterm elections:
"Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once."
In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media. By 2015, hundreds of English-speaking young Russians worked at the IRA. Their assignment was to use those false social-media accounts, especially on Facebook and Twitter -- but also on Reddit, Tumblr, 9gag, and other platforms -- to aggressively spread conspiracy theories and mocking, ad hominem arguments that incite American users.
In 2017, U.S. intelligence found that Blacktivist, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia. Blacktivist regularly attacked America as racist and urged Black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist's Twitter urged Black Americans: "Choose peace and vote for Jill Stein. Trust me, it's not a wasted vote."
Russia plays both sides -- on gender, race, and religion
The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening both misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia. In short, it's not just an effort to boost the right wing; it's an effort to radicalize everybody.
Russia uses its trolling networks to aggressively attack men. According to MIT, in 2019, the most popular Black-oriented Facebook page was the charmingly named "My Baby Daddy Aint Shit." It regularly posts memes attacking Black men and government welfare workers. It serves two purposes: Make poor black women hate men, and goad black men into flame wars.
MIT found that My Baby Daddy is run by a large troll network in Eastern Europe likely financed by Russia.
But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.
The New York Times found that on January 23, 2017, just after the first Women's March, the Internet Research Agency began a coordinated attack on the movement. Per the Times:
More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.
They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.
But the Russian PR teams realized that one attack worked better than the rest: They accused the march's co-founder, Arab American activist Linda Sarsour, of being an antisemite. Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour. That may not seem like many accounts, but it worked: They drove the Women's March movement into disarray and eventually crippled the organization.
Russia doesn't need a million accounts, or even that many likes or upvotes. It just needs to get enough attention that actual Western users begin amplifying its content.
A former federal prosecutor who investigated the Russian disinformation effort summarized it like this:
It wasn’t exclusively about Trump and Clinton anymore. It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.
As the New York Times reported in 2022,
There was a routine: Arriving for a shift, [Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.
China is joining in with AI
[A couple months ago], the New York Times reported on a new disinformation campaign. "Spamouflage" is an effort by China to divide Americans by combining AI with real images of the United States to exacerbate political and social tensions in the U.S. The goal appears to be to cause Americans to lose hope, by promoting exaggerated stories with fabricated photos about homeless violence and the risk of civil war.
As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake. Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”
The influence networks are vastly more effective than platforms admit
Russia now runs its most sophisticated online influence efforts through a network called Fabrika. Fabrika's operators have bragged that social media platforms catch only 1% of their fake accounts across YouTube, Twitter, TikTok, Telegram, and other platforms.
But how effective are these efforts? By 2020, Facebook's most popular pages for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn't just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And, even, on Reddit.
It's not just false facts
The term "disinformation" undersells the problem. Because much of Russia's social media activity is not trying to spread fake news. Instead, the goal is to divide and conquer by making Western audiences depressed and extreme.
Sometimes, through brigading and trolling. Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that's how most people feel. And sometimes, by using trolls to disrupt threads that advance Western unity.
As the RAND think tank explained, the Russian strategy is volume and repetition, from numerous accounts, to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them. And it's not just low-quality bots. Per RAND,
Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. ... According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.
What this means for you
You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed. It's not just disinformation; it's also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions.
It's why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms. And a lot of those trolls are actual, "professional" writers whose job is to sound real.
So what can you do? To quote WarGames: The only winning move is not to play. The reality is that you cannot distinguish disinformation accounts from real social media users. Unless you know whom you're talking to, there is a genuine chance that the post, tweet, or comment you are reading is an attempt to manipulate you -- politically or emotionally.
Here are some thoughts:
Don't accept facts from social media accounts you don't know. Russian, Chinese, and other manipulation efforts are not uniform. Some will make deranged claims, but others will tell half-truths. Or they'll spin facts about a complicated subject, be it the war in Ukraine or loneliness in young men, to give you a warped view of reality and spread division in the West.
Resist groupthink. A key element of manipulation networks is volume. People are naturally inclined to believe statements that have broad support. When a post gets 5,000 upvotes, it's easy to think the crowd is right. But "the crowd" could be fake accounts, and even if they're not, the brilliance of government manipulation campaigns is that they say things people are already predisposed to think. They'll tell conservative audiences something misleading about a Democrat, or make up a lie about Republicans that catches fire on a liberal server or subreddit.
Don't let social media warp your view of society. This is harder than it seems, but you need to accept that the facts -- and the opinions -- you see across social media are not reliable. If you want the news, do what everyone online says not to: look at serious, mainstream media. It is not always right. Sometimes, it screws up. But social media narratives are heavily manipulated by networks whose job is to ensure you are deceived, angry, and divided.
Edited for typos and clarity. (Tumblr-edited for formatting and to note a sourced article is now older than mentioned in the original post. -LV)
P.S. Apparently, this post was removed several hours ago due to a flood of reports. Thank you to the r/GenZ moderators for re-approving it.
Second edit:
This post is not meant to suggest that r/GenZ is uniquely or especially vulnerable, or to suggest that a lot of challenges people discuss here are not real. It's entirely the opposite: Growing loneliness, political polarization, and increasing social division along gender lines are real. The problem is that disinformation and influence networks expertly, and effectively, hijack those conversations and use those real, serious issues to poison the conversation. This post is not about left or right: Everyone is targeted.
(Further Tumblr notes: since this was posted, there have been several more articles detailing recent discoveries of active disinformation/influence and hacking campaigns by Russia and its allies against several countries and their respective elections, and this barely touches on the numerous Tumblr blogs discovered to be troll farms/bad-faith actors from pre-2016 through today. This is an ongoing and very real problem, and it's nowhere near over.
A quote from NPR article linked above from 2018 that you might find familiar today: "[A] particular hype and hatred for Trump is misleading the people and forcing Blacks to vote Killary. We cannot resort to the lesser of two devils. Then we'd surely be better off without voting AT ALL," a post from the account said.")
#propaganda#psyops#disinformation#US politics#election 2024#us elections#YES we have legitimate criticisms of our politicians and systems#but that makes us EVEN MORE susceptible to radicalization. not immune#no not everyone sharing specific opinions are psyops. but some of them are#and we're more likely to eat it up on all sides if it aligns with our beliefs#the division is the point#sound familiar?#voting#rambles#long post
(via Voters focused on the economy broke hard for Trump | AP News)
Those who panic-voted for Trump based on their faulty understanding of the economy were the victims of a relentless campaign of disinformation. Even as the economic numbers kept improving, the Trump machine (assisted by Russian and Iranian bots) blared that it was bad and getting worse, without any evidence. And millions of Americans believed the lies because, without critical thinking skills, they are defenseless against lies. Once they’ve committed to the lies, they are stubborn and arrogant in defending their irrational choice because they think stubbornness and arrogance are virtues.
Yeah, it’s the economy, stupid. Which Trump supporters got completely wrong. Worse, for a few pieces of silver, they were willing to brush aside all the moral and social issues of having a rapist, racist, and fraudster as their leader. They sold their souls—and country—for cheaper eggs.
Over half of anti-Heard tweets were bots or paid trolls, many linked to the Saudi government.
"According to an investigation by Tortoise Media, which examined more than one million tweets, more than 50 per cent of anti-Heard messages in the run-up to the 2022 defamation case were 'inauthentic' - either from automated 'bot' accounts or people hired to attack the actress."
"Bradley Hope, author of a book on Bin Salman, told the podcast that the pro-Depp tweets emanating from Saudi Arabia appear to be produced by "flies", a name for Saudi bot accounts."
"An intelligence professional who tracks online disinformation campaigns, said there was only a "0.1 per cent chance" that the hate directed at Heard was from genuine Depp fans.
The investigation also claims that bot networks in Thailand and Spain tweeted large numbers of pro-Depp messages."
"...more than 100 Twitter accounts sent 1,000 identical messages at exactly the same time to any company that had worked with Heard, reading: 'This brand supports domestic violence against men.'"
"The makers of the podcast argue that the criticism of Heard could have affected the jury in the 2022 US defamation trial which found in favour of Depp."
"So, if you couldn't tell the difference between a real-life Johnny Depp fan and a bot in 2022, then you probably won't be able to tell a Russian troll from a US election official in 2024. And that represents a serious problem for the security of our democracies."-Alexi Mostrous, presenter of the podcast.
"Johnny Depp and the Saudi Embassy did not respond to Tortoise's request for comment."
#Amber Heard#Depp v Heard#Johnny Depp#Johnny Depp Is A Wife-Beater#Johnny Depp Is A Rapist#Tortoise Media#Who Trolled Amber?#Mohammed bin Salman#Prince Bone Saw#Propaganda#Misinformation#Bots#Foreign Influence Campaign#Indict Johnny Depp#Indict Adam Waldman#Disbar Adam Waldman#Johnny Depp Is A Saudi Asset#Probably A Russian One Too#I Stand With Amber Heard