Russia’s propaganda operation is failing

Discussion in 'Politics' started by gwb-trading, Mar 7, 2022.

  1. I recently heard a pundit bloviating, and he mentioned something relevant.

    He said that one of the things working against Putin is that China has not censored, or has not been successful in censoring, reports and video from the Ukraine invasion - at least not early on - I guess because they did not know they would be publicly involved anyway.

    And as a result, many Chinese have formed an opinion which is not helpful to making the case that the war is necessary or just - if not outright fucking genocide. Let's face it: the optics are not great. The little dead girl with the Snoopy pajamas on probably was not a Nazi. This makes it harder for Xi to come in midstream and announce that he is providing weapons and propping Russia up. I mean, he is doing that anyway, but breaking through China's propaganda has made it harder for him.
     
    #91     Mar 18, 2022
  2. gwb-trading

    Putin's Russia continually attempts to push misinformation on social media. Nearly all of it is easily proven as false. Yet there are idiots who believe Russia's nonsense and re-post it.

    How Kremlin accounts manipulate Twitter
    https://www.bbc.com/news/technology-60790821

    Olena Kurilo became the face of Russia's invasion of Ukraine. Bloodied and bandaged, the 53-year-old teacher said she couldn't believe what had happened to her and her town of Chuhuiv.

    Her picture was on the front pages of newspapers across the world.

    Over the next few days, Russia's government social media accounts began to post a video claiming that Olena hadn't been injured at all.


    "Great photos by the way, they were all over the news," the Russian narrator says.

    The video then claims Olena was photographed two days later, uninjured.

    "A couple of days later, good for her, not a scratch."

    This claim is baseless: the BBC has verified the photo as genuine, as has Reuters. Wild conspiracy theories like these are not uncommon on social media.

    But what makes this conspiracy theory so odd is that it was shared by an official Russian government Twitter account - the Russian Mission in Geneva. Two weeks on, the tweet is still live.

    The Russian government has a huge network of official Twitter accounts - the BBC found more than 100 of them. They range from accounts that represent foreign missions or embassies, with a few thousand followers, to those with more than a million followers. President Putin has his own account. Many of the accounts are labelled as Russian government organisations by Twitter.


    Yet, while many of these accounts have spread disinformation, Twitter deals with them differently to Russian state media - like RT or Sputnik.

    On 28 February, Twitter announced it would prevent tweets from Russian state-affiliated media outlets from being eligible for "amplification" - meaning they wouldn't be recommended in the home timeline, notifications, and other places on Twitter. But Twitter has confirmed to the BBC that this policy does not include Russian government accounts.

    Tim Graham, a social media analyst at QUT Digital Media Research Centre in Australia, describes this as a "loophole" in Twitter's moderation policies, which lets the Russian government pump out misinformation.

    "It's certainly a blind spot in Twitter's defences against disinformation," he says.

    Intrigued by this spider web of Russian government accounts, Mr Graham - who specialises in analysing co-ordinated activity on social media - decided to investigate further. He analysed 75 Russian government Twitter profiles which, in total, have more than 7 million followers. The accounts have received 30 million likes, been retweeted 36 million times and been replied to 4 million times.

    He looked at how many times each Twitter account retweeted one of the other 74 profiles within an hour. He discovered that the Kremlin's network of Twitter accounts works together to retweet and drive up traffic. This practice is sometimes called "astroturfing" - when the owner of several accounts uses the profiles they control to retweet content and amplify reach.

    "It's a coordinated retweet network," Mr Graham says.

    "If these accounts weren't retweeting stuff at the same time, the network would just be a bunch of disconnected dots. So what the network shows, very clearly, is that there's a very dense amount of connections to the way these accounts are retweeting.

    "They are using this as an engine to drive their preferred narrative onto Twitter, and they're getting away with it," he says.
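    The co-retweet measurement described above can be approximated with a short script. This is only a sketch of the general technique the article attributes to Graham, not his actual code; the account names, the epoch-second timestamps, and the one-hour window are illustrative assumptions.

```python
from collections import defaultdict

def retweet_network(tweets, retweets, window_secs=3600):
    """Build a weighted edge list: (retweeter, author) -> number of times
    `retweeter` amplified `author` within `window_secs` of posting.

    tweets:   {tweet_id: (author, posted_at)}    # posted_at in epoch seconds
    retweets: [(retweeter, tweet_id, retweeted_at), ...]
    """
    edges = defaultdict(int)
    for retweeter, tweet_id, rt_time in retweets:
        author, posted_at = tweets[tweet_id]
        # Only count fast, within-network amplification of someone else's tweet
        if retweeter != author and 0 <= rt_time - posted_at <= window_secs:
            edges[(retweeter, author)] += 1
    return dict(edges)

# Hypothetical data: two accounts boosting each other within minutes,
# plus one slow retweet that falls outside the window.
tweets = {"t1": ("mission_geneva", 0), "t2": ("mfa_russia", 100)}
retweets = [
    ("mfa_russia", "t1", 600),      # 10 minutes later: counted
    ("embassy_x", "t1", 7200),      # 2 hours later: ignored
    ("mission_geneva", "t2", 400),  # 5 minutes later: counted
]
print(retweet_network(tweets, retweets))
```

    A dense cluster of such edges, with the same accounts repeatedly retweeting each other inside the window, is what the article calls a "coordinated retweet network"; without the timing constraint the accounts would look like disconnected dots.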

    Coordinated activity, using multiple accounts, is against Twitter's rules.

    "You can't artificially amplify conversations through the use of multiple accounts," Twitter's rules state.

    But Twitter doesn't treat all accounts equally. Tweets from government and elected officials can be given more leeway when it comes to moderation. The company says on its website that there may be a public interest in seeing tweets that would otherwise violate its rules.

    However, the company doesn't treat official accounts differently when it comes to coordinated behaviour - there is no exemption.
     
    #92     Mar 19, 2022
  3. gwb-trading

    It's amazing to see how all the antivax Covid-denier social media accounts have pivoted to pushing pro-Putin Ukraine information. The proper explanation: all of these posters & bots are part of the Russian disinformation network and have pivoted their subject in recent weeks. It's all a deliberate attempt to undermine and divide the West. And yes, this includes some clowns on ET.

    ‘Bot holiday’: Covid disinformation down as social media pivot to Ukraine
    The usual deluge of invective prompted by coronavirus and vaccine issues is absent – Russia’s invasion may be a factor
    https://www.theguardian.com/media/2022/mar/04/bot-holiday-covid-misinformation-ukraine-social-media

    When David Fisman tweets, he often receives a deluge of hate within moments of posting. Fisman, an epidemiologist and physician, has been outspoken about Covid and public health.

    Even when he tweets something innocuous – once, to test his theory, he wrote the banal statement “kids are remarkable” – he still receives a flood of angry pushback.

    But in recent days, Fisman noticed an “astounding” trend, he said. He posted about topics like requiring vaccination and improving ventilation to prevent the spread of Covid – and the nasty responses never came. No support for the trucker convoy, no calls to try the Canadian prime minister, Justin Trudeau, for treason.

    Others have observed the same phenomenon; those who frequently encounter bots or angry responses are now seeing a significant drop-off. Covid misinformation, which has often trended on social media over the past two years, seems to be taking a nosedive.

    The reasons for this “bot holiday”, as Fisman calls it, are probably varied – but many of them point to the Russian invasion of Ukraine.

    Russia’s information war with western nations seems to be pivoting to new fronts, from vaccines to geopolitics.

    And while social media has proven a powerful tool for Ukraine – with images of Zelenskiy striding through the streets of Kyiv and tractors pulling abandoned Russian tanks – growing campaigns of misinformation around the world could change the conflict’s narrative, and the ways the world reacts.

    The likely reasons for the shift in online chatter are many. Russia began limiting access to Twitter on Saturday, sanctions have been levied against those who could be financing disinformation sites and bot farms, and social media companies are more attuned to banning bots and accounts spreading misinformation during the conflict.

    But something more coordinated may also be at play.

    Conspiracy theories around the so-called “New World Order” – loosely defined conspiracies about shadowy global elites that run the world – have converged narrowly on Ukraine, according to emerging research.

    “There’s actually been a doubling of New World Order conspiracies on Twitter since the invasion,” said Joel Finkelstein, the chief science officer and co-founder of the Network Contagion Research Institute at Rutgers University, which maps online campaigns around public health, economic issues and geopolitics.

    At the same time, “whereas before the topics were very diverse – it was Ukraine and Canada and the virus and the global economy – now the entire conversation is about Ukraine,” he said. “We’re seeing a seismic shift in the disinformation sphere towards Ukraine entirely.”

    Online activity has surged overall by 20% since the invasion, and new hashtags have cropped up around Ukraine that seem to be coordinated with bot-like activity, Finkelstein said. Users pushing new campaigns frequently tweet hundreds of times a day and can catch the eye of prominent authentic accounts.
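    The volume heuristic mentioned above (accounts tweeting hundreds of times a day) is simple to sketch. A toy illustration, not the Network Contagion Research Institute's actual methodology; the 200-tweets-per-day threshold and the account names are assumptions.

```python
from collections import Counter

def flag_high_volume(tweet_log, daily_threshold=200):
    """tweet_log: list of (account, date_string) pairs, one entry per tweet.
    Returns the set of accounts exceeding `daily_threshold` tweets on any
    single day - a crude bot-likeness signal based purely on posting volume."""
    per_day = Counter(tweet_log)  # (account, day) -> tweet count
    return {account for (account, _day), n in per_day.items() if n > daily_threshold}

# Hypothetical activity: one account posting 300 times in a day, one casual user
log = [("hashtag_pusher", "2022-03-10")] * 300 + [("casual_user", "2022-03-10")] * 4
print(flag_high_volume(log))  # flags only the high-volume account
```

    In practice researchers combine volume with other signals (account age, content similarity, posting cadence), since raw volume alone would misclassify some prolific human users.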

    “We can’t say for certain that Russia is behind this or that it contributes directly to the propagation of these messages. But it’s pretty difficult to believe that it’s not involved,” Finkelstein said, with topics strikingly similar to Russian talking points about the Ukrainian president, Volodymyr Zelenskiy, being controlled by the west and the need to dissolve Nato.

    A Russian bot farm reportedly produced 7,000 accounts to post fake information about Ukraine on social media, including Telegram, WhatsApp and Viber, according to the security service of Ukraine.

    And influencers who previously demonstrated against vaccines are now turning their support to Russia.

    Social media users may see a topic trending and not realize its connection to conspiracy theories or disinformation campaigns, said Esther Chan, Australia bureau editor for First Draft, an organization that researches misinformation.

    “A lot of social media users may just use these terms because they’re trending, they sound good,” she said. “It’s a very clever sort of astroturfing strategy that we’ve seen in the past few years.”

    The topics pushed by troll farms and Russian state media are often dictated by Russian officials, said Mitchell Orenstein, a professor of Russian and east European studies at the University of Pennsylvania and a senior fellow of the Foreign Policy Research Institute.

    In this case, it seems “their orders got changed because priorities shifted”, he said.

    Russia has coordinated significant misinformation campaigns to destabilize western countries, including topics like the 2016 election and the pandemic, according to several reports.

    Inauthentic accounts are not fully responsible for real hesitations and beliefs. But they amplify harmful messages and make pushback seem more widespread than it is.

    “They’ve had tremendous success with social media platforms,” Orenstein said. “They play a pretty substantial role and they do shift people’s perception about what opinion is.”

    Fake accounts will frequently link to “pink slime” or low-credibility sites that once carried false stories about the pandemic and are now shifting focus to Ukraine, said Kathleen Carley, a professor at Carnegie Mellon University.

    “The bots themselves don’t create news – they’re more used for amplification,” she said.

    These sites frequently sow division on controversial issues, research finds, and they make it more difficult to spot disinformation online.

    The escalation of narratives like these could have wide-ranging consequences for policy.

    “Right now, we’re in the beginning of a war that has a consensus, right? It’s clear that what Russia’s doing is against the moral order of the modern world. But as the war becomes prolonged, and people become exhausted, that may change,” Finkelstein said.

    As “we enter into more unknown territory, these narratives will have a chance to grow … it gives us a window into what these themes are going to be like.”

    The research around these changing campaigns is limited, looking at thousands of tweets in the early days of an invasion, Carley cautioned. It’s very early to understand what direction the misinformation is going and who is behind it – and conspiracies tend to follow current events even when there aren’t coordinated campaigns.

    And “that does not mean that all the disinformation, all the conspiracy theories about Covid are not still there,” she said. “I would not say the bots are on holiday. They have been re-targeted at different stories now, but they’ll be back.”

    On 3 March the surgeon general, Vivek Murthy, asked tech firms to cough up what they know about who is behind Covid-19 misinformation. Murthy wants social networks, search engines, crowdsourced platforms, e-commerce and instant messaging companies to provide data and analysis on the kind of vaccine misinformation identified by the CDC, such as “the ingredients in COVID-19 vaccines are dangerous” and “COVID-19 vaccines contain microchips”.

    Misinformation campaigns around the New World Order, however, have more longevity than some other conspiracy theories, because they can quickly morph depending on the target. “They probably will still exist for a long time,” Chan said. “The question for us is whether they would have an impact on people – on real life and also on policymaking.”

    It may be too soon to say what’s emerging during the invasion of Ukraine, but leaders should understand what terms are emerging in conspiracy theories and disinformation campaigns so they don’t inadvertently signal support for the theories in their public statements, she said.

    “They need to take note of what terms are commonly used and try to avoid them,” Chan said.

    A global agreement on how to address misinformation or disinformation would be key, Carley said.

    “Each country does it separately. And the thing is, because we’re all connected very tightly throughout the world in social media, it doesn’t matter that one country has some strong reactions because it’ll still go from another country’s machines on to your machines,” she said.

    Such rules would also need to have teeth to prevent further campaigns, she said. And educating the public about how to parse misinformation and disinformation is also important. “We need to start investing better in critical thinking and digital media literacy.”
     
    #93     Mar 19, 2022
  4. gwb-trading

    Russia's Deepfakes continue to fail...

    Russia targets THIRD cabinet member with chilling deepfake telephone call
    A THIRD Cabinet minister was targeted by a deepfake phone call, No10 admitted today, as it publicly blamed Russia for the security breach for the first time.
    https://www.express.co.uk/news/poli...-dorries-deepfake-telephone-call-hoax-wallace
     
    #94     Mar 22, 2022
  5. gwb-trading

    Another day, another state broadcaster quits in Russia.

    Russian state TV presenter quits over war and slams Putin's 'insane' leadership
    Gleb Irisov, a former military translator in Syria, quit as military correspondent at state-owned TASS, revealing several of his former army comrades died on the first day of the invasion
    https://www.mirror.co.uk/news/world-news/russian-state-tv-presenter-quits-26528948
     
    #95     Mar 22, 2022
  6. gwb-trading

     
    #96     Mar 23, 2022


  7. Scott Ritter is a former Marine Corps intelligence officer who served in the former Soviet Union implementing arms control treaties, in the Persian Gulf during Operation Desert Storm, and in Iraq overseeing the disarmament of WMD.
     
    #97     Mar 24, 2022
  8. gwb-trading

    Scott Ritter is a serial liar and a convicted felony sex offender. He is not a quality source for anything. Everything that comes out of his mouth is a joke... and a fabrication from his twisted mind.

    Ask Scott Ritter how he enjoyed his five and a half years in prison as a sex offender. Ask him just how many times he was involved in chatting up under-age girls for sex, seeing that he was caught three times.


    It's sad that you cite Ritter as a source.

    Underage sex trial of former U.N. weapons inspector opens
    https://www.reuters.com/article/us-...weapons-inspector-opens-idUSTRE73B6H620110412

    A jury in the underage sex trial of outspoken former United Nations weapons inspector Scott Ritter on Tuesday watched a nude video of him, sent over the Internet to a person he thought was a 15-year-old girl.

    The person who received the photo was not the girl but Detective Ryan Venneman of Barrett Township, one of the first witnesses to testify on opening day of the trial in Monroe County Common Pleas Court.

    Ritter, 49, of Delmar, New York, a suburb of Albany, faces a maximum of seven years in prison if he’s convicted.

    Prosecutors said Ritter’s online chat with 15-year-old “Emily” in February 2009 was actually the third such encounter since he quit his job as chief U.N. weapons inspector in Iraq in 1998 and became a vocal critic of the Bush administration’s war in Iraq.

    In 2001, Ritter was involved in two other similar sex sting cases, prosecutor Michael Rakaczewski said in his opening statement.

    Defense attorney Gary Kohlman said Ritter was never charged in 2001 and that the case had been sealed.

    Kohlman also blamed Ritter’s behavior in 2001 on his state of depression over resigning as chief U.N. weapons inspector.

    As the trial got underway, Ritter sat stoically at the defense table while Detective Venneman read a transcript of the online chat in February of 2009. The text was explicit, and Ritter’s twin 18-year-old daughters sat through the reading.

    Ritter’s daughters left the courtroom before the prosecution played graphic video Ritter shot of himself with his web camera during the chat.


    Ex-UN inspector Scott Ritter guilty in sex chat case
    https://www.bbc.com/news/world-us-canada-13089135
     
    #98     Mar 24, 2022
  9. gwb-trading

    Let's see the sad propaganda that Russia is shoveling today...

    Russian State TV Airs Shocking Aerial Footage of Completely Destroyed Mariupol — Host Blames Ukrainians for Destruction
    https://www.mediaite.com/tv/russian...iupol-host-blames-ukrainians-for-destruction/


    Russian state television aired shocking aerial footage of the now-completely destroyed Ukrainian coastal city of Mariupol, which elicited audible stunned reactions from a co-host. According to a translation from Washington Post Russia reporter Mary Ilyushina, the news presenter blamed the devastation on “Ukrainian nationalists.”

    Ilyushina posted the 17-second clip, adding “Russian state TV posts this absolutely apocalyptic aerial footage of Mariupol (result of Russian siege). But the anchor says: ‘Sad scenes of course… the Ukrainian nationalists withdraw trying not to leave a stone unturned.'” The footage is indeed stunning, which you can see below:



    Mariupol has been so thoroughly bombarded that it seems that no western journalists are still there to cover the devastation. Two AP reporters were the last to stay and cover the bombing of a maternity hospital, which they did by disguising themselves in hospital scrubs.

    Russian television is operated and directed by the Kremlin and has been only reporting the most charitable versions of events occurring in Ukraine following Russia’s military invasion, which has led to the death of innocents and formal charges of War Crimes against Russia.
     
    #99     Mar 24, 2022
  10. although I am part of that delusional West...



     
    Last edited: Mar 25, 2022
    #100     Mar 25, 2022