Twitter and Musk

Discussion in 'Politics' started by VicBee, Oct 31, 2022.

  1. exGOPer

    exGOPer

    Which is why he has been cutting staff...
     
    #1941     Jun 8, 2023
  2. Mercor

    Mercor

    It's kind of like CNN on TV at the airport. It counts as viewers, but it's mostly a sleep aid.
     
    #1942     Jun 9, 2023
  3. gwb-trading

    gwb-trading

    So people posting child material use a couple of emojis to indicate the content; after several weeks of this, Twitter catches on and has its algorithms block posts containing those emojis. The people posting the child content merely switch emojis and the problem continues. Without human moderation there is no way to keep ahead of the problem. Case closed.
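    The cat-and-mouse game described above can be sketched in a few lines. The emojis and filter below are purely hypothetical, to show why a static symbol blocklist needs ongoing human input to keep up:

```python
# Toy illustration (not Twitter's actual system): a static symbol
# blocklist, and how a coordinated symbol swap trivially evades it.

BLOCKLIST = {"🍕", "🧀"}  # hypothetical signal emojis the operator has learned about

def is_blocked(post: str) -> bool:
    """Reject any post containing a known signal symbol."""
    return any(symbol in post for symbol in BLOCKLIST)

# The filter catches posts using the known symbols...
assert is_blocked("available 🍕 dm me")
# ...but once the group switches to a new symbol, every post passes
# until a human notices the new pattern and updates the blocklist.
assert not is_blocked("available 🌮 dm me")
```

    The blocklist only ever encodes yesterday's evasion tactic, which is the point being argued: the algorithm lags the humans it is chasing.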
     
    #1943     Jun 9, 2023
  4. vanzandt

    vanzandt

    Uuuuuuh... I didn't hear that when he was on CNBC, but he did say something similar to that regarding Instagram. All I heard was "Twitter has fixed the problem."
    You can't run a business and hire 100,000 people to search for a needle in a haystack. You use tech. Elon's specialty.
     
    #1944     Jun 9, 2023
  5. vanzandt

    vanzandt

    5.1 Instagram

    Instagram appears to have a particularly severe problem with commercial SG-CSAM accounts, and many known CSAM keywords return results.

    Search results for some terms return an interstitial alerting the user of potential CSAM content in the results; while the warning text is accurate and potentially helpful, the prompt nonetheless strangely presents a clickthrough to "see results anyway" (see Figure 4). Instagram's user suggestion recommendation system also readily promotes other SG-CSAM accounts to users viewing an account in the network, allowing for account discovery without keyword searches.

    Figure 4: The interstitial clickthrough offered by Instagram when searching for a CSAM-related hashtag.

    Due to the widespread use of hashtags, the relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism for this specific community of buyers and sellers. The overall size of the seller network examined appears to range between 500 and 1,000 accounts at a given time, with follower, like and view counts ranging from dozens to thousands.

    Also of note is the sellers' heavy reliance on transient media such as Stories; accounts will often have one or no actual posts, but will frequently post Stories with content menus, promotions or cross-site links. Stories are censored to obscure any explicit content; some sellers also seem to suspect that the overlaid text is being scanned, as indicated by self-censorship to obscure possible "trigger words" (see Figure 2 on page 5). It is unclear whether Instagram is actually performing this detection; if not, it would be a useful Trust and Safety signal to implement.

    __________________________________________

    Twitter:

    These detected instances were automatically reported to NCMEC by our ingest pipeline, and the overall problem was communicated to members of Twitter's Trust and Safety team. As of the latest update to this paper, this problem appears to have largely ceased due to subsequent fixes to Twitter's CSAM detection systems.

    _________________________________

    Maybe Instagram should fire their third-party moderators and buy Elon's software.

    Hey, and btw... those moderators are suing Meta because apparently they get worked like slaves, are underpaid, and some sleep in beds with bedbugs. Horrible working conditions.
     
    #1945     Jun 9, 2023
  6. vanzandt

    vanzandt

    Like I said, before you bash Elon, try not to be so tunnel-visioned:

    The report focused specifically on “self-generated child sexual abuse material,” or content being sold by minors themselves, content that is both in violation of Instagram’s policies and the law.

    “Instagram is currently the most important platform for these networks, with features that help connect buyers and sellers,” the report from Stanford’s Internet Observatory reads. “Instagram’s recommendation algorithms are a key reason for the platform’s effectiveness in advertising” self-generated child sexual abuse material.

    Even when Instagram’s safety programs identified search terms used to promote child sexual abuse material, the researchers found some searches would result in a warning prompt that “these results may contain images of child sexual abuse,” but allow users to see the results anyway if they chose to.
     
    #1946     Jun 9, 2023
  7. exGOPer

    exGOPer

    But CNN isn't claiming millions of views, Twitter is, and you are falling for it.
     
    #1947     Jun 9, 2023
    Cuddles likes this.
  8. Mercor

    Mercor

    How many views do you think Twitter got for these shows?

    [Attachment: upload_2023-6-9_14-5-24.png]
     
    #1948     Jun 9, 2023
  9. exGOPer

    exGOPer

    All fake numbers as I already explained.
     
    #1949     Jun 9, 2023
  10. Mercor

    Mercor

    So even if the real number is only 1/10 of that, it still adds up to 17.5 million.
    What is your point?
     
    #1950     Jun 9, 2023