Just like Trump, Musk does not pay his bills.

Musk stiffed Twitter vendors and dared them to sue—dozens did just that
The ultimate guide to unpaid-bill suits filed against X, Musk's social network.
https://arstechnica.com/tech-policy...awsuits-try-to-force-twitter-aka-x-to-pay-up/
I'm sure he had a good reason. They must not have delivered as promised. They will next time though I bet. Or their replacement will.
Go read the article -- it is very lengthy and has all the details. Are you claiming that Musk should not pay his landlords and all the vendors who completely and fully delivered on what they committed? Musk is a con artist. This refusal to pay vendors or honor his commitments is behavior Musk demonstrated long before he purchased Twitter. He is no different than Trump.
Ukrainian officials call Musk what he is...

‘A cocktail of ignorance and a big ego’: Ukrainian war official slams Elon Musk for disrupting a stealth operation by withholding Starlink access
https://fortune.com/2023/09/08/igno...-withholding-starlink-access-walter-isaacson/
Who would have thought Elon Musk would be a total dick when it comes to co-parenting situations. What a surprise. NOT.

Grimes Appears To Beg Elon Musk To Let Her See Their Son In Since-Deleted Tweet
The musician was apparently replying to photos of Musk with two of his other children.
https://www.huffpost.com/entry/grimes-elon-musk-see-son-deleted-tweet_n_64fa4aeee4b0b755ae1f52d1

Grimes may have just posted and deleted some very distressing things about her co-parenting relationship with ex Elon Musk. Screenshots apparently from Thursday morning show the “Genesis” singer begging for an opportunity to see her son X AE A-Xii, one of the Tesla CEO’s nine living children.

The screenshots show Grimes tweeting at Musk biographer Walter Isaacson after his article about Musk came out in Time magazine on Tuesday. The story features photos of Musk with Neuralink executive Shivon Zilis and their twins, as well as a photo of Musk with X.

“Tell Shivon to unblock me and tell Elon to let me see my son or plz respond to my lawyer,” the tweet reads, according to the screenshot. “I have never even been allowed to see a photo of these children until this moment, despite the situation utterly ripping my family apart.”

Representatives for Grimes, born Claire Boucher, did not immediately respond to HuffPost’s requests for comment. Emails to Twitter’s press office received an automated response that said, “Busy now, please check back later.”

Grimes and Musk were first linked after attending the 2018 Met Gala together. They welcomed son X in May 2020 and split in September 2021, only to briefly reunite and have daughter Exa Dark Sideræl, also known as Y, via surrogate in December 2021. Musk’s twins with Zilis were born a few weeks earlier than Y, in November 2021.

The “Kill v. Maim” singer referred to Musk as her “boyfriend” in a Vanity Fair interview after Y’s birth, saying, “There’s no real word for it ... We’re very fluid. We live in separate houses. We’re best friends. We see each other all the time… We just have our own thing going on, and I don’t expect other people to understand it.”

After the story’s March 2022 publication, Grimes posted that the couple had since split again but that Musk was still her “best friend and the love of my life.”

Just weeks ago, other since-deleted tweets attributed to Grimes showed the singer referring to “the single most traumatic events of my life.” “A lot has been going on in my life for the last few years that I have mostly kept from the public,” she went on. “I may still do that out of respect for others, but the last few days rly taught me that without fundamental change I’m gna die from stress and my kids won’t be ok.”
There’s a Word for Blaming Jews for Anti-Semitism
Elon Musk’s conceit that Jews cause themselves to be persecuted is as old as anti-Jewish bigotry itself.
https://www.theatlantic.com/ideas/a...mitism-anti-defamation-league-twitter/675235/
Elon Musk has shattered the myth social media platforms are mere space providers
https://thehill.com/opinion/congres...ial-media-platforms-are-mere-space-providers/

Social media platforms have long argued that they are simply providers of a public forum in which others comment, and thus should not be held liable for what people say there. Elon Musk has single-handedly blown a massive hole in that bogus argument.

A new report today suggests that Elon Musk has unprecedented control over content moderation, and personally decided to re-platform Kanye West on X, formerly known as Twitter. It also reveals that Musk ordered his team to make his own posts among the platform’s most visible. The report further details how, at the request of his top venture capitalist pal, he told his engineers to tweak that user’s feed, a favor not available to any of the platform’s millions of other users.

With these acts of direct editorial control, Musk has made clear his platform is not a neutral “public forum” and should be held to the same rules as any other newspaper, publisher or broadcast network.

For over two decades, social media companies have hidden behind the legal protections conferred by Section 230 of the Communications Decency Act of 1996. This legislation, passed before social media companies existed, was designed to make the early “interactive” web manageable. It conferred protections on early web and news sites so they did not have to bear legal responsibility for the content posted on bulletin boards or comment sections, even if they engaged in some content moderation. It was a specific law designed before anyone ever imagined a Facebook, Reddit or TikTok.

Nearly a decade later, social media took off as a business in earnest, dispensing with original content and turning the aforementioned “comments” into a business. Social media companies aggregate these posts and repackage them into a tailored news feed designed to be as entertaining and addictive as possible. By interspersing advertisements with comments, they monetize these endless newsfeeds. This simple business model has proven hugely lucrative: 98 percent of Meta’s revenue comes from ads, and it has made Mark Zuckerberg a hundred billion dollars in personal wealth.

Hate and disinformation have an advantage in this environment. The repackaging, done by artificial intelligence, is designed to benefit the company by being as addictive as possible, exploiting psychological triggers and favoring content that enrages us or makes us want to react by posting more content ourselves.

The algorithms are also tailored to promote the owners of these companies and the values, politics and ideas that benefit them the most, as Musk has so explicitly demonstrated through his actions. Others have done the same: Mark Zuckerberg, for example, reportedly approved an internal initiative, Project Amplify, to promote positive stories about Facebook.

They are unequivocally, therefore, publishing companies, and what the user consumes is a result of decisions taken by executives for their own benefit, economically or politically. And yet thanks to the “get out of jail free” card of Section 230, enacted eight years before Facebook was even started, these companies cannot be held liable as publishers in any way for the hate, antisemitism, and disinformation that they push to billions. No other person or company in America is free from accountability or responsibility for its core product in such a way.

It is clear from the research my organization, the Center for Countering Digital Hate, publishes that social media can be harmful. Extremists openly proselytize, recruit, finance and plan operations on these platforms with little intervention. Algorithms promote dangerous eating-disorder and self-harm content to teenagers. Algorithms cross-fertilize conspiracist movements, giving QAnon followers anti-vaxx content and vice versa. Trolling and abuse are rife on these platforms, forcing women, minorities and LGBTQ+ people to restrict their own posts so as to avoid a torrent of abuse.

In 2022, we gathered lawmakers from the U.S., UK, Canada, Australia, New Zealand, and the European Union to talk about how we might develop a set of laws that would allow us to hold these companies accountable. We all agreed that social media companies quite clearly have a significant impact on our psychology, especially that of our children, our communities, our politics and even the values that underpin our democracy.

At the end of the conference, we published our STAR Framework, which provided a comprehensive set of minimum standards for an effective regulatory framework that balances free speech with human rights. The framework demands Transparency of algorithms, decisions on how companies enforce their “community standards,” and how commercial advertising shapes the content it presents, which would allow for meaningful Accountability to the public. It also requires companies to take Responsibility for avoidable, predictable harms they failed to deal with, which we hope will lead to a culture of Safety by Design.

Since that meeting, the European Union has passed the Digital Services Act, and the United Kingdom is shortly expected to pass an Online Safety Act that seeks to balance corporate rights with human and civil rights. The United States is unique in having failed to do so.

It is time Congress stopped vacillating and started acting. Our kids’ mental health and body image, the safety of our communities, the rights of vulnerable and minority groups, and even our democracy itself demand better. Hiding behind the notion that these vast companies are simply “free speech” platforms rather than publishers who shape public knowledge to their agenda is simply untenable in the face of reality.

Elon Musk, through his brazen stewardship of X, has, ironically, made a better case for the STAR Framework than we could ever have made alone.
This is great if it applies to all platforms, not just Twitter (X). Right now, it is the only platform with transparent algorithms.

The framework demands Transparency of algorithms, decisions on how companies enforce their “community standards,” and how commercial advertising shapes the content it presents, which would allow for meaningful Accountability to the public. It also requires companies to take Responsibility for avoidable, predictable harms they failed to deal with, which we hope will lead to a culture of Safety by Design.