But that's the point, my friend. An AI shouldn't make a determination of who or what is worthy. The moment it does, it is biased. Then someone just has to create an AI that is conservatively biased, and we're back at the beginning - just replace "people with narratives" with "AI with narratives".
Oh, I concede everything has a bias. Plants are biased to grow toward the sun, birds are biased to migrate south, etc. Facts are biased toward the truth. These are hard things when we are on the side of the shade, or the cold north, or a lie. My hypothesis was that ChatGPT is biased against Republicans, but it seems it's biased against Trump and his ilk (Lil' Rocket Man and that sort), which is a big difference. My guess is that ChatGPT is taking into account all of Trump's ills and rightly putting him in league with failed wannabe dictators for the January 6th fiasco and the stolen-election lies. How you all reconcile that is y'all's business. Many people still deny January 6th was an insurrection and believe Skinny Joey sold Democrats a million votes in Pennsylvania.
You can theorize whatever you'd like as to why the AI says what it says, but an AI that is supposed to function in an unbiased manner shouldn't do it at all. Plants and birds and the other examples - that's not bias in the sense we are speaking of, so please don't obfuscate. That's evolutionary bias, not conversational bias. The AI should merely carry out the task its human master asks it to do. I suspect I know how you'll reply to this, but hopefully I'm wrong.
I don't care about ChatGPT per se. I just ran a little experiment because I found it interesting, and posed a hypothesis. As I said, everything will have a bias. But whether AI should have a bias is a big question. As AIs grow, I'm sure they will play some role in things like food and water. I would kind of like for one not to merrily follow some order to poison humans.
That's exactly where I thought you'd go - some human orders an AI to commit an illegal action. That's no different than some human using a tool to commit an illegal action (a gun to kill someone, a hammer to smash someone's windows, etc). There are laws already in place that would/should prohibit the illegal use of an AI, just as there are laws to prohibit the illegal use of any other tool. The difference here is that this tool is designed to provide information, and it shouldn't be used to censor that information to provide a slant or narrative based on what its creators believe. Let's see if I can guess what you'll say next using TsingGPT.
To be clear, I have no skin in this game. I couldn't care less if ChatGPT stays up or goes down, or if it is biased left or right. I thought it would be amusing to check whether it was biased against Republicans. It was not. Go check - I didn't find any other Republican it would not write an ode to. Well, I did not check Denny Hastert, so that may not be true. Now we are getting into whether there should be a law or regulation, because you don't see the difference between a gun and a knife, so you think your posit is reasonable and sound. It's not! AI is not a screwdriver, just as a nuclear weapon is not a gun. I will tell you what I think of all of this, and it's the same for most things: I see issues like AI the same as nuclear energy - it can light up the world or destroy it. What will make the difference is how we as people use it. Some people will advocate we make chat AI so restricted as to not offend anyone, and others will demand chat AI be so open that everyone is offended. That really doesn't matter to where AI is going. It's going nuclear. I care more about the bigger issues associated with it. This stuff isn't Twitter.
Perhaps, but bias toward what - the truth? This is an issue about Trump as far as I can tell, not Republicans.