If this is absolutely true, this news could cut even deeper than the DeepSeek news, and it could trigger a gigantic, chaotic whipsawing event across all of AI as we know it..... Someone experimented with a 1997 processor and showed that just 128 MB of RAM is enough to harness the power of AI. https://farmingdale-observer.com/20...-of-ram-is-enough-to-harness-the-power-of-ai/
Get real. You really believe 128 MB of RAM has any use in modern computing? You can't even load a decent Linux kernel with that little RAM. DeepSeek was itself a mirage. They were using smuggled H100s from Singapore. You really expect the Chinese to tell you the truth? People on ET will believe anything written on the net. Just like everyone was sure the hyperscalers would magically make a better GPU than Nvidia, you know, the company that created the industry and has a 25-year head start on everyone.
Always read news like this with a pinch of salt. Then you will never be conned or scammed. And you might even con the conman and scam the scammer.
So basically more media lies.... Dang it. Guess they'll have to pay $50,000 a pop for those Blackwells.
Running AI models and training them require very different amounts of compute. The article says the LLM they ran had 260,000 parameters. I had no idea how useful such a small LLM would be, so I asked the AI:

A language model with 250,000 parameters is extremely small by modern standards and would generally not be considered "good" for most real-world NLP tasks. Here's why: modern LLMs like GPT-3, GPT-4, Claude, or the LLaMA models typically have billions (and even trillions) of parameters. A model with 250k parameters is more like a toy model, useful for:
- Educational purposes
- Testing architectures
- Running in extremely low-resource environments (e.g., microcontrollers)
- Very simple or narrow tasks (like basic text classification or rule-based chatbots)
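For what it's worth, here is a rough back-of-the-envelope sketch (my own, not from the article) of why a ~260,000-parameter model trivially fits in 128 MB. The byte-per-weight figures are assumptions about how the weights might be stored (fp32/fp16/int8); the article doesn't say which format was used.

```python
# Rough estimate of RAM needed just to hold the weights of a tiny model.
# The 260_000 figure is the parameter count quoted in the article;
# the storage formats below are assumed for illustration.

def weight_memory_mb(num_params: int, bytes_per_param: int) -> float:
    """Approximate weight storage in megabytes."""
    return num_params * bytes_per_param / (1024 ** 2)

if __name__ == "__main__":
    params = 260_000
    for label, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
        print(f"{label}: ~{weight_memory_mb(params, nbytes):.2f} MB")
    # Prints roughly: fp32 ~0.99 MB, fp16 ~0.50 MB, int8 ~0.25 MB.
    # So the weights alone are a rounding error next to 128 MB; the catch,
    # as the AI answer above says, is that a model this small can't do much.
```

In other words, the "128 MB is enough" headline isn't really about squeezing modern AI onto old hardware; it's about running a toy-sized model whose memory needs were never the bottleneck in the first place.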