AI has arrived, at least publicly

Discussion in 'Economics' started by Illum, Aug 18, 2011.

  1. Illum

    I have heard of computer programs that could adapt, but now it's in a chip. They claim it can learn, problem-solve, and come up with its own solutions. The military is interested, of course.

    How many more "thinker" problem-solvers will soon be out of work?
    -----------------------------

    IBM unveils chips that mimic the human brain

    IBM has unveiled a new experimental computer chip that it says mimics the human brain in that it perceives, acts and even thinks.

    It terms the machines built with these chips "cognitive computers", claiming that they are able to learn through experience, find patterns, generate ideas and understand the outcomes.

    In building this new generation of chips, IBM combined principles of nanoscience, neuroscience and supercomputing.

    It has been awarded $21m (£12.7m) of new funding by the Defense Advanced Research Projects Agency (DARPA) for the next phase of the project, which it terms "Systems of Neuromorphic Adaptive Plastic Scalable Electronics" (SyNAPSE).

    "This is a major initiative to move beyond the von Neumann paradigm that has been ruling computer architecture for more than half a century," said Dharmendra Modha, project leader for IBM Research.

    Modha added that the chip may see applications in business, science and government.

    "Future applications of computing will increasingly demand functionality that is not efficiently delivered by the traditional architecture.

    "These chips are another significant step in the evolution of computers from calculators to learning systems, signalling the beginning of a new generation of computers and their applications in business, science and government."

    IBM states that the chips, while certainly not biological, are inspired by the architecture of the human brain in their design. Digital silicon circuits make up what it terms the "neurosynaptic core".

    The scientists have built two working prototype designs. Both cores contain 256 neurons, one with 262,144 programmable synapses and the other with 65,536 learning synapses. The team has successfully demonstrated simple applications like navigation, machine vision, pattern recognition, associative memory and classification.
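
    Those synapse counts line up with a simple crossbar reading -- my own back-of-the-envelope check, not something the article spells out: 262,144 is 256 x 1024, and 65,536 is a full 256 x 256 all-to-all grid.

    # Back-of-the-envelope check on the reported synapse counts,
    # assuming a crossbar layout (my reading, not stated in the article).
    neurons = 256

    # Programmable core: 262,144 synapses = 256 neurons x 1024 inputs.
    programmable = neurons * 1024
    assert programmable == 262144

    # Learning core: 65,536 synapses = a full 256 x 256 all-to-all grid.
    learning = neurons * neurons
    assert learning == 65536

    print(programmable, learning)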

    But what are the potential real-world applications of this technology? Tsunami warnings for one, claims IBM:

    "A cognitive computing system monitoring the world's water supply could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making," said IBM in a statement.

    Going slightly more Minority Report, IBM goes on to suggest an instrumented glove that a grocer could use to flag bad or contaminated produce.

    On a more practical note, IBM has said that this technology could result in computers that take up far less space and use less power than those in use today.

    http://www.computing.co.uk/ctg/news/2102735/ibm-unveils-chips-mimic-human-brain
     
  2. I don't see any amazing new architecture here. The basic practice of putting neurons on a silicon chip was addressed decades ago by Carver Mead and his group at Caltech. Nor would I suspect that the underlying circuits are all digital.

    The key, IMO, is what underlying algorithms are being used and what exactly they do -- that's the challenge.
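
    For a sense of what "neurons on silicon" usually compute, here is a minimal leaky integrate-and-fire neuron in Python -- a standard textbook model, not IBM's actual circuit, which the article does not describe:

    # Minimal leaky integrate-and-fire (LIF) neuron: a standard textbook
    # model of the kind neuromorphic hardware implements. Illustrative
    # only; IBM's actual circuit is not described in the article.
    def lif_run(inputs, leak=0.9, threshold=1.0):
        """Leakily integrate input each step; emit a spike and reset
        when the membrane potential crosses the threshold."""
        v = 0.0
        spikes = []
        for x in inputs:
            v = leak * v + x      # leaky integration
            if v >= threshold:    # threshold crossing -> spike
                spikes.append(1)
                v = 0.0           # reset after firing
            else:
                spikes.append(0)
        return spikes

    # Constant drive of 0.3 per step: the neuron charges up and fires
    # periodically, every fourth step here.
    print(lif_run([0.3] * 12))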
     
  3. As dtrader98 says, there does not seem to be anything radically new here, though there probably are some advances not obvious from the article.

    You can do this stuff in software, but the main point of doing it on special-purpose silicon is that it will be MUCH faster.
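
    To make that concrete: in software, one time step of a synaptic crossbar boils down to a matrix-vector multiply, which a CPU grinds through a few multiply-accumulates at a time, while dedicated silicon can evaluate every synapse in parallel. A rough sketch of the software side, with the 256 x 256 size taken from the article and everything else assumed:

    import numpy as np

    # One simulated time step of a 256-neuron core: synapses as a
    # 256 x 256 weight matrix (the article's learning-core size),
    # the update as a matrix-vector product. Weights, spike rate and
    # the threshold rule below are all made up for illustration.
    rng = np.random.default_rng(0)
    weights = rng.normal(scale=0.1, size=(256, 256))  # synaptic weights
    spikes = (rng.random(256) < 0.05).astype(float)   # who fired last step

    membrane = weights @ spikes   # total synaptic input per neuron
    fired = membrane > 0.2        # simple threshold rule
    print(int(fired.sum()), "neurons fired this step")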
     
  4. Not quite yet. If you REALLY want to read something thought-provoking...go check out a book named "Forbidden Gates" by Dr. Tom Horn...you will NOT be the same after reading it! You are looking for something called the "Singularity" - Horn discusses it in the book. The word "GRIN" will never mean the same to you...Genetics, Robotics, Artificial Intelligence, Nanotechnology.

    http://www.amazon.com/Forbidden-Gat...=sr_1_2?s=books&ie=UTF8&qid=1313735649&sr=1-2


    -gastropod
     
  5. Humpy

    They reckon Washington could soon be implanting these chips into Govt departmental staff.

    More neurons per moron
     
  6. Illum

    I'll check it out. I hear people talk about the Singularity, but I have to be honest, mostly what I hear is silly YouTube vids, McKenna, etc. I can, though, see for myself how our knowledge is growing at an exponential rate. If we could just stop being animals for two secs we might... get there.
     
  7. ROFL! :D
     
  8. As a forewarning...that book does have a "religious" inclination to it. I personally do "believe," so I find that OK. Some who do not believe...may not like the inclination. If you do check out the book...the Singularity will have a TOTALLY different meaning to you. I agree...a lot of what you see is "kiddies" on YouTube. That book describes the state of the art in the various technologies.

    Ciao,
    gastropod
     
  9. vanv0029

    Among philosophers of computer science, AI is seen as a dogmatic religion. The skepticism is coming to a head because 2012 is the 100th anniversary of Turing's birth, so there are numerous conferences, mostly in the UK. The logic underneath AI, such as Zermelo-Fraenkel set theory and von Neumann's formalization of quantum physics, is also probably wrong. Some references: "The Trouble with Physics" by Smolin, "The Emperor's New Mind" and "Shadows of the Mind" by Penrose. Also the yearly proceedings of the International Association of Computing and Philosophy conference - the IACAP Conference.