The Future of Computer Technology

Discussion in 'Hardware' started by aphexcoil, Dec 21, 2002.

  1. Well...then come back tomorrow...dude...

    Oh yeah, and Happy New Year back to T4A and to you and everyone else, and thanks again for first recommending the Penrose book to me however many aeons ago.
     
    #31     Dec 31, 2002
  2. KymerFye,

    Yes, long-term biological evolution certainly contributed much to intelligence. Yet a newborn creature is a remarkably blank slate. If that newborn is not allowed to interact with the world, it will never develop the nifty cognitive skills that we call intelligence (or even basic functioning). Indeed, the brain is so plastic that even lower-level functions like vision are NOT hardwired into it. For example, if you prevent a kitten from interacting with the world, even if you show it moving images of the world, the kitten's brain never learns to see and the animal will grow up totally blind. What evolution has done is create a powerful adaptive learning system that is primed, but not preloaded, to learn vision, language, motor control, intelligence, etc. Indeed, the system is so unspecialized that it can reroute around damage and borrow neurons from unrelated systems (e.g., use the visual cortex to process auditory signals).
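
    (To make "primed, but not preloaded" concrete, here is a toy Python sketch. It is only an illustration, nothing like real neurons: the wiring and the learning rule are fixed in advance, but every bit of competence comes from exposure to examples. The AND task and all the parameters are arbitrary choices for the demo.)

        # Architecture and learning rule are given (primed); all the
        # "knowledge" (the weights) must come from experience (not preloaded).
        import random

        def train_neuron(examples, epochs=50, lr=0.1):
            # Small random weights: ready to learn, but knowing nothing yet.
            w = [random.uniform(-0.5, 0.5) for _ in range(3)]  # 2 inputs + bias
            for _ in range(epochs):
                for (x1, x2), target in examples:
                    out = 1 if w[0]*x1 + w[1]*x2 + w[2] > 0 else 0
                    err = target - out
                    # Adjust only in response to interaction with the "world."
                    w[0] += lr * err * x1
                    w[1] += lr * err * x2
                    w[2] += lr * err
            return w

        # Learn logical AND purely from examples.
        AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
        w = train_neuron(AND)
        for (x1, x2), _ in AND:
            print(x1, x2, '->', 1 if w[0]*x1 + w[1]*x2 + w[2] > 0 else 0)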

    As to interest in creating "intelligent" computers, the cost (and value) of human labor produces a powerful economic, social, and personal incentive to off-load tasks from people to computers. Companies would love to replace their costly call center personnel with Dell servers (complete with Genuine People Personalities from the Sirius Cybernetics Corporation). And I'd bet that most people would jump at having a "personal secretary" that takes care of all the routine administrivia of daily life (taxes, bills, coordinating schedules, finding the right gift, responding to personal letters, posting responses on ET, etc.). It is then a small step from reactive, passive computer-based intelligent systems to proactive, autonomous ones. I would love a computer that automatically spots and resolves billing disputes with my phone company.

    You are right about people becoming cyborgs. I know that my Psion 5MX is like the third hemisphere of my brain. And you are probably right that a computer-augmented person will always beat a stand-alone computer. The interesting scenarios occur when the computer-augmentation side is given some autonomy to solve small problems by itself (e.g., resolving a billing dispute or making the "best" travel arrangements). Then the scenario starts looking more like the "personal secretary" concept than a cyborg.

    As to whether machines start reproducing, that has already occurred. Software publishing, software piracy, and computer viruses are all examples of the reproduction of the "fittest" bits of code. The notion of machines physically reproducing seems less likely, although increasing factory automation at Intel and Dell brings that scenario closer to reality.
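
    (For a concrete, if cartoonish, picture of "reproduction of the fittest bits of code," here is a toy genetic-algorithm sketch in Python. Everything in it - the bit-counting fitness function, the population size, the mutation rate - is an illustrative assumption, not a model of real software or viruses.)

        # Toy "survival of the fittest bits": strings are copied in
        # proportion to fitness (here, just the number of 1-bits),
        # with occasional mutation. Purely illustrative parameters.
        import random

        def evolve(pop_size=20, length=16, generations=40, mutation=0.02):
            pop = [[random.randint(0, 1) for _ in range(length)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=sum, reverse=True)      # rank by fitness
                survivors = pop[:pop_size // 2]      # the "fittest" persist
                pop = []
                while len(pop) < pop_size:
                    child = list(random.choice(survivors))  # reproduction
                    for i in range(length):
                        if random.random() < mutation:      # imperfect copying
                            child[i] ^= 1
                    pop.append(child)
            return max(pop, key=sum)

        print(evolve())  # tends toward all 1s: the fittest code dominates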

    Thanks for the interesting ideas! :)

    Trade well,
    Traden4Alpha
     
    #32     Dec 31, 2002
  3. nitro,

    That is another area where Penrose is mistaken. My wife has studied cognition and creativity in inventors and scientists (she hasn't looked at mathematicians, so maybe they have quantum mechanical brains). The "Eureka" moment of discovery is a misperception. If one studies the lives and works of these people, one invariably finds that what looks like a quick flash of insight is actually backed by a rather long, methodical process of incremental thoughts and prior discoveries. (In one example, a scientist "discovered" something two years after they had written the answer in their lab notebook.)

    That people experience the solving of a problem as a swift stroke probably says more about the relationship between the conscious and unconscious mind than it does about how the brain actually solves tough problems. The unconscious mind plods incrementally along on a problem, and when it finds a solution, the result bubbles up into consciousness in what seems like a short period of time. No quantum mechanical instantaneous exploration of an infinite number of paths is needed.
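
    (If it helps, here is a toy Python sketch of that "plodding" picture: a purely incremental trial-and-error search that quietly accumulates small gains and only "announces" the finished answer at the end. The target word and the acceptance rule are arbitrary choices for the demo, not a model of the brain.)

        # Incremental search: one tiny change at a time, keep what helps.
        # The final answer "pops out" suddenly, but the work was gradual.
        import random

        ALPHABET = 'abcdefghijklmnopqrstuvwxyz '

        def score(guess, target):
            return sum(a == b for a, b in zip(guess, target))

        def incremental_search(target='eureka'):
            guess = [random.choice(ALPHABET) for _ in target]
            steps = 0
            while score(guess, target) < len(target):
                steps += 1
                i = random.randrange(len(target))
                old, new = guess[i], random.choice(ALPHABET)
                guess[i] = new
                # Revert changes that make things worse (keep small gains).
                if (new == target[i]) < (old == target[i]):
                    guess[i] = old
            return steps

        print('solved after', incremental_search(), 'small steps')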

    I'm looking forward to your other comments, when you have time.

    All the best,
    Traden4Alpha
     
    #33     Dec 31, 2002
  4. nitro

    YW.

    nitro
     
    #34     Jan 1, 2003
  5. The only good thing about modern computers (meaning hardware and software) for me is that I can trade. But in business, other than that, they are way slower than my IBM 286 machines running simple IBM business software. A case in point, in my view, is a word-processing program called IBMPTS. Sure, it couldn't do anywhere near what Word and WordPerfect can do, but it could create a fantastic-looking business letter in a fraction of the time. And man, do I miss those fonts on the old IBM printers. That typeface and those ribbons would produce the darkest black type you ever saw - not these meek letters that modern printers spew out. Training a new employee took 15 minutes at most. Anybody ever get stuck with an employee that doesn't know Word? What did that cost you? Production-wise, if you tended to use variations on the same forms and letters, I'd guess it was 5 to 10 times faster than the above programs. The filing system was great, and you could find your stuff in a fraction of the time it takes to dig through Word. Even after the modern age started, I was buying up wheel printers and storing them in garbage bags in the attic. Then there was that awful day the suppliers, Quill and Lexmark, told me the ribbons (printwheels, they called them then) were being discontinued. In the end, I remember wire-transferring $250 to some office supply place in Denmark to buy what weeks of searching on the net told me were the last ten ribbons in the world. Ahh, and at the end of the day, you didn't "exit programs" - you reached over and turned the switch off, just like the lights...

    Geo.

    'Course I found those printwheels in Denmark using a Compaq and the Internet! I'll admit that. :cool:
     
    #35     Jan 1, 2003
  6. Well, I'm not sure you fully got the point about biological imperatives, and it's probably my own fault for being too elliptical. Unfortunately, exploring all of the different ways that the functioning of the brain - from the constitution and early development of neurons, through the behavior and organization of "mid-level" autonomic processes, to the generation and emergence of thoughts in the mature human mind - appears to recapitulate the process of natural selection, and then further speculating on the potential implications of this perspective vis-a-vis artificial intelligence projects, would require more space and effort than I can or should, I think, attempt on this thread.

    If you're serious about investigating this aspect of the subject further, there is much material available. You might begin by looking through discussions of Neural Group Selection theory and Minsky's ideas about the "society of mind." I have found in my own reading and reflection on the subject that ideas from many different disciplines often come into play at once, even when one is attempting to focus merely on some seemingly simple part-process. I also think you might benefit from giving Penrose's arguments another look, as I don't believe your summary does them justice.

    I also have enjoyed this discussion. Maybe we'll find some other place to continue it at some other time.
     
    #36     Jan 1, 2003
  7. CalTrader

    For typical business computing, most of the new program releases and OS releases just are not necessary. In fact, freeware like Linux, etc., on modern hardware provides a lot of computing power.
    Standardization does, however, cut training costs: most people know how to use Word unless they have no computer experience.
    It does not, however, cut licensing and hardware costs if a company needs to stay "current."

    The computer industry is based upon "churn" - that is, a continuous stream of upgrades and thus charges. This includes training. The non-freeware software companies are now all changing their licensing mechanisms to grab more revenue since the "churn" rate in the computer industry has slowed. It remains to be seen whether these changes will drive small and midsized businesses to freeware alternatives.
     
    #37     Jan 2, 2003
  8. Josh_B

    A "Trillion" Computers In A Single Drop Of Water

    http://www.spacedaily.com/news/nanotech-01o.html



    R&D 2002: Quantum Computing

    Microsoft Research’s theoretical mathematics program breaks the mold with research into a strategy to make quantum computing work.

    http://www.technologyreview.com/articles/voss1202.asp


    Are we closer to a quantum computer, according to the above article?

    Previously: astonishing speed, but also problems getting the info out...

    ...Hence with one fell swoop, one tick of the computer clock, a quantum operation could compute not just on one machine state, as serial computers do, but on 2^500 machine states at once! Eventually, however, observing the system would cause it to collapse into a single quantum state corresponding to a single answer, a single list of 500 1's and 0's, as dictated by the measurement axiom of quantum mechanics.
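
    (As a back-of-the-envelope illustration of that quote, here's a classical Python simulation of the bookkeeping - not a quantum computer, obviously. Ten qubits stand in for the 500; the uniform superposition and the measurement rule are the only physics in it.)

        # Classical bookkeeping for the quote above: n qubits in uniform
        # superposition carry 2**n amplitudes at once, but measuring the
        # register yields just ONE n-bit string, per the measurement axiom.
        # (10 qubits stand in for 500: a real 2**500-entry vector could
        # never be stored this way.)
        import random

        n = 10
        dim = 2 ** n                     # 1024 basis states
        amps = [1 / dim ** 0.5] * dim    # uniform superposition (all Hadamards)

        # "One tick of the clock" could act on all 2**n amplitudes at once,
        # but reading out collapses to a single outcome with prob |amp|**2.
        probs = [a * a for a in amps]
        outcome = random.choices(range(dim), weights=probs)[0]
        print(format(outcome, '0' + str(n) + 'b'))  # one list of n 1s and 0s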

    A good overview with additional explanations: http://www.cs.caltech.edu/~westside/quantum-intro.html#power

    There could be some interesting implications for trading systems alone...

    Josh
     
    #38     Jan 26, 2003