The Future of Computer Technology

Discussion in 'Hardware' started by aphexcoil, Dec 21, 2002.

  1. The software is so far behind the hardware now that I fully expect to keep my 1800 AMD with 768 MB of RAM and not upgrade until I buy a new computer four years from now. I might drop another 512 MB module in the box and take it up to a gig of RAM, but no way will I use up 80 gigs of hard drive. By the way, if you have less than 512 MB of RAM, go buy some; it's cheap right now.
     
    #11     Dec 23, 2002
  2. http://www.transtopia.org

    Ok, so a few weeks ago I stumbled upon this link to a cult's website. These people basically believe that because computing power is growing exponentially, in the next 20-50 years we'll see technology grow at a rate beyond our imaginations. They believe technology will accelerate to a climax when everything changes, ending in something like a Terminator/Matrix scenario or a nanotechnology accident. They call this moment the singularity. Anyway, they are basically waiting for the singularity, and in the meantime advocate complete hedonism and getting as rich as possible through "high yield investments and Multi-level marketing."

    Now I originally browsed through the site for a good laugh, but I'm rather ashamed to admit these people have some interesting philosophies. If you're into philosophy then you'll find these people's elaborately explained views very stimulating, whether you agree with them or not.

    I thought their predictions were absurd at first, but the more I look into it, the more I believe that in the near future advanced computer technology will change everything.

    Browse through the site. It's really interesting. Very libertarian. Very different from anything you've ever seen.

    Come on, boys! It's cult time!
     
    #12     Dec 24, 2002
  3. Be careful there. 80 gigs is really not that much as space goes. Sure, maybe it looks like plenty of open area now, but if you plan on keeping pace with everything out there, just wait: movies online, songs, ever-increasing sizes of basic programs. I can remember when we thought a 20 gig drive would be adequate for everything essential. I am now looking at the server I just added two 120 gig drives to and realizing that's nearly a quarter of a terabyte of new storage space. Too cool! :)
     
    #13     Dec 24, 2002
  4. Happy holidays to you and yours too! As a matter of fact, may the New Year bring better trading opportunities to you all! :)
     
    #14     Dec 24, 2002
  5. 1. Snide comment: MSFT will gladly absorb every CPU cycle and byte of memory that you throw at them. Do computers of today really feel that much faster than those of yesteryear (except in handling multimegabyte image files)?

    2. Low-value applications rule: Cheap computing and storage lets me take and keep thousands of 5-megapixel photos of friends, family, and travel destinations. Do I really need this? Do I really need to walk around with thousands of songs on an iPod? Some may claim that computing and storage are wasted on too many low-value applications. I say the real power of cheap computing is its ability to improve an ever-widening list of low-value applications.

    3. Software vs. hardware: Which is cheaper: taking another 10% out of the software, or adding another 10% of performance to the hardware? Sadly, the productivity of software is NOT subject to Moore's Law. Thus it is economically far more efficient to boost computing performance than it is to boost computing efficiency. (It's also an excuse for sloppy code, but that's another story.)

    4. IRAM: Interconnect speeds don't follow the same scaling -- bus speeds cannot keep pace with either the growth in storage or the growth in clock speeds. If you look at the history of computers, you see that early machines had a 1:1 memory-to-CPU connection; then they started adding wait cycles, then clock doubling, then cache, etc. The ratio between clock speed and memory speed gets worse every year. One proposed solution is IRAM (Intelligent RAM), in which a few million of the hundred-some-odd million transistors on a RAM chip are used as a CPU. The result is a multiprocessing architecture with very fast memory access.

    5. Computer power > human brain: Besides the cult, you might want to check out "Mind Children" by Hans Moravec. This is a more serious look at the question of when/if computers become smarter than people. BTW, IBM's ASCI Purple will, at 100 teraflops, be approaching the lower estimates of human-brain-equivalent computing power.

    Happy Holidays,
    Traden4Alpha
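    The widening processor-memory ratio in point 4 can be sketched with a toy model. The growth rates below are the oft-quoted historical figures (roughly +60%/year for CPU performance, +7%/year for DRAM speed); they are illustrative assumptions, not measurements:

```python
# Toy model of the widening processor-memory gap from point 4.
# Growth rates are commonly cited historical averages -- assumptions,
# not data: CPU performance ~+60%/year, DRAM speed ~+7%/year.
def gap_after(years, cpu_growth=1.60, mem_growth=1.07):
    """How many times further ahead the CPU is after `years` years."""
    return (cpu_growth ** years) / (mem_growth ** years)

for y in (1, 5, 10):
    print(f"after {y:2d} year(s): gap has grown {gap_after(y):5.1f}x")
```

    At those rates the gap grows roughly 1.5x every year, which is why each generation needs deeper caches, and why an IRAM-style design that puts the CPU on the RAM die looks attractive.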
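    For scale on point 5, here is one common back-of-the-envelope count of raw brain "throughput." Every figure is a rough order-of-magnitude assumption seen in this debate, not Moravec's actual method, and published estimates span several orders of magnitude:

```python
# Crude synaptic-event-rate estimate; all figures are rough,
# order-of-magnitude assumptions, not established numbers.
neurons = 1e11               # ~100 billion neurons
synapses_per_neuron = 1e3    # low-end estimate of connections per neuron
avg_firing_rate_hz = 1e2     # generous average firing rate

synaptic_events_per_sec = neurons * synapses_per_neuron * avg_firing_rate_hz
asci_purple_flops = 100e12   # ASCI Purple's 100 teraflops

print(f"synaptic events/s : {synaptic_events_per_sec:.0e}")
print(f"ASCI Purple FLOPS : {asci_purple_flops:.0e}")
print(f"ratio             : {synaptic_events_per_sec / asci_purple_flops:.0f}x")
```

    By this crude count the machine is still about two orders of magnitude short, which is consistent with "approaching the lower estimates."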
     
    #15     Dec 24, 2002
  6. CalTrader

    CalTrader Guest

    Point 4 has always been around: like point 3, it's a cost issue. It's cheaper to build around commodity chips than to re-engineer the systems.

    Points 3 and 5 go together: that is, you can build a very fast architecture, but without high-performance, highly innovative software, all the speed and bandwidth only incrementally improve problem solving. The key point is item 3. Few people know how to write effective and innovative code for even the existing high-performance computing systems. The issue with these systems is a lack of understanding of how to effectively exploit the power: there is a vacuum of well-understood algorithms.

    The existing classes of algorithms simply do not work well on high-performance systems, and IMHO the answer -- which is beginning to emerge as consensus -- is that fairly radical new approaches need to be employed to advance the state of the art. I think we will see quite a revolution over the next ten years in how we handle the implementation of high-performance software, and a lot of these ideas will filter down to ordinary business computing.
     
    #16     Dec 27, 2002
  7. Yeah, I just got Ghost Recon for Xmas, and it takes 1.5 gigs of HD space. OMG - that's it for the games for a while, LOL. :cool:
     
    #17     Dec 27, 2002
  8. Hard drives are still the weakest link, in my opinion. We're still using a mechanical device to store and retrieve information from a magnetic platter.

    Even the slowest electronic memory would run rings around today's hard-drives.

    I'd like to see some type of electronic storage device with a terabyte of storage, access times in the nanosecond range and data-transfer rates of over 10 gigs a second.

    There is no good reason why I should have to wait 30 seconds between the time I press the ON button and the time Windows XP is ready to rock'n'roll.
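    The wished-for transfer rate is easy to put in perspective. The 30 MB/s figure below is an assumed circa-2002 desktop drive speed, and the 500 MB of startup data is a made-up stand-in for a Windows XP boot:

```python
# Back-of-the-envelope streaming times at different transfer rates.
# 30 MB/s is an assumed early-2000s desktop HDD figure; 500 MB of
# startup data is a hypothetical stand-in for an XP boot.
def read_time_seconds(megabytes, mb_per_second):
    return megabytes / mb_per_second

boot_mb = 500
print(f"~2002 HDD (30 MB/s)       : {read_time_seconds(boot_mb, 30):6.1f} s")
print(f"wished-for 10 GB/s device : {read_time_seconds(boot_mb, 10_000):6.2f} s")
```

    In practice it's seek latency, not raw bandwidth, that dominates a boot: thousands of small scattered reads at several milliseconds each. That mechanical seek is exactly the part a solid-state device would eliminate.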
     
    #18     Dec 28, 2002
  9.  
    #19     Dec 28, 2002
  10. Those estimates of human brain computing power are based on unproven, highly dubious theories and assumptions about the functioning of the brain. If you haven't done so already, you might consider taking a look at the "quantum consciousness" work by Penrose and Hameroff for some alternative perspectives. Those authors and most serious cognitive scientists will admit that no one knows how the brain computes, if "compute" is really what it does.
     
    #20     Dec 28, 2002