Here you can find the article that Thomas Sterling mentioned during his interview with supercomputingonline.com: http://www.sciam.com/article.cfm?articleID=000E238B-33EC-1C6F-84A9809EC588EF21
Maybe this will help a bit; go here: http://www.scyld.com. I built my own Beowulf cluster for a pet project (handicapping thoroughbreds) about a year ago with this OS. The user community is vast, friendly, and helpful. My main cluster was built on a Pentium III multi-processor platform; it currently uses four machines and operates beautifully. I also have a three-machine Beowulf cluster (Compaq ProLiant servers, dual-processor based) for other side uses. I bought the machines through eBay for about $350 each. Maybe this project will provide a nice challenge for your skill set.
What are you doing that would require this type of computing resource? We process a lot of data and we don't need the kind of resources you suggest: most of our work can be done on small arrays of commodity boxes using our own proprietary software. Tell us exactly what you are proposing to do and then maybe someone can point you in the right direction. My lab computers approximate the capacity of the supercomputers I used in 1980... maybe you need a better algorithm. Yes, you can rent time, though you may need to wait, and I hope you have a lot of dollars to pay for it. Twelve years ago I paid $300 per hour plus additional transaction costs, and my proposal for using the machines had to be justified by peer review.
Perhaps I do need a better algorithm. I don't have the time to refine one, though, so if it isn't precise then hopefully the brute force will cover any small inefficiencies. I have a couple of projects in mind: I want to run a lot of volatility studies, test several systems, and test out a theory I have on chaos theory (but the last one isn't really important). A lot of it is statistical analysis based on specific parameters. I don't need to do it right now, but I was curious what the cost was. I don't know how many megaflops my Pentium IV delivers, but perhaps a few gigaflops is all I need; I don't have anything to compare it with. Worst-case scenario: it tears through all my analysis and I have an extra half-hour to play Quake.
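For a rough sense of scale: the core of a basic volatility study is only a few arithmetic operations per data point, so even a single desktop CPU gets through millions of rows in well under a second. A minimal sketch (illustrative names, not anyone's actual system) computing the sample standard deviation of a return series:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Sample standard deviation of a return series -- the heart of a
// simple volatility study. One pass, two accumulators per element.
double sampleStdDev(const std::vector<double>& returns) {
    double n = static_cast<double>(returns.size());
    double sum = 0.0, sumSq = 0.0;
    for (double r : returns) {
        sum += r;        // running total of returns
        sumSq += r * r;  // running total of squared returns
    }
    double mean = sum / n;
    // Unbiased (n - 1) estimator of variance, then square root.
    return std::sqrt((sumSq - n * mean * mean) / (n - 1.0));
}
```

Counting flops for a loop like this (roughly four per element) is also a quick way to estimate whether a workload actually needs more than one machine.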
Don't you think the big mutual funds, with unlimited money, have already done this? And yet most can't even beat the S&P. Something to think about.
Listen, I am not really sure what you're trying to program in, but from what I understand you have huge lumps of data that need to be analyzed. I am sure there is a way to do that on your regular PC if you code it right. To start with, try segmenting the data set into matrices and then having a matrix of matrices that passes on the major variables as you see fit. Basically, if you're going to do some sort of regression analysis over several factors, try to do it through multi-dimensional arrays if you don't know a better way to structure it in C++. I think you should work on the algorithm first before spending big bucks on "supercomputers"; if your algorithm is inefficient, even the supercomputer won't be able to crunch it. AND... forget VBA if that's what you're using. Good luck. M.
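The segment-the-data idea above can be sketched in a few lines of C++: fold each small segment (a two-column matrix of x/y observations) into running sums, so a data set far bigger than memory still yields an exact least-squares fit. This is an illustrative sketch under that assumption; the struct and names are made up for the example, not from anyone's actual code.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Incremental least-squares fit of y = a + b*x. Each call to addSegment
// consumes one small chunk, so the full data set never sits in memory.
struct RunningFit {
    double n = 0, sx = 0, sy = 0, sxx = 0, sxy = 0;

    // Fold one segment (rows x 2 array of {x, y} pairs) into the sums.
    void addSegment(const double (*seg)[2], std::size_t rows) {
        for (std::size_t i = 0; i < rows; ++i) {
            double x = seg[i][0], y = seg[i][1];
            n += 1; sx += x; sy += y; sxx += x * x; sxy += x * y;
        }
    }

    // Closed-form solutions to the normal equations for one regressor.
    double slope() const     { return (n * sxy - sx * sy) / (n * sxx - sx * sx); }
    double intercept() const { return (sy - slope() * sx) / n; }
};
```

Because only the five sums are carried between segments, the result is identical no matter how the data is chunked; the same trick extends to several factors by keeping a small matrix of cross-products instead of five scalars.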
Dead on. Size matters. Sometimes it's a hindrance. Remoras thrive on stuff that couldn't sustain sharks.
I talked it over with the rest of us and we all agreed that we don't like to think... Hurts our head...