personal back-testing setup

Discussion in 'Hardware' started by sle, Jan 1, 2018.

  1. sle

    sle

    In cross-sectional cases where there are relatively few assets and a lot of data, it's fairly easy to chunk it (many commercial/open-source libraries do that for you). In longitudinal cases (time series to time series, which is where I get these large matrices), it naturally becomes a "special case" since I only care about relatively few principal components. There are various tricks that have been worked out by people who deal with really big datasets (e.g. the genomics and image-analysis guys). It boils down to doing some sort of random subsampling or a "compression" of your matrix - I think I am using a compression-type algo, I don't remember where I got it from.
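
    A minimal sketch of this kind of randomized "compression" for extracting only the top few principal components, assuming scikit-learn's randomized_svd is available (sizes are illustrative, and this is not the specific code referred to above):

    # Top-k principal components of a large matrix via randomized SVD:
    # project onto a small random subspace, then take an exact SVD of the
    # compressed matrix instead of decomposing the full one.
    import numpy as np
    from sklearn.utils.extmath import randomized_svd

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50_000, 2_000))     # stand-in for a large data matrix
    X -= X.mean(axis=0)                          # center before extracting components

    k = 10                                       # only a few components are needed
    U, S, Vt = randomized_svd(X, n_components=k, n_oversamples=10, random_state=0)

    components = Vt                              # (k, n_features) principal axes
    scores = U * S                               # (n_samples, k) projections
    explained_var = S**2 / (X.shape[0] - 1)      # variance captured by each component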
     
    #21     Jan 2, 2018
  2. Mysteron

    Mysteron

    BS
     
    #22     Jan 2, 2018
  3. sle

    sle

    You have a better numerical recipe you'd like to suggest, I take it?
     
    #23     Jan 2, 2018
  4. Mysteron

    Mysteron

     
    #24     Jan 2, 2018
  5. sle

    sle

    I was specifically asked by @truetype how I manage to perform SVD on very large matrices to get principal components. I gave a general answer because (a) I don’t know enough details as I am using someone else’s code and (b) this particular calculation is not particularly important for me any more as that strategy has bled out and died. If that failed to impress, I am sorry and will try harder the next time :)
     
    #25     Jan 2, 2018
    IAS_LLC likes this.
  6. sle

    sle

    So, stupid question - with the new hardware exploit, Xeon chips should be cheaper now, right?
     
    #26     Jan 5, 2018
  7. Easy_Test

    Easy_Test

    I built a back-test tool using Python and VBA. It's really easy to use, and when you finish running the back test the data is in Excel, so you can review every single trade to make sure the test ran correctly. I have been using it with my SPX data, which goes back to 1996. If someone has stock data to test earnings trades, I'm up for a trade.

    business1031e@yahoo.com
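
    A minimal sketch of this kind of Python-to-Excel hand-off, assuming pandas with the openpyxl writer installed (the trades and column names are illustrative, not the actual tool described above):

    # Dump a back-test trade log to Excel so every trade can be reviewed by hand.
    import pandas as pd

    trades = pd.DataFrame({
        "entry_date": ["2018-01-03", "2018-02-01"],
        "exit_date":  ["2018-01-17", "2018-02-15"],
        "symbol":     ["SPX", "SPX"],
        "strategy":   ["iron_condor", "put_spread"],
        "pnl":        [420.0, -180.0],           # illustrative numbers only
    })

    # One sheet per strategy keeps the manual trade-by-trade review manageable.
    with pd.ExcelWriter("backtest_results.xlsx", engine="openpyxl") as writer:
        for strategy, group in trades.groupby("strategy"):
            group.to_excel(writer, sheet_name=strategy, index=False)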
     
    Last edited: Apr 6, 2018
    #27     Apr 6, 2018
  8. No, they are not. Just because something that everyone assumed to be perfect until now turned out to be imperfect does not affect demand, especially as there really is no replacement product that does better.

    Overall, I have followed this thread with interest, but I am completely confused about what you are trying to achieve. You talk about certain budgets, but at the same time talk about 1 TB of RAM. Are you aware that 128 GB of memory alone costs around 900 to 1,000 USD, even used? Also, do you mind elaborating on why you need so much memory? I have been in the quant business for over 16 years now and it's not clear to me which back test or matrices would require so much memory. I also consider myself pretty proficient in ML and DNNs and have not come across any task that would require such an amount of RAM, especially not for the models and data sets that you described. If you are looking for serious advice, it would be useful to give more specifics and drop the mumbo jumbo, as another user rightly pointed out. Not trying to be condescending, just reflecting an honest impression.
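
    A rough illustration of the memory arithmetic behind that question (sizes are illustrative, not anyone's actual workload):

    # Back-of-the-envelope: memory footprint of a dense float64 matrix.
    def dense_matrix_gb(rows: int, cols: int, bytes_per_element: int = 8) -> float:
        """Size of a dense matrix in GB (1 GB = 2**30 bytes)."""
        return rows * cols * bytes_per_element / 2**30

    print(f"{dense_matrix_gb(5_000, 5_000):.2f} GB")      # ~0.19 GB: fits anywhere
    print(f"{dense_matrix_gb(200_000, 200_000):.0f} GB")  # ~298 GB: needs huge RAM or chunking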

     
    #28     Apr 11, 2018
  9. IamaMars

    IamaMars

    Wow, that's quite a setup. Then make sure you go for the fastest DDR4 your motherboard supports, as well as something like an NVMe SSD on PCIe x4; that way it will be really fast. I do not understand why people don't use these things.
     
    #29     Jun 10, 2019