The ORATS thread

Discussion in 'App Development' started by TheBigShort, Feb 28, 2019.

  1. I finally pulled the trigger with ORATS (I bummed off their free trial twice; I was new to programming, so...). I love what Matt is doing and I think more people should be using his product. I have compared the ORATS data with Bloomberg and it's almost spot on with their BVOL (arbitrage-free) implied volatility surface. I actually think it is Bloomberg that is a little bit off rather than ORATS.

    Anyways, I have a ton of questions, and although Matt has answered all the ones I have asked him, I have some entry-level programming questions that should be asked elsewhere. As most of you know, I am a big fan of R but have recently started learning Python over the last 2 weeks (this has actually made me love R even more).

    Here is a link to their API documentation, and here are the subscriptions you can get. apiorats.PNG

    library(httr)

    token = ********
    x = GET("", add_headers(Authorization = token))  # URL omitted
    core = content(x)
    frame = as.data.frame(core)
    names(frame)
    [1] "data.ticker"                 "data.tradeDate"              "data.assetType"              "data.priorCls"               "data.pxAtmIv"              
      [6] "data.mktCap"                 "data.cVolu"                  "data.cOi"                    "data.pVolu"                  "data.pOi"                  
    [11] "data.orFcst20d"              "data.orIvFcst20d"            "data.orFcstInf"              "data.orIvXern20d"            "data.orIvXernInf"          
    [16] "data.iv200Ma"                "data.atmIvM1"                "data.atmFitIvM1"             "data.atmFcstIvM1"            "data.dtExM1"              
    [21] "data.atmIvM2"                "data.atmFitIvM2"             "data.atmFcstIvM2"            "data.dtExM2"                 "data.atmIvM3"              
    [26] "data.atmFitIvM3"             "data.atmFcstIvM3"            "data.dtExM3"                 "data.atmIvM4"                "data.atmFitIvM4"          
    [31] "data.atmFcstIvM4"            "data.dtExM4"                 "data.iRate5wk"               "data.iRateLt"                "data.px1kGam"              
    [36] "data.volOfVol"               "data.volOfIvol"              "data.slope"                  "data.slopeInf"               "data.slopeFcst"            
    [41] "data.slopeFcstInf"           "data.deriv"                  "data.derivInf"               "data.derivFcst"              "data.derivFcstInf"        
    [46] "data.mktWidthVol"            "data.mktWidthVolInf"         "data.cAddPrem"               "data.pAddPrem"               ""                  
    [51] "data.ivEarnReturn"           "data.fcstR2"                 "data.fcstR2Imp"              "data.hiHedge"                "data.loHedge"              
    [56] "data.stkVolu"                "data.avgOptVolu20d"          "data.sector"                 "data.orHv1d"                 "data.orHv5d"              
    [61] "data.orHv10d"                "data.orHv20d"                "data.orHv60d"                "data.orHv90d"                "data.orHv120d"            
    [66] "data.orHv252d"               "data.orHv500d"               "data.orHv1000d"              "data.clsHv5d"                "data.clsHv10d"            
    [71] "data.clsHv20d"               "data.clsHv60d"               "data.clsHv90d"               "data.clsHv120d"              "data.clsHv252d"            
    [76] "data.clsHv500d"              "data.clsHv1000d"             "data.iv20d"                  "data.iv30d"                  "data.iv60d"                
    [81] "data.iv90d"                  "data.iv6m"                   "data.clsPx1w"                "data.stkPxChng1wk"           "data.clsPx1m"              
    [86] "data.stkPxChng1m"            "data.clsPx6m"                "data.stkPxChng6m"            "data.clsPx1y"                "data.stkPxChng1y"          
    [91] "data.divFreq"                "data.divYield"               "data.divGrwth"               "data.divDate"                "data.divAmt"              
    [96] "data.nextErn"                "data.nextErnTod"             "data.lastErn"                "data.lastErnTod"             "data.absAvgErnMv"          
    [101] "data.impliedIee"             "data.daysToNextErn"          "data.tkOver"                 "data.etfIncl"                "data.bestEtf"              
    [106] "data.sectorName"             "data.correlSpy1m"            "data.correlSpy1y"            "data.correlEtf1m"            "data.correlEtf1y"          
    [111] "data.beta1m"                 "data.beta1y"                 "data.ivPctile1m"             "data.ivPctile1y"             "data.ivPctileSpy"          
    [116] "data.ivPctileEtf"            "data.ivStdvMean"             "data.ivStdv1y"               "data.ivSpyRatio"             "data.ivSpyRatioAvg1m"      
    [121] "data.ivSpyRatioAvg1y"        "data.ivSpyRatioStdv1y"       "data.ivEtfRatio"             "data.ivEtfRatioAvg1m"        "data.ivEtfRatioAvg1y"      
    [126] "data.ivEtFratioStdv1y"       "data.ivHvXernRatio"          "data.ivHvXernRatio1m"        "data.ivHvXernRatio1y"        "data.ivHvXernRatioStdv1y"  
    [131] "data.etfIvHvXernRatio"       "data.etfIvHvXernRatio1m"     "data.etfIvHvXernRatio1y"     "data.etfIvHvXernRatioStdv1y" "data.slopepctile"          
    [136] "data.slopeavg1m"             "data.slopeavg1y"             "data.slopeStdv1y"            "data.etfSlopeRatio"          "data.etfSlopeRatioAvg1m"  
    [141] "data.etfSlopeRatioAvg1y"     "data.etfSlopeRatioAvgStdv1y" "data.impliedR2"              "data.contango"               "data.nextDiv"              
    [146] "data.impliedNextDiv"         "data.annActDiv"              "data.annIdiv"                "data.borrow30"               "data.borrow2yr"            
    [151] "data.error"                  "data.confidence"             "data.updatedAt"             
    and just like that we have all this data!! (There is a ton more as well.)
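    For anyone trying the same thing from Python, here is a minimal sketch of flattening the JSON payload into a flat table. The payload shape (a top-level "data" list of records) is assumed from the R column dump above, and the field names in the sample are just a made-up subset:

```python
# Flatten an ORATS-style JSON payload ({"data": [{...}, ...]}) into rows.
# ASSUMPTION: the payload shape is inferred from the R column dump above;
# a real call would fetch it with requests.get(url, headers={"Authorization": token}).

def flatten_payload(payload):
    """Return (column_names, rows) from a {"data": [record, ...]} payload."""
    records = payload.get("data", [])
    if not records:
        return [], []
    cols = sorted(records[0].keys())
    rows = [[rec.get(c) for c in cols] for rec in records]
    return cols, rows

# Tiny fake payload for illustration (values made up):
sample = {"data": [
    {"ticker": "CRM", "tradeDate": "2019-03-01", "iv30d": 0.41},
    {"ticker": "LULU", "tradeDate": "2019-03-01", "iv30d": 0.47},
]}
cols, rows = flatten_payload(sample)
print(cols)     # ['iv30d', 'ticker', 'tradeDate']
print(rows[0])  # [0.41, 'CRM', '2019-03-01']
```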

    I have a few questions that I hope some of you can answer.
    The subscription page says I am only able to make 2,000 calls a month to the API. Does anyone know if there is a way to do one call that grabs the data for, let's say, 500 companies? If I put it into a loop (using "lapply" in R), each iteration would count as 1 request, and therefore I could only pull the data for the SPX companies 4x a month (500 calls x 4 = 2,000 monthly calls)!!
    They used to have a request that included {stocklist} in the call, but I have not seen that in the new documentation, and I tried it with no luck. Here is the old documentation for their API.
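    If the endpoint does accept a comma-separated ticker list (like the old {stocklist} style), batching is what keeps you under the quota. A hedged sketch; whether the current API accepts a multi-ticker parameter is an assumption to verify against the docs, not something confirmed here:

```python
# Sketch of batching a ticker universe to stay under a 2,000-calls/month quota.
# ASSUMPTION: the endpoint accepts a comma-separated ticker parameter, like the
# old {stocklist} style mentioned above; check the current ORATS docs first.

def batch_tickers(tickers, batch_size=100):
    """Split a ticker universe into comma-joined query strings."""
    return [",".join(tickers[i:i + batch_size])
            for i in range(0, len(tickers), batch_size)]

universe = ["T%03d" % i for i in range(500)]  # stand-in for the SPX members
queries = batch_tickers(universe)
print(len(queries))  # 5 requests instead of 500
# Each element would then go into one GET, e.g.
#   GET(".../summaries?ticker=" + queries[0], Authorization = token)
```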

    Lastly, it looks like the ORATS data is 15 min delayed (I have no issue with that), but does anyone want to correct me on this? Thanks so much, guys, and I would love to hear your reviews of ORATS! I'll post the code for my complete scanner once it's done, but in the meantime, hopefully some of you can help me along the way.

    P.S. If you are just looking for historical options data, this is the best thing you can buy, in my opinion, for under $50.
  2. Going through the ORATS documentation is raising some questions about calculating the implied move from the straddle px (ORATS extracts the earnings move from straddle px's, as stated in their documentation).

    There is a thread here between Kevin and me from last year where he explains it well.

    I will be using two examples from ORATS. My understanding is that the straddle px (assuming fewer than 5 business days to expiration) is a quick-and-dirty way to calculate the implied move. This is based on the fact that the earnings move will make up the majority of the terminal distribution when there are fewer than 5 business days to go. The implied move is defined as Abs(Ln(StockCloseT1/StockCloseT0)).

    So if a stock is $100 and the ATM straddle px with 3 DTE is $5, the implied move is roughly 5% (biased high, because non-earnings vol is also in the straddle px).

    CRM (Salesforce) releases earnings Monday night. Next week's straddle px is $11.18. If we take this px and divide it by the stock px of 165, we get an expected implied move of 6.7% (biased high). Currently ORATS has this at an implied move of 6.9%, which is more expensive than the straddle px??
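    The quick-and-dirty straddle arithmetic above is just straddle / spot; a one-liner to sanity-check the CRM numbers:

```python
# Quick-and-dirty implied move: ATM straddle price / stock price.
# Biased high because non-earnings vol is also in the straddle.

def straddle_implied_move(straddle_px, spot):
    return straddle_px / spot

move = straddle_implied_move(11.18, 165.0)  # CRM numbers from the post
print(round(move, 4))  # 0.0678, roughly the 6.7% quoted above
```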
    They also have their 10-day IVol at 56% and their 10-day IVol without earnings at 41%. If we do a time-series event vol calculation, this is what we get.
    time.series.vol = function(v1, v2, dte1, dte2){
      # event variance = total variance over dte2 days minus non-event variance over dte1 days
      event.vol = sqrt(dte2*(v2^2) - dte1*(v1^2))
      event.vol
    }
    expected_move = function(jumpvol){
      # mean absolute one-day move for a normally distributed jump
      jump = sqrt(1/252)*jumpvol*sqrt(2/pi)
      jump
    }
    > time.series.vol(.41, .56, 9, 10)
    [1] 1.274009
    > expected_move(1.274)
    [1] 0.06403379
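    For reference, the same extraction as a self-contained Python port of the two R functions above (same formulas, same CRM inputs):

```python
import math

def time_series_event_vol(v1, v2, dte1, dte2):
    """Back out the event vol from two annualized IVs: total variance
    over dte2 days minus non-event variance over dte1 days."""
    return math.sqrt(dte2 * v2**2 - dte1 * v1**2)

def expected_move(jump_vol):
    """Mean absolute one-day move for a normal jump: sigma * sqrt(2/pi)."""
    return math.sqrt(1 / 252) * jump_vol * math.sqrt(2 / math.pi)

ev = time_series_event_vol(0.41, 0.56, 9, 10)  # CRM: ex-earnings IV vs 10d IV
print(round(ev, 4))                 # 1.274
print(round(expected_move(ev), 4))  # 0.064, i.e. ~6.4%
```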
    This calculation seems to be okay, but in my opinion it is still biased high. The real question arises when we take a look at an article written by Matthew on LULU. Here is a link to the article (btw, he has written some good stuff on his blog; I highly recommend you read some of it).

    The article was written on Aug 23rd, meaning 6 business days until earnings on Aug 31.
    The "New IV" you see is the forecast implied volatility for the day after (not including) the event date. Screen Shot 2019-03-01 at 6.18.24 PM.png

    Matt states the implied move is 10.93%

    Screen Shot 2019-03-01 at 6.20.47 PM.png

    I went back to Aug 23rd to find out what the straddle px was for the Aug 31 expiration, and it was $12.20. LULU closed at 136.20, meaning the straddle was only 8.9% of the stock price, WAY below 10.9%. Again, if we use the before and after vols from Matt's table, this is what we get.
    ## 6 business days to expiration, of which 5 hold the 47% volatility stated by Matt's New IV and one holds the event vol
    # formulas were stated above
    > time.series.vol(.47, .74, 5, 6)
    [1] 1.476855
    > expected_move(1.476)
    [1] 0.07418672
    The implied move is only around 7%, which is much more reasonable given that the straddle was 8.9% and there were still 6 business days to go.

    My thinking is that this "implied move" stated by ORATS is actually the implied standard deviation, i.e. implied move / 0.79 (the mean absolute move of a normal is sigma * sqrt(2/pi), which is about 0.79 * sigma).
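    If that hypothesis is right, the conversion is just the sqrt(2/pi) ≈ 0.7979 factor, and applying it to the LULU straddle number lands near ORATS's figure. A small sketch of that check (the hypothesis itself is mine, not confirmed by ORATS):

```python
import math

# E|X| = sigma * sqrt(2/pi) for a normal, so mean-abs move -> SD divides by ~0.7979.
MEAN_ABS_FACTOR = math.sqrt(2 / math.pi)

def move_to_sd(mean_abs_move):
    """Convert a mean absolute move into an implied standard deviation."""
    return mean_abs_move / MEAN_ABS_FACTOR

# LULU: straddle-implied move was 8.9% of spot; as a standard deviation:
print(round(move_to_sd(0.089), 4))  # 0.1115, close to the 10.93% ORATS shows
```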

    I hope some of you can chime in here. Thanks
  3. The straddle px for LULU on Aug 23:
    Screen Shot 2019-03-01 at 6.22.31 PM.png
  4. Code:
    [1] 8/30/2018
    Levels: 8/30/2018
    [1] 10.2747
    Screen Shot 2019-03-01 at 7.42.50 PM.png

    :( Well, there goes my excitement about ORATS. I am also finding some problems with their straddle px's. For example, they have the CRM 2nd-month straddle price at 18.20, updated at 3pm.

    [1] 18.2
    > frame$data.updatedAt
    [1] 2019-03-01 15:02:26
    Levels: 2019-03-01 15:02:26
  5. Sorry, here is the CRM straddle for the 2nd month:
    Screen Shot 2019-03-01 at 7.51.11 PM.png
  6. destriero


    Didn't I tell you that ORATS was the shit?
  7. Yes, I have only heard good things from the guys on this forum. However, if you look at my most recent post, I am already finding some issues with the data (most likely I am misinterpreting the documentation). Any ideas here, Des? Also, any reason why you stopped using them?
  8. destriero


    I don't need historical data for what I'm doing.
  9. Do any of you understand the ErnEffct or ImpliedEe/FcstErnEffct from ORATS? From what Matt has written in the documentation/emails/blogs, my understanding was that it is how much implied volatility, in %, is being added to the 1-year options due to the earnings event. When I plot the data it looks very strange. Also, vastly different companies like PEP and SIG have similar ErnEffcts. ErnEffcts look like they top out at 4 and bottom out at 1. Lastly, impliedIee and impliedEe are exactly the same, yet have different documentation. The data covers 5,000 equities.

    ee.PNG impee.PNG

    For the bottom photo, I am regressing the ORATS implied Mv% on the ImpliedErnEffct.


    The table below shows the historical ernEffct for past earnings for PEP and AMZN.

    Edit** Here is a blog post from ORATS where Matt explains it.
    Last edited: Apr 3, 2019
  10. Just to save anyone else the hassle: ORATS has THE best implied vol surface data I have seen/used. However, they are really lacking on the earnings side. They have countless errors in their earnings dates, which throws off most of the variables. It's unfortunate, but I will not be renewing my Earnings Core sub and will stick with their vol surface subscription on Quandl. There is definitely a need in the market for an in-depth earnings data provider.
    #10     Apr 4, 2019