
Looks like Yahoo Download API is now Dead - 11/1/2017

  1. Even after Yahoo terminated their API back in the May/June time frame, I was still able to get web-based data to download into Excel from a query such as:

    https://download.finance.yahoo.com/d/quotes.csv?s=SPY+DIA&f=sl1d1t1c1ohgv&e=.csv

    The above web query would yield:

    spy 257.15 10/31/2017 4:00pm 0.4 257.18 257.44 256.8 60304781
    dia 233.63 10/31/2017 4:00pm 0.3 233.64 233.91 233.2 1772047

    Well, it seems that as of sometime today, this has been disabled as well. The above web-based query now yields:

    "Sorry, Unable to process request at this time -- error 999."

    So, assuming that this web service is dead in the water for good, can someone offer an alternative source (free or paid) that would yield the same, via a web-interface (or equivalent)?

    Thanks in advance.
     
  2. I was wondering if something was wrong with my PC. Now it is confirmed that the problem lies with Yahoo's servers.
     
  3. Demand a refund!
     
  4. I bet Verizon feels that way.
     
  5. Haha. You're being unreasonable. The data was free to start with. However, what Yahoo did wrong was to pull the plug immediately. At least give some warning so that customers can look for alternatives before this nasty surprise. Now everyone is left vulnerable, fighting in the dark against the market. Bad, bad Yahoo.
     
  6. You get what you paid for.
     
  7. I agree, but for goodwill reasons, even to non-paying customers, don't pull the rug out from under our feet without warning. Nobody likes that.
     
  8. Seriously? I'll repeat it. You get what you paid for. No business is "in it for the feel-good goodwill of the thing." If they were, they'd be called a non-profit charity, which is also in the business of making money.
     
  9. This is a blessing in disguise. Retail traders who rely on "free" Yahoo data are typically undercapitalized (else they'd subscribe to a real datafeed) and if this discontinuation keeps them out of the market, they'll be the richer for it in the long run.
     
  10. There was a similar thread on here a few months back. Look for it. Maybe Google has free data.
     
  11. I hear what you are saying but try to not be so negative. It is possible to "beat the machines", I even hear that there is a nice book out there on how to do it.
     
  12. Appreciate all the philosophy on this thread, but I hear no recommendations on alternate sources of data (either free or paid) that could replace the one in my original post.

    And shame on Yahoo! for not having the decency to at least post a message indicating that the service is shut down, as opposed to a cryptic error number.
     
  13. Anyone know if there is an undocumented GOOGLE equivalent to the following, that generates a CSV for multiple tickers in one shot:

    https://download.finance.yahoo.com/d/quotes.csv?s=SPY+DIA&f=sl1d1t1c1ohgv&e=.csv
     
  14. Everyone on ET squawks about phone-home risk with trading software -- "they might be trying to get my IP!!" -- but wants to get IP for nothing from Yahoo. Go figure.
     
  15. So, I downloaded AmiQuote to test, and I can download from Google EOD (Yahoo being broken, obviously), but there appears to be no way to generate a single CSV with EOD data from a list of tickers. Anyone know if the ability to do this magically appears upon license purchase?
     
  16. You get multiple csv's? One ticker per file?
     
  17. In the trial I downloaded, I specified 2 tickers and ran the download from Google. I need to run the download on each respective ticker to get its associated CSV, but I would like a single consolidated CSV with data from each ticker in it (that is, one row per ticker). Not sure if the unavailability of this feature is due to the copy being unregistered. I just emailed their customer support with that and a few other questions. I'll post back here when I hear from them.
     
  18. Stock quotes shouldn't be "IP" and the recent crackdown from NASDAQ exchanges/owners smacks of market manipulation and unfair cover-up of public information.
     
  19. From Yahoo:

    Nixon

    Administratin' 'Hoo

    32m ago

    Re: Is Yahoo! Finance API broken?
    Hi All - I'm blocking new replies to this thread since it refers to an older outage.



    The new download issue, which began 11/01/17 and is returning an error 999 for most users, is currently being investigated and we hope to have it resolved soon.
     
  20. Call your Congressman! Call the SEC! Call the US Attorney! Demand an investigation!
     
  21. -----
    @hayman - on the other hand admin = nixon keeps writing: "UPDATE: It has come to our attention that this service is being used in violation of the Yahoo Terms of Service. As such, the service is being discontinued. For all future markets and equities data research, please refer to finance.yahoo.com Thank you. "
     
  22. There is already a thread that I posted in that covers this subject. I haven't looked at trying to scrape intraday data from Yahoo, but I can tell you that it is possible to scrape daily OHLC data directly from Yahoo's HTML. You will need to do a bit of work to harvest this data, but it is possible.
     
  23. You can also cut your lawn with a pair of scissors...
     
  24. LOL, if the scissors are automated and can run themselves automatically then why not? Scissors have less of a carbon footprint too so it is good for the environment. Please continue to share your valuable knowledge here my friend.
     
  25. I did see those posts, and I believe (and I might be wrong), that those users were directly using Yahoo! data on their websites, which is a breach of terms of service. Then again, the wording of "the service is being discontinued" is worrisome. Here's hoping that this Yahoo Admin just doesn't have a command of the English language.
     
  26. I've scraped in my youth, and although a fun programming exercise, the problem is that when websites change format (and Yahoo does this often), your scrape code usually needs to change along with it. I gave up scraping a long time ago, because it just wasn't worth the maintenance headache.
     
  27. Oh boy, I just tried to run a simple download query:

    http://download.finance.yahoo.com/d/quotes.csv?s=xrx+vz+frp+bls&f=sl1d1t1c1ohgv&e=.csv

    and now I get the message,

    "It has come to our attention that this service is being used in violation of the Yahoo Terms of Service. As such, the service is being discontinued. For all future markets and equities data research, please refer to finance.yahoo.com."
     
  28. The data from Yahoo is JSON, so it is basically structured data within unstructured data.

    If you scrape based on exact delimiters and their locations, then you will constantly need to rewrite your code, like you said. If you treat the HTML like one big "string", then there is usually a keyword marker that can act as your "start of data". This is a way to keep your code more nimble.

    I don't deny that this is a headache and possibly a waste of time. We are not on a "time management" message board. If you want to spend your free time doing it, then yes, it is possible regardless of what API services Yahoo shuts down.
     
  29. Scraping Yahoo looks like a mess to me. Just tried. Then I tried Google, and this is the string you would look for: ["SB","Safe Bulkers, Inc.","3.42"

    ["SB" appears only once, so it is a unique entry. You would look for ["your-symbol". Good luck!
     
  30. Anyways, to answer this question from the OP of the OP...

    Get a broker with an included data feed. Or just pay a small fee for a dedicated data feed.

    End of line.
     
  31. Yup, just a pain in the ass. I only need about 50 tickers (stocks and ETFs) on an EOD basis, but this drives my portfolio, a couple of WealthLab trading scripts (which are now dead in the water, since it's WL 4.0 - unless I convert from native WL to ASCII upload), and my semi-annual 401K re-balancing. Work that I wasn't planning on doing, but I guess this day of reckoning was going to come sooner or later.

    How much did Verizon (over)pay for this company?
     
  32. Just use Deriscope. It is 100% free and gets live prices, with only a 1-minute delay, into Excel.
     
  33. Why not put the quotes into a portfolio and download using the export link?
     
  34. Sometimes the simplest ideas are the best......great idea. I was going to build a DDE-based spreadsheet to use with my IB feed, but I like your idea; thanks for that! Didn't know that their portfolio feature was exportable like that. This solves a couple of my problems right away.

    Unfortunately, my Wealthlab issue is much larger, but not as urgent.

    Thanks again!
     
  35. Can't you get that from Quandl?
     
  36. You found me out
     
  37. Yes, but a little pricey for my needs. Thanks.
     
  38. Scraping is slow and wasteful, but well-written scraping code can handle some changes in the site's code.

    I hope you realize that if you only update the latest day, the past data won't be split- or dividend-adjusted.

    Google's data is full of missing days, and it's split-adjusted but not dividend-adjusted.
     
  39. Quandl free does not have ETFs. Try https://www.tiingo.com/
     
  40. I used Yahoo Finance to pull certain data (stock price, dividend rate, dividend per share, and dividend pay date).
    I did this using some VBA code in Excel that would update the spreadsheet automatically anytime I hit Update All.

    Is there another site I can pull this same data into my excel spreadsheet?
    Preferably something that is free.

    Do you guys have any recommendations?
     
  41. I have no recommendations for on-demand web requests for live or delayed quotes to replace what Yahoo or others do.

    I use NxCore for my live intraday feeds and have a second feed of Norgate data for daily timeframe data.

    I supplement my systems with cross-checks from a number of sources. Why? Many events cause big changes in the prices of stocks. If those events are just due to normal capital changes/dividends to a stock, I need to know that. For example, I need to know if there is a split or dividend that means I need to adjust my stops.

    The quality of Yahoo and Tiingo is not good enough... Too many errors in their systems that generate false positive events that take too much time to validate (mostly invalidate).
     
  42. No problem; it can even be automated if you really want, but that involves working with the command line and a download utility called 'curl', so it's a bit fiddly.

    With freebie data, it wouldn't be a bad thing to have two different sources worked out; then if one changes or disappears, it doesn't leave you in the lurch right away.
     
  43. Actually, curl works with a lot of programming languages; no command line required. I use it with C++ via the libcurl library. Yahoo has other links that give pure JSON data. I found one that requires no scraping or cookies. I have been wanting to post the link that I found and use, but I worry that if too many people know about it, Yahoo may take that away also.
     
  44. Yeah, having a backup source is good. I will probably build a backup using my IB or TD Ameritrade feeds.

    FYI, there's an old command-line utility called WebFetch.exe, which you can probably hunt around for, that could assist with the automation within a command-line script.
     
  45. There is a fix for pandas-datareader that scrapes the data from Yahoo Finance for you and stores it in a DataFrame. I've used it to acquire some data and it seems to work fine. Please find the link below.

    https://pypi.python.org/pypi/fix-yahoo-finance
     
  46. To use curl to download the export file, you need three things:

    1. A copy of the cookies that Yahoo uses. Best way is to install a fresh copy of Firefox Portable and add the Export Cookies add-on, then log into Yahoo. Then right click on the title bar, select Menu Bar then select Tools/Export Cookies from the menu bar and save as say cookies.txt

    2. The url that downloads the export file. In Firefox it can be copied with a right click on the 'Export' link and selecting Copy Link Location

    3. Installing curl and running it on the command line. Grab a copy from https://curl.haxx.se/, install it, copy the cookies file into the same directory, open a command line, and change to that directory. Then run something like:

    curl "the-magic-url" --cookie cookies.txt --output yahooquotes.csv

    Hopefully that should do it! :)

    I guess Yahoo doesn't want to allow anonymous downloads, as that leaves the service more open to abuse, denial-of-service attacks, even hacking.
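And to schedule all that, the curl call can be wrapped in a small POSIX shell function; "the-magic-url", cookies.txt, and the output name are the placeholders from the steps above:

```shell
#!/bin/sh
# fetch_quotes URL COOKIEFILE OUTFILE
# Downloads the export using the saved cookies, then keeps a
# timestamped copy alongside the latest snapshot.
fetch_quotes() {
    url="$1"; cookies="$2"; out="$3"
    curl "$url" --cookie "$cookies" --output "$out" --silent --fail || return 1
    cp "$out" "${out%.csv}_$(date +%Y%m%d_%H%M).csv"
}

# e.g. from cron, every few minutes:
# fetch_quotes "the-magic-url" cookies.txt yahooquotes.csv
```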
     
  47. I've just produced a Python script that can handle multi-ticker Yahoo data acquisition and thought I'd share it in case it can help anyone. **The script requires the following packages, which can be obtained through pip: pandas / pandas_datareader / fix_yahoo_finance**

    The script saves each ticker's data in a CSV file in a folder named YahooData, which is created in the same directory you place the Python file. I have currently configured it to obtain Yahoo data for all S&P 500 companies, but this can easily be modified to your own requirements by changing the list in line 42 so that it contains the tickers you require, and finally commenting out line 43.

    To run the script, simply type "python AttainYahooData.py" in your command line, making sure that you are in the directory where the Python script is located.

    I've multi-threaded the process, which speeds things up significantly.

    The data acquisition isn't always successful the first time round, so to retry, type y when prompted by the command line. You may need to retry three or four times if you're downloading hundreds of tickers.

    The python script can be accessed on my google drive here:



    I have just modified the script so it will repeatedly try to fetch the tickers, without the need for user input, until all tickers have been downloaded successfully. This will be better suited to automated systems based on Yahoo data. Make sure you type the tickers correctly or else you will be stuck in a never-ending loop :) (This version is named AttainYahooData_v1.py, therefore use the command "python AttainYahooData_v1.py" to run it.)
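For anyone who can't reach the Drive link, the retry-until-done, multi-threaded structure described above looks roughly like this. It's a sketch only: fetch_one stands in for the real per-ticker download that writes a CSV into YahooData/.

```python
import concurrent.futures

def fetch_all(tickers, fetch_one, max_rounds=10, workers=8):
    """Fan each round of downloads out over a thread pool, then keep
    retrying the tickers that failed until none remain (or max_rounds)."""
    remaining = list(tickers)
    results = {}
    for _ in range(max_rounds):
        if not remaining:
            break
        with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
            futures = {pool.submit(fetch_one, t): t for t in remaining}
            for future in concurrent.futures.as_completed(futures):
                ticker = futures[future]
                try:
                    results[ticker] = future.result()
                except Exception:
                    pass  # failed this round; retry it in the next one
        remaining = [t for t in remaining if t not in results]
    return results, remaining
```

A mistyped ticker fails every round, which is the never-ending-loop caveat above; bounding the loop with max_rounds is one way to guard against it.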

     
  48. And are programs such as these:
    YLoader
    HSQuote / HSQuote Plus
    not sufficient?
    The creator of HSQuote is very polite and always helps with problems.
    Give them a try.
     
  49. Would you consider creating a modified script that produces a single csv file with one ticker's data per line? Of course, if it is too much trouble, forget about it. I would do it myself, but I don't have the skill needed.

    Thanks.
     
  50. I can look into this when I have some spare time, but could you please help me understand exactly what you require. Am I correct in thinking you want the script to fetch the most recent data available for a number of defined tickers and then compile it into a csv in the format below?

    ticker, open, low, high, close, adj close, volume
    A, xxx, xxx, xxx, xxx, xxx, xxx
    B, xxx, xxx, xxx, xxx, xxx, xxx

    If not, could you give an example of what it is you want?
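For what it's worth, once the latest bar for each ticker is in hand, writing that exact layout is a few lines of stdlib Python. The latest mapping is an assumed input here; how you fill it (AmiQuote, a scrape, another feed) is a separate problem:

```python
import csv

FIELDS = ["ticker", "open", "low", "high", "close", "adj close", "volume"]

def write_summary(latest, path):
    """Write one CSV row per ticker from a {ticker: bar-dict} mapping."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for ticker in sorted(latest):
            # merge the ticker symbol into its bar and emit one row
            writer.writerow(dict(latest[ticker], ticker=ticker))
```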
     
  51. I would definitely be interested in something like that.
    Ideally, I would love to be able to pull price, dividend, and upcoming payment date.

    If this is something I could pull from a web query into Excel automatically, that would be great.
    I'm currently looking at https://iextrading.com/developer/docs/#iex-api to try to do this, but my knowledge is limited. I was using VBA code in Excel to pull from Yahoo.
     
  52. Thanks for your attention and for your answer.

    In the code box below you can see the actual content of the csv file I used to download from Yahoo every few minutes with Free Download Manager.

    Code:
    "ETFMIB.MI",22.504,"11/1/2017","5:35pm",+0.181,22.400,22.576,22.400,585835
    "DJSC.MI",35.835,"11/1/2017","3:31pm",+0.195,35.910,35.930,35.850,520
    "CASAMO.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "IDVY.MI",23.705,"11/1/2017","5:11pm",+0.095,23.625,23.775,23.625,12711
    "VAL.MI",125.94,"11/1/2017","2:15pm",+0.80,125.55,126.00,125.55,518
    "GWT.MI",130.26,"11/1/2017","2:15pm",+0.60,130.51,130.61,130.31,315
    "DAXX.MI",128.66,"11/1/2017","5:27pm",+1.79,127.84,128.65,127.41,34393
    "^STOXX",396.77,"11/1/2017","5:50pm",+1.55,395.91,398.05,395.91,0
    "AUTP.MI",73.06,"11/1/2017","5:11pm",+1.27,72.63,73.10,72.63,284
    "BRES.MI",54.60,"11/1/2017","5:20pm",+1.37,54.14,54.99,54.14,13883
    "CHM.MI",111.21,"10/31/2017","5:11pm",+1.49,109.40,109.40,109.40,0
    "FOO.MI",76.32,"11/1/2017","5:21pm",-0.09,76.20,76.60,76.20,3183
    "OIL.MI",41.73,"11/1/2017","5:21pm",+0.40,41.48,41.91,41.48,1063
    "PHG.MI",100.23,"11/1/2017","5:19pm",-0.19,100.78,100.79,100.21,218
    "TELE.MI",37.445,"11/1/2017","4:59pm",+0.240,37.350,37.530,37.290,27490
    "UTI.MI",41.18,"11/1/2017","3:01pm",+0.04,41.30,41.32,41.28,473
    "INRG.MI",4.6875,"11/1/2017","4:16pm",+0.0225,4.6950,4.6950,4.6700,4266
    "IUSA.MI",22.1225,"11/1/2017","5:16pm",+0.0925,22.0700,22.1675,22.0700,35953
    "ITKY.MI",23.765,"11/1/2017","5:26pm",+0.525,23.300,23.840,23.300,603
    "TUR.MI",45.26,"11/1/2017","5:29pm",+1.05,44.29,45.50,44.29,10776
    "CINA.MI",145.50,"11/1/2017","5:20pm",+1.69,145.00,145.72,145.00,2123
    "INDI.MI",16.765,"11/1/2017","5:19pm",+0.265,16.540,16.790,16.540,35262
    "XMIN.MI",13.26,"11/1/2017","2:57pm",+0.07,13.16,13.31,13.16,10466
    "JPN.MI",131.33,"11/1/2017","5:20pm",+1.15,130.50,131.43,130.50,59908
    "IBZL.MI",27.355,"11/1/2017","5:28pm",-0.030,27.525,27.715,27.310,17607
    "BRA.MI",17.49,"11/1/2017","5:18pm",+0.05,17.62,17.67,17.49,7918
    "XWCD.MI",20.925,"4/19/2016","2:59pm",+0.055,20.910,20.910,20.910,450
    "XWSD.MI",26.545,"4/14/2016","3:54pm",+0.095,26.665,26.665,26.550,371
    "SAFRI.MI",35.29,"11/1/2017","5:21pm",+0.29,35.09,35.38,35.09,155
    "COFF.MI",0.9835,"11/1/2017","5:17pm",-0.0195,1.0130,1.0160,0.9780,1192134
    "SOYB.MI",18.235,"11/1/2017","4:47pm",+0.180,18.160,18.235,18.160,213
    "PHAU.MI",105.50,"11/1/2017","5:26pm",+0.85,105.07,105.62,105.07,18688
    "OIL3L.MI",0.252,"11/1/2017","5:35pm",+0.002,0.251,0.263,0.249,5536247
    "NEX.PA",56.36,"11/1/2017","5:35pm",+0.16,56.20,57.07,56.01,212169
    "UG.PA",20.805,"11/1/2017","5:35pm",+0.440,20.500,20.860,20.470,1958008
    "VOW.DE",167.85,"11/1/2017","5:35pm",+8.45,160.65,168.20,160.40,192917
    "BAN.MI",3.524,"11/1/2017","5:35pm",+0.004,3.530,3.608,3.524,23748
    "ISP.MI",2.878,"11/1/2017","5:36pm",-0.008,2.894,2.902,2.874,67290043
    "ISPR.MI",2.706,"11/1/2017","5:35pm",-0.002,2.714,2.718,2.698,2124166
    "PRY.MI",30.00,"11/1/2017","5:35pm",+0.40,29.75,30.08,29.67,718499
    "333129.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "333130.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "X0729K.TI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "347024.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "148058.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "B3ET.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "EU3L.MI",14.575,"11/1/2017","5:35pm",+0.170,14.505,14.665,14.505,8683
    "XSX6.MI",81.13,"11/1/2017","2:16pm",+0.27,81.13,81.33,81.12,6457
    "LEVMIB.MI",8.92,"11/1/2017","5:29pm",+0.15,8.80,8.97,8.80,467896
    "82EA.L",N/A,N/A,N/A,+0.00,N/A,N/A,N/A,0
    "X32348.TI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "475499.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "I78137.mi",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "T1NV11.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "TLG13.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "764806.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "PIST14.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "PIST19.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "PIST21.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "PIST26.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "PIST35.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "POE00K.TI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "XS0935881853.TI",94.60,"11/1/2017","5:07pm",-0.17,N/A,94.81,94.58,70000
    "XS0995130712.TI",95.87,"11/1/2017","1:26pm",-0.26,N/A,96.14,95.55,198000
    "XS1139474206.TI",92.59,"11/1/2017","4:08pm",-0.08,N/A,92.66,92.58,90000
    "XS1167524922.TI",97.40,"11/1/2017","5:15pm",+0.04,N/A,97.42,97.18,1430000
    "764806.MI",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "FR0010135103.PA",655.6900,"10/30/2017","9:00pm",-0.8000,N/A,N/A,N/A,0
    "IE00B1Z6D669.IR",15.4100,"10/27/2017","8:00pm",+0.0100,N/A,N/A,N/A,0
    "IE00B11XZB05.IR",19.7100,"10/30/2017","8:00pm",+0.0000,N/A,N/A,N/A,0
    "AH8M.MU",118.30,"11/1/2017","6:47pm",+0.11,118.44,118.44,118.30,0
    "FJRZ.MU",23.18,"11/1/2017","7:55pm",+0.32,23.11,23.24,23.11,30
    "13716156.SW",N/A,N/A,N/A,N/A,N/A,N/A,N/A,N/A
    "EURGBP=X",0.8774,"11/1/2017","7:58pm",+0.0006,0.8767,0.8784,0.8732,0
    "EURAUD=X",1.5150,"11/1/2017","7:58pm",-0.0063,1.5213,1.5224,1.5115,0
    "EURNZD=X",1.6883,"11/1/2017","7:58pm",-0.0033,1.6916,1.6919,1.6792,0
    "EURZAR=X",16.3509,"11/1/2017","7:58pm",-0.0968,16.4477,16.4943,16.3088,0
    "EURTRY=X",4.4381,"11/1/2017","7:58pm",+0.0220,4.4148,4.4564,4.4069,0
    "EURUSD=X",1.1623,"11/1/2017","7:58pm",-0.0025,1.1647,1.1657,1.1605,0
    "USDJPY=X",114.1230,"11/1/2017","7:58pm",+0.4730,113.6580,114.2820,113.5800,0
    "XWD.TO",48.135,"11/1/2017","3:41pm",+0.005,48.300,48.350,48.080,42001
    "DISW.MI",245.40,"11/1/2017","11:30am",+1.25,244.99,245.99,244.99,21
    "STAW.MI",274.18,"11/1/2017","5:21pm",+1.16,274.39,274.71,274.39,11
    
    This was the URL that created the csv from Yahoo:
    Code:
    http://download.finance.yahoo.com/d/quotes.csv?s=ETFMIB.MI,DJSC.MI,CASAMO.MI,IDVY.MI,VAL.MI,GWT.MI,DAXX.MI,%5ESTOXX,AUTP.MI,BRES.MI,CHM.MI,FOO.MI,OIL.MI,PHG.MI,TELE.MI,UTI.MI,INRG.MI,IUSA.MI,ITKY.MI,TUR.MI,CINA.MI,INDI.MI,XMIN.MI,JPN.MI,IBZL.MI,BRA.MI,XWCD.MI,XWSD.MI,SAFRI.MI,COFF.MI,SOYB.MI,PHAU.MI,OIL3L.MI,NEX.PA,UG.PA,VOW.DE,BAN.MI,ISP.MI,ISPR.MI,PRY.MI,333129.MI,333130.MI,X0729K.TI,347024.MI,148058.MI,B3ET.MI,EU3L.MI,XSX6.MI,LEVMIB.MI,82EA.L,X32348.TI,475499.MI,I78137.mi,T1NV11.MI,TLG13.MI,764806.MI,PIST14.MI,PIST19.MI,PIST21.MI,PIST26.MI,PIST35.MI,POE00K.TI,XS0935881853.TI,XS0995130712.TI,XS1139474206.TI,XS1167524922.TI,764806.MI,FR0010135103.PA,IE00B1Z6D669.IR,IE00B11XZB05.IR,AH8M.MU,FJRZ.MU,13716156.SW,EURGBP=X,EURAUD=X,EURNZD=X,EURZAR=X,EURTRY=X,EURUSD=X,USDJPY=X,XWD.TO,DISW.MI,STAW.MI,&f=sl1d1t1c1ohgv&e=.csv
    Then I have an auto-updating query on external data in an Excel file that gets the data from the csv, so every five minutes I can see the values updated.

    Now that Yahoo feature is dead, being able to recreate a similar csv in some other way would be awesome.

    Thanks.

    fabio
     
  53. To get the data into Excel, I first created a "Query on external data", manually importing from the csv into a sheet named "tests2". Then I opened the query properties, gave the query a name (in my case "PF13") and saved it, and then I created a button on a sheet ("WEB" for me) that triggers this VBA sub routine:
    Code:
    Private Sub UpdateButton_Click()
        Application.ScreenUpdating = False
        With Sheets("tests2")
            .Activate
            .QueryTables("PF13").Refresh BackgroundQuery:=True
        End With
        Sheets("WEB").Activate
        Application.ScreenUpdating = True
    End Sub
    Apart from Yahoo errors every now and then, it's been working like a charm.

    Thanks.

    f.
     
  54. Thanks a lot for the discussions in this thread. Although none of the solutions solved the problem I have, they gave me valuable clues that finally allow me to keep automatically downloading, from the Yahoo Finance site, a list of tickers in a format that can be easily converted into csv format. For those who know how to program, it is easy. Otherwise, please do not bother to read the following. Here is what I do:
    1. Use your browser, either Google Chrome or Firefox.
    2. Create a Yahoo account, log in, and create a portfolio. You can add up to 1000 tickers to each portfolio.
    3. Create a view to customize the display for the portfolio. For example, you can select: ticker, open, high, low, last, vol, time.
    4. You can manually download the web page into a single html file by selecting "Save as: Web Page, Complete". (Wait, this is not automatic yet. I will tell you how to make it automatic if you continue to read.) In the browser's address bar you will see a URL such as https://finance.yahoo.com/portfolio/<your-portfolio-id>/view/view_<your-view-id>; copy it to be used for the later automatic download.
    5. After you get the html file, use the program "html2text" (freely available for Linux/Unix platforms) or use lynx -dump <the html file>. You get the text format of the quote table. From there, it is easy to convert the table into csv format, or anything you like, if you know how to write a script (Python, or anything you like).

    To make the above download automatic, write a simple program (robot) to do the downloading by simulating the clicks and keystrokes. Use the free tool automatic-save-as (from github.com) and xvkbd.

    The reason I cannot use curl, wget, or lynx is that they do not support downloading a full web page into a single html file (or I still have not found a way to use them correctly), even when I added cookies.txt when starting the programs.
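If Python happens to be easier to come by than html2text, the quote table can also be lifted out of the saved page with the standard library alone. This is a sketch, and it assumes the quote table survives as a plain <table> in the saved HTML (the file names are placeholders):

```python
import csv
from html.parser import HTMLParser

class TableRows(HTMLParser):
    """Collect the rows of the <table> elements in an HTML document."""
    def __init__(self):
        super().__init__()
        self.rows, self.row, self.cell = [], None, None

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self.row = []
        elif tag in ("td", "th"):
            self.cell = []

    def handle_data(self, data):
        if self.cell is not None:
            self.cell.append(data.strip())

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self.row is not None:
            self.row.append("".join(self.cell))
            self.cell = None
        elif tag == "tr" and self.row:
            self.rows.append(self.row)
            self.row = None

def html_table_to_csv(html_path, csv_path):
    """Convert the quote table in a saved portfolio page to a csv file."""
    parser = TableRows()
    with open(html_path, encoding="utf-8") as f:
        parser.feed(f.read())
    with open(csv_path, "w", newline="") as f:
        csv.writer(f).writerows(parser.rows)
```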
     
  55. Hello:

    Why doesn't a hyperlink to the watchlist created in Excel work?

    Thank you very much
     
  56. I am new to this forum. My problem is that I wanted to use YLoader. I have downloaded and installed it and all is fine, but I am unable to convert the ASCII or csv files to MetaStock format. I am asking for help from this forum. Thank you and God bless.
     
  57. www.databull.com may still get data into MetaStock format. If ASCII will do, then AmiQuote (http://www.amibroker.com/products.html) has more free sources than just Yahoo. Otherwise you will have to get free data from an API.
     
  58. It's still possible to get end-of-day data from Yahoo Finance.
    1. The timestamp has been changed to Unix format. It's important to submit the right time for the start and end of the period, e.g. GMT 4pm. Do a historical download and analyze the Unix timestamps in the URL.
    2. The new API call must include a crumb code.
    3. The GET request must be sent with a cookie.
    Some helpful chap has posted an Excel VBA script on StackOverflow showing how to retrieve the crumb and cookie. You can adapt it to your script.
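The gist of those three steps in Python, using only the standard library. The "CrumbStore" pattern and the query1 download endpoint are what people reported working at the time, so treat both as assumptions Yahoo can change at any moment:

```python
import re
import urllib.request
from http.cookiejar import CookieJar

CRUMB_RE = re.compile(r'"CrumbStore":\{"crumb":"([^"]+)"\}')

def extract_crumb(page_text):
    """Step 2: pull the crumb token out of the JSON embedded in a quote page."""
    match = CRUMB_RE.search(page_text)
    return match.group(1) if match else None

def download_history(symbol, period1, period2):
    """Steps 1 and 3: period1/period2 are Unix timestamps, and the cookie
    jar carries Yahoo's cookie from the quote page to the download request."""
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))
    page = opener.open(
        "https://finance.yahoo.com/quote/%s/history" % symbol).read().decode()
    crumb = extract_crumb(page)
    url = ("https://query1.finance.yahoo.com/v7/finance/download/%s"
           "?period1=%d&period2=%d&interval=1d&events=history&crumb=%s"
           % (symbol, period1, period2, crumb))
    return opener.open(url).read().decode()  # CSV text, one bar per row
```

Crumbs reportedly sometimes come back with escaped characters such as \u002F, which the VBA script mentioned above also handles; this sketch skips that decoding.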
     
  59. There is another way to get this data without a lot of effort: .
     
  60. Doesn't work (Excel 2016).