Dear all - Instead of using the IB API (under Python or C#?), does anyone have experience using SELENIUM to 1) download currently held equities from the IB web page to a CSV file on a local PC, and 2) upload orders from the local PC to the IB server?

************************************************

Please note that the free Selenium tool was introduced only about 5 years (?) ago, so most people have no experience with it yet. http://www.seleniumhq.org/

I have heard that C# web crawling via Selenium is NOT as fast as an API connection to the IB server. However, unlike futures/options traders, most equity traders do NOT need a fast connection between the local PC and the IB server. Any comment is welcome.

PS) Some keywords in Selenium are "DOM" and "html source".
PS2) It is worth spending the time to automate web trading, since probably ALL brokers provide a website, and therefore we could use every broker under ONE system. Do NOT be afraid of switching to another broker just because it has a different API.
PS3) Is crawling the same thing as scraping?
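For what it's worth, the "DOM" / "html source" idea boils down to locating a known element in the page markup and pulling text out of it. Below is a minimal Python sketch of that step using only the standard library; the `holdings` table id, the column layout, and the page itself are made-up assumptions, and scraping the real IB site would additionally need Selenium driving a logged-in browser session.

```python
import csv
import io
from html.parser import HTMLParser

class HoldingsParser(HTMLParser):
    """Collect cell text from a hypothetical <table id="holdings">."""
    def __init__(self):
        super().__init__()
        self.in_table = False
        self.in_cell = False
        self.rows = []
        self.current = []

    def handle_starttag(self, tag, attrs):
        if tag == "table" and ("id", "holdings") in attrs:
            self.in_table = True
        elif self.in_table and tag == "tr":
            self.current = []
        elif self.in_table and tag in ("td", "th"):
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "table":
            self.in_table = False
        elif self.in_table and tag == "tr" and self.current:
            self.rows.append(self.current)
        elif tag in ("td", "th"):
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell:
            self.current.append(data.strip())

# Fake page standing in for the broker's positions screen.
PAGE = """<html><body>
<table id="holdings">
<tr><th>Symbol</th><th>Qty</th></tr>
<tr><td>AAPL</td><td>100</td></tr>
<tr><td>MSFT</td><td>50</td></tr>
</table></body></html>"""

parser = HoldingsParser()
parser.feed(PAGE)

# Write the scraped rows out as CSV (here to a string; a file works the same way).
buf = io.StringIO()
csv.writer(buf).writerows(parser.rows)
print(buf.getvalue())
```

Note that everything above is keyed to the exact markup of the page, which is the crux of the objections raised in the replies below.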
You already asked the same question on another thread. The only difference is that this time you mentioned a specific product. But it doesn't change the underlying problem. So guess what, this is still a bad idea.

This is your key misconception: "since probably ALL brokers provide a website and therefore we can use every broker under ONE system. Do not fear changing to another broker."

No, because ALL brokers' websites are DIFFERENT. And they CHANGE all the TIME. And it doesn't matter which product you use to interface with the website; this problem won't go away.

Let me tell you a story which will (hopefully) help you understand. At the weekend I met an old work colleague who now works for a professional gambling outfit (same job, automated trading; different markets, football goal spreads rather than eurodollar futures). Most bookmakers don't have APIs. So they have a massive team of people building scrapers and order processors to interface with all the different websites (out of a 100-person operation, maybe a quarter doing this exciting job). Of these 25 or so people, about 1% of their time is spent working on the bookmakers with APIs (to deal with occasional changes and to make improvements). The rest is spent dealing with the non-API bookmakers they have to use websites for.

So every few months the bookmakers make a small change or a large change to the website: data format or order entry. And they have to hope the change is large enough that it doesn't cause a subtle error, but instead breaks their system entirely, so that at least they know about it. Because the bookie doesn't tell them they're doing it (these guys hate the professional gamblers). And then they have to spend days or weeks rewriting their stuff, during which they can't do any trading with that bookie.
Because they are doing arbitrage in a world where most bookmakers don't have APIs, they have to do this (arguably the arbitrage profits exist precisely because it's such a pain in the ass to do, and most people do it manually, which is slower). But you don't have to do this. Use the IB API, or another broker with an API.

Two facts:

- Dealing with multiple APIs is going to be easier than dealing with multiple websites.
- Dealing with stable, backward-compatible changes to APIs is going to be easier than dealing with arbitrary changes to websites.

But then I said all this before and you didn't listen. Seriously, if you'd invested the amount of time in learning a broker's API that you've spent researching this dead end, you'd be an expert by now.

GAT
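The "subtle error vs. breaks entirely" distinction above is easy to demonstrate with a toy sketch (the class names and pages are all made up). A scraper keyed to one piece of markup can fail loudly after a redesign, or, worse, keep returning a plausible but wrong number with no error at all:

```python
import re

def scrape_price(html):
    """Pull a price out of a hypothetical <span class="px">...</span>.
    The class name is an assumption about the page, not part of any contract."""
    m = re.search(r'<span class="px">([0-9.]+)</span>', html)
    return float(m.group(1)) if m else None

page_v1 = '<div>Last: <span class="px">98.50</span></div>'
# Unannounced redesign: the class is renamed, so the scraper fails loudly.
page_v2 = '<div>Last: <span class="last-px">98.50</span></div>'
# Worse redesign: the old class now marks the bid, so the scraper
# silently returns the wrong number.
page_v3 = ('<div>Bid: <span class="px">98.40</span> '
           'Last: <span class="last-px">98.50</span></div>')

print(scrape_price(page_v1))  # 98.5  -- works today
print(scrape_price(page_v2))  # None  -- at least you notice this one
print(scrape_price(page_v3))  # 98.4  -- the subtle error nobody notices
```

An API with a versioned, documented contract rules out both failure modes by design.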
Thanks for the kind reply.

1) Since web crawling is a fairly new technology and most people do not know it yet, some (non-free?) tool might emerge to handle (scrape) many brokers in ONE SYSTEM. I tried to hire an IB API programmer before, but he asked too much.

2) As a hypothesis, suppose IB charges $2 per BUY/SELL and you pay $100 every day. That easily adds up to 2K per month and 24K per year. If your trading logic is compoundingly profitable, you can expect to pay roughly 1000K at least over the next 50 years. Also suppose there is another broker with NO commission, such as Robinhood, but it does NOT provide an API, so that web trading is the ONLY alternative. Which one would you choose between IB and Robinhood? Tell me your choice: 1000K, or web scraping.
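Spelling out the commission arithmetic in that hypothesis (the $2 per ticket, $100 per day, and roughly 20 trading days per month are all the poster's assumed figures):

```python
per_day = 100                   # assumed daily commission spend ($2 x 50 tickets)
per_month = per_day * 20        # roughly 20 trading days per month
per_year = per_month * 12
over_50_years = per_year * 50   # ignoring any compounding of the lost capital

print(per_month, per_year, over_50_years)  # 2000 24000 1200000
```

So the straight-line total is 1.2M, consistent with the "roughly 1000K at least" figure once compounding is considered.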
Why would you not want to web-scrape? Because the API is specifically designed to provide access. If done correctly, the way you communicate with it is independent of the changes IB makes on their end. I shudder to think how often a web page changes (even if it looks the same).
Every choice has its own disadvantages. If some broker intentionally changes its web page, then I will simply stop using it. Apart from possible web-page changes, would you accept the advantage of web crawling for a broker like Robinhood, which provides NO API? In fact, there are lots of cheap brokers with NO API around the world.

PS) I forgot to mention my situation. There are many brokers around the world and I have to deal with brokers in many different countries. Therefore I should NOT be tied to one particular broker, such as IB and its API. Versatile and robust adaptation to 10 brokers is my primary concern now.
If I understand correctly, HTML dates from the mid-90s, and back then there was not nearly as much to crawl as there is today. Current HTML5, however, may last another 10 years as a global standard. If so, web trading might be the safe choice. My understanding may be partly wrong, but I won't spend time studying a separate API for each of 10 different brokers. Furthermore, if some company offered robust connections to 10 different brokers, via API or crawling, I would gladly pay that company.
"Every choice has its own disadvantage. If some broker intentionally change web page, then I do NOT use it any more."

Then you'll use each broker once, for about 6 months maximum.

"Except the possible webpage change, would you accept advantage of web-crawling, such as Robinhood with NO API provided(?)?"

No. The up-front work will be much larger with web crawling. (And also, personally, I'd rather trade with IB than Robinhood. The latter may be fine for retail investors, but not for trading.)

I think you misunderstand how web crawling works, and what it is capable of. Web crawling identifies simple text buried inside pages. It often needs help doing this (robots.txt). It won't be able to automatically identify, say, price data consistently and reliably enough for the purposes of trading. It certainly won't be able to automatically identify how to submit an order. You'll have to manually work out how all this is done, and write a script that will do it. This is no different from having to write an API call, except there is no documentation on how you'd do it, and you'd have to work it out through trial and error, picking through .html and .xml files. You also have to deal with security measures deliberately designed to frustrate computers, like CAPTCHA (actually, offhand I can't think how you'd deal with this, except by logging in once manually and doing stuff to keep the session active before hitting the auto-logoff time limit). This will take a long time. Much longer than dealing with an API.

"have to deal with many brokers in many different countries."

You already said this on the other thread. You never explained why you had to do this, apart from some vague allusion to tax.

I used to work for a large, multi-billion-dollar hedge fund. We traded hundreds of instruments across dozens of markets across the world. We had perhaps two hundred funds, spread across multiple jurisdictions, many of which operated offshore.
I want you to guess how many different countries the brokers we dealt with came from. The answer is two. And actually, for many years we made do with entirely London-based brokers, until we opened a HK office (and until we did that, we dealt in the same markets perfectly happily from London). Yes, many of those brokers were US-based or from other countries, but it was their London entity we were dealing with. There is absolutely no reason why anyone should need to use brokers in different countries.

"Versatile or robust adaptation to 10 brokers is my primary concern now."

No, your primary concern should be getting a system working with one broker. I don't understand your aversion to using APIs. If you're going to run a system which requires automated trading, you should either be happy to use something like Quantopian, which does it for you, or have the programming ability to read an API manual and/or my blog, which does most of the work for you.

But please stop pursuing this lunatic idea. You've wasted your own time, and because I'm an idiot who is trying to help you, you've wasted my time as well.

GAT
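On the earlier point that dealing with multiple APIs is easier than dealing with multiple websites: the usual way to run "10 brokers under ONE system" is a thin adapter layer, one small class per broker API, all exposing the same calls to the trading system. A minimal sketch (every name here is hypothetical; a real adapter would wrap an actual broker API client such as IB's):

```python
from abc import ABC, abstractmethod

class Broker(ABC):
    """The one interface the trading system talks to, whichever broker is behind it."""
    @abstractmethod
    def positions(self) -> dict:
        ...
    @abstractmethod
    def place_order(self, symbol: str, qty: int) -> str:
        ...

class FakeBroker(Broker):
    """In-memory stand-in adapter, used here purely for illustration."""
    def __init__(self):
        self._positions = {}
        self._next_id = 0

    def positions(self):
        return dict(self._positions)

    def place_order(self, symbol, qty):
        # A real adapter would translate this into one broker-specific API call.
        self._positions[symbol] = self._positions.get(symbol, 0) + qty
        self._next_id += 1
        return f"order-{self._next_id}"

broker: Broker = FakeBroker()
broker.place_order("AAPL", 100)
print(broker.positions())  # {'AAPL': 100}
```

Adding a second broker then means writing one more small adapter class against a documented API, rather than reverse-engineering another website.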
You may have a better view of crawling than I do. I would like to learn an alternative method alongside the API. Even if I have no complaints about McDonald's, it is still better to try Burger King and Wendy's as a second choice.

As for your comment "This will take a long time. Much longer than dealing with an API.": I have already spent more than a year trying to do what I want with the IB API. Unfortunately, no official help was given to me, and someone with an IB API background asked for MORE THAN 10K, which I rejected. For 10 more brokers in the future, I would like to try web crawling as an alternative, since I can't pay 10 times that labour for API help.