I presume tick volume means the number of times the price changes during the period? For example, I just downloaded some historic EURUSD data from IG Index and can see they have "volume" available; it is higher around midday to mid-afternoon UK time:
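If that is what it means, you can derive it yourself from raw tick data. A minimal sketch, not tied to any particular feed: the DataFrame layout and the "bid" column name are my own assumptions, with ticks indexed by timestamp.

Code:
import pandas as pd

def tick_volume(ticks: pd.DataFrame, price_col: str = "bid", freq: str = "1h") -> pd.Series:
    """Count price changes per bar as a proxy for tick volume.

    Assumes `ticks` has a DatetimeIndex and a price column; only rows where
    the price actually moved relative to the previous tick are counted.
    """
    changed = ticks[price_col].diff().fillna(0).ne(0)  # True where the price moved
    return changed.resample(freq).sum().astype(int)    # price changes per bar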
~3 years of tick data from histdata.com for the majors and crosses is ~7 GB compressed. Not bad. Code to cache it is here:

Code:
import io
import os
import shutil
import zipfile

import requests
from bs4 import BeautifulSoup

# Method of my data-loader class; orm.Instrument comes from my own project.
def _getCachedCSV(self, instrument: orm.Instrument, year, month):
    """Download (once) and open the histdata.com ASCII tick zip for one month."""
    fname = f'HISTDATA_COM_ASCII_{instrument.code.upper()}_T_{year}{month:02d}.zip'
    cachedir = '__histdata_cache'
    local = f'{cachedir}/{fname}'
    print(local)
    if not os.path.isfile(local):
        # Scrape the hidden download form on the page, then POST it to fetch the zip.
        url = f"https://www.histdata.com/download-free-forex-historical-data/?/ascii/tick-data-quotes/{instrument.code.lower()}/{year}/{month}"
        page = requests.get(url)
        dom: BeautifulSoup = BeautifulSoup(page.text, 'html.parser')
        if not os.path.isdir(cachedir):
            os.makedirs(cachedir)
        form = dom.find(id="file_down")
        payload = dict()
        for input in form.find_all("input"):
            payload[input['name']] = input['value']
        formurl = f"https://www.histdata.com{form['action']}"
        response = requests.post(formurl, data=payload, allow_redirects=True, headers={'referer': url})
        with open(local, 'wb') as dst:
            shutil.copyfileobj(io.BytesIO(response.content), dst)
    zip = zipfile.ZipFile(file=open(local, 'rb'))
    return zip.open(zip.filelist[0].filename)
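If anyone wants to load what that returns, here's a rough usage sketch. The loader object, instrument and column names are just placeholders of mine, and the timestamp/bid/ask/volume layout is what I believe the histdata ASCII tick files use, so check it against your own download.

Code:
import pandas as pd

# raw is the file-like object returned by the cache method above
raw = loader._getCachedCSV(eurusd, 2021, 3)   # eurusd: an orm.Instrument with code "eurusd"
ticks = pd.read_csv(raw, header=None, names=["ts", "bid", "ask", "volume"])
# Timestamps look like "20210301 170003240" (milliseconds at the end)
ticks["ts"] = pd.to_datetime(ticks["ts"], format="%Y%m%d %H%M%S%f")
ticks = ticks.set_index("ts")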
OK so the way FastMatch works is that people voluntarily contribute their volume to the network. This is obviously not the entire volume, but it still looks useful, since realistically only the whales would bother to submit their volume willingly. The tricky thing is, though: if I were making a big move, I know I wouldn't telegraph it voluntarily. So how useful can it be? @abnormal do you have sample data we can look at?
I’d ask myself if I even need volume. Personally I’ve never tested a strategy where volume added any benefit beyond determining whether an instrument was liquid enough. Forex volume sounds impossible to track, seeing as there’s no central exchange.