The Arms Race in High Frequency Trading

Discussion in 'Professional Trading' started by ASusilovic, Apr 22, 2009.

  1. If the calls I am getting from headhunters are any indication, the hot area now is high frequency trading. And no wonder. There are two areas that were spared in the 2008 debacle: macro and high frequency trading. Macro funds on average were up ten percent or so last year because most of them skirted the edge of the major dislocations; their strategies focus on liquid instruments and are not oriented toward credit. High frequency trading did well because it thrives in an environment of high volatility and demand for liquidity, and 2008 was a hot house for both. Every year, people pile on to whatever strategy did well the previous year – this tendency is worth a book or two on its own – and so this year high frequency is destined to be the darling of the fund of funds.

    But I think the days for high frequency trading are numbered. For one thing, high frequency trading is capacity constrained like few other strategies. The high frequency trader is basically a stand-alone market maker; he is sitting there to provide liquidity to others. And one way he provides it is to pull in the positions that others will shortly be demanding – thus the need for speed. If the footprint of high frequency traders gets too large, they become liquidity demanders themselves, and the jig is up. The Renaissances of the strategy will make their way through, but generally we will see a lot of shooting stars.

    A second reason is that high frequency trading is embroiled in an arms race. And arms races are negative sum games. The arms in this case are not tanks and jets, but computer chips and throughput. But like any arms race, the result is a cycle of spending which leaves everyone in the same relative position, only poorer. Put another way, like any arms race, what is happening with high frequency trading is a net drain on social welfare.

    In terms of chips, I gave a talk at an Intel conference a few years ago, when they were launching their newest chip, dubbed the Tigerton. The various financial firms who had to be as fast as everyone else then shelled out an aggregate of hundreds of millions of dollars to upgrade, so that they could now execute trades in thirty milliseconds rather than forty milliseconds – or whatever, I really can’t remember, except that it is too fast for anyone to care were it not that other people were also doing it. And now there is a new chip, code-named Nehalem. So another hundred million dollars all around, and latency will be dropped a few milliseconds more.

    In terms of throughput and latency, the standard tricks are to get your servers as close to the data source as possible, use really big lines, and break data into little bite-sized packets. I was speaking at Reuters last week, and they mentioned to me that they were breaking their news flows into optimized sixty-byte packets for their arms race-oriented clients, because that was the fastest way through the network. (Anything smaller gets queued by some network algorithms, so sixty bytes seems to be the magic number.)
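The chunking trick described above can be sketched in a few lines. This is a minimal illustration, not Reuters' actual feed protocol: the sixty-byte figure comes from the post, while the framing, function name, and sample headline are all made up.

```python
def chunk_payload(payload: bytes, size: int = 60) -> list[bytes]:
    """Split a message into fixed-size packets (hypothetical framing).

    The idea from the post: ~60 bytes is small enough to move fast,
    but not so small that network queuing/coalescing algorithms
    (Nagle-style batching, for instance) hold the packet back.
    """
    return [payload[i:i + size] for i in range(0, len(payload), size)]

# Made-up headline, repeated to exceed one packet.
headline = b"FED CUTS RATES 50BP; STATEMENT CITES DETERIORATING OUTLOOK " * 3
packets = chunk_payload(headline)
print(len(packets), [len(p) for p in packets])
```

Each packet except possibly the last is exactly sixty bytes, and concatenating the chunks recovers the original message.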

    If we get out of the forest and look at what is going on, some questions come to mind. Does anyone really get a benefit in having the latency of their trade cut by milliseconds – except for the fact that their competitor is also spending the money to cut his latency? Should anyone care if a news event hits market prices in twenty-nine milliseconds rather than thirty milliseconds? Does it do anything to make the markets more efficient? Does it add any value to society?

    We usually do not think about trading in terms of social value, but trading often does have social value, and it should. The objective of trading is to provide liquidity to the market, and to make sure that prices best reflect all available information – the usual efficient market argument we all grew up with. The solution? How about having everyone agree to standards in terms of hardware and related configurations. A high-frequency arms limitation treaty. We could call it HALT.
  2. Interesting. I have talked to a few firms myself lately and the ONLY thing anyone is interested in is High Frequency Trading - regardless of your personal returns or anything else. We are either on the road to where the market will only exist as a playground for the quants or at some inflection point in that life cycle - which I believe is the case. I have read that some quants got hit really hard in March and early April as the junk started really rallying and that they have been deleveraging the past few weeks - which makes a lot of sense given the action in the market.
  3. MGJ


    Managed futures also did quite well.
  4. nitro


    The way we look at these expenditures is that often we make back the money in a few good trades. So it is not about expenditure, it is about the break-even point and the odds of achieving it.
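The break-even framing above is just expected-value arithmetic. A toy version, with every number invented for illustration (no real firm's costs or P&L are implied):

```python
# Toy break-even arithmetic for a latency upgrade. All figures hypothetical.
upgrade_cost = 500_000          # servers, colocation, integration
pnl_per_good_trade = 40_000     # extra P&L a speed-enabled trade captures
p_good = 0.25                   # odds any given opportunity converts
opportunities_per_day = 4       # speed-sensitive opportunities per day

expected_daily = pnl_per_good_trade * p_good * opportunities_per_day
breakeven_days = upgrade_cost / expected_daily

print(f"expected daily gain: ${expected_daily:,.0f}")
print(f"break-even in about {breakeven_days:.1f} trading days")
```

With these made-up inputs the spend pays for itself in a couple of weeks, which is the sense in which "a few good trades" covers the hardware bill; the decision hinges on the odds estimate, not the sticker price.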

    As far as benefit to society, what benefit to society does any trading have? People say liquidity, but if the growth in volume is any indication, this is almost entirely due to automation.

    Finally, markets should move in step functions in response to news. The fact that they don't and have to go through intermediary stages means they are grossly inefficient. The faster you get in the liquidity pool, the more profit. The more people that try to do that, the more markets will respond to news in step functions, hence making them more efficient.


    PS: By the way, hardware is nowhere near the highest cost in HFT. Actually taking advantage of the pipes, multi-core CPUs, etc. requires creative and expert programming, and that is where the money goes.
  5. Nice post.
  6. yup... funny thing is, it's also the highest cost with inefficient and error-prone programming... either way, you pay.
  7. High frequency trading is only used when the broker is the one discretionarily managing their accounts (so they generate boatloads of commissions).
  8. Euler


    I agree, nitro. Hardware and latency certainly deserve due consideration and expenditure, but obsessing over these alone (as the original blog author seems to imply that many are doing) will not keep one in the game long-term, I think. It's the entire system that counts, and as you say, other aspects are more important, in general.

    High freq trading is a very competitive business and only those that provide liquidity at the most competitive prices, yet still profit, will survive. Anyone who's in it and profitable today should assume that their edge could be gone in a matter of weeks, lost to even better systems. In the end, as you say, society benefits with more efficient markets.

  9. I would agree in most situations, but don't you think aggressive front running could mess this up? The scenario I am thinking of: an HFT takes liquidity in advance of, say, a price-insensitive mutual fund order, scares some more out of the book, then the mutual fund trade goes through at a worse price than it would have, and the price reverses later when the book replenishes. So in this case you don't have a step function, you get an overshoot and then a correction.

    I don't know how much mutual fund trading actually goes into the order book nowadays, but I think the effect would be similar if it gets crossed somewhere based on a price from the exchange. Besides, I am sure there is somebody worthy of some sympathy whose trading would get affected even if it is not a mutual fund.

    This would be bad for society - less liquidity and worse price discovery.
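The overshoot-then-correction story in the post above can be made concrete with a toy order book sweep. All price levels, sizes, and quantities here are invented; this is a sketch of the mechanism, not a model of real market microstructure.

```python
# Toy illustration of the front-running scenario: a price-insensitive
# buyer sweeps a thin ask book. If a front-runner takes the cheap
# liquidity first, the buyer fills at worse prices and the last trade
# overshoots before the book replenishes. All numbers are made up.

def sweep(book, qty):
    """Fill qty shares against [(price, size), ...] asks, best first.
    Returns (average fill price, last trade price, remaining book)."""
    cost, filled, last, rest = 0.0, 0, None, []
    for px, size in book:
        if filled < qty:
            take = min(size, qty - filled)
            cost += take * px
            filled += take
            last = px
            if size > take:
                rest.append((px, size - take))
        else:
            rest.append((px, size))
    return cost / filled, last, rest

asks = [(10.00, 500), (10.01, 500), (10.02, 500), (10.05, 500)]

# Case 1: the fund buys 1500 shares against the undisturbed book.
avg1, last1, _ = sweep(asks, 1500)

# Case 2: a front-runner buys 500 first, then the fund buys its 1500.
_, _, depleted = sweep(asks, 500)
avg2, last2, _ = sweep(depleted, 1500)

print(f"undisturbed fill: avg {avg1:.4f}, last trade {last1:.2f}")
print(f"front-run fill:   avg {avg2:.4f}, last trade {last2:.2f}")
```

In the front-run case the fund's average fill is worse and the tape prints the thin 10.05 level; once the book replenishes near the original prices, that print reverts. That is the overshoot and correction, as opposed to a clean step to a new information-driven price.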

  10. rosy2


    I agree. IMO the hardware and latency hype you hear in the news is advertising to get firms to buy the latest stuff. Latency and hardware are important, but having the fastest infrastructure around means nothing unless you have some kind of edge.

    It seems some firms are trying to create this ultra-fast system but have no strategies to put on it. And some strategies don't even need the extra speed.
    #10     Apr 23, 2009