Tuesday, August 28, 2012

A steady murmur about HFT

Weeks after the Knight Capital mini-crisis (for which, I think, the best current explanation is the Nanex theory), the discussion about High Frequency Trading continues at a steady background-noise level.

A few postcards from the din:

  • Shortly before Reuters shut its blogs down, Felix Salmon posted this blog article, pointing to Larry Tabb's piece at Wall Street & Technology: Fragmentation Is Redlining the Markets:
    Given the events of the past six months, the SEC should think hard about the market structure it has created, and do its utmost to rein it in. While the SEC can't stop computers from getting faster, there is no reason it can't reduce price and venue fragmentation, which should slow the market down, reduce message traffic and lower technology burdens.
  • A nice piece by Renee DiResta on the O'Reilly site: Wall Street’s robots are not out to get you
    But it’s nonsense to make the leap from one brokerage experiencing severe technical difficulties to claiming that automated market-making creates some sort of systemic risk. The way the market handled the Knight fiasco is how markets are supposed to function — stupidly priced orders came in, the market absorbed them, the U.S. Securities and Exchange Commission (SEC) and the exchanges adhered to their rules regarding which trades could be busted (ultimately letting most of the trades stand and resulting in a $440 million loss for Knight).
  • The DiResta article points to a recent New York Times article by Nathaniel Popper: On Wall Street, the Rising Cost of Faster Trades.
    “They’ve reached the point where the competition is measured in microseconds and there are essentially no benefits to the public at that level,” said Lawrence E. Harris, the former chief economist at the Securities and Exchange Commission, and now a professor at the University of Southern California.
  • And, at NPR's Planet Money, an article about Thomas Peterffy of Interactive Brokers: A Father Of High-Speed Trading Thinks We Should Slow Down
    Peterffy says automation has done some very good things for the world. It's made buying and selling stocks much, much cheaper for everyone.

    But Peterffy thinks the race for speed is doing more harm than good now. "We are competing at milliseconds," he says. "And whether you can shave three milliseconds off an order has absolutely no social value."

  • Lastly, on the Marginal Revolution site, an interesting article by Alex Tabarrok: HFT versus the sub-optimal Tick
    There is good evidence for Salmon’s hypothesis that firms care about nominal share price. In Tick Size, Share Prices, and Stock Splits (JSTOR, H/T OneEyedMan), James Angel makes the interesting point that even as the S&P and CPI soared between 1924 and 1994, the average nominal price of a stock stayed very flat, near $32. Angel’s explanation is that firms use the IPO price and splits to keep a consistent relationship between tick size and share price because: "A large relative tick size also encourages dealers to make a market in a stock….a larger tick provides a higher minimum round-trip profit to a dealer who can buy at the bid and sell at the offer." (The arithmetic behind that point is sketched just below.)
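
To make Angel's point concrete, here is a minimal Python sketch. The tick size and the $32 / $100 / $320 prices are my own illustrative numbers, not figures from any of the articles above: with a fixed $0.01 tick, a dealer's gross profit per share is the same at any price, but the relative tick, and with it the return per dollar of inventory, shrinks as the nominal share price rises.

    TICK = 0.01      # minimum price increment, in dollars
    SHARES = 1_000   # size of one hypothetical round trip

    for price in (32, 100, 320):
        gross = TICK * SHARES     # buy SHARES at the bid, sell at the offer one tick higher
        capital = price * SHARES  # dollars tied up in the dealer's position
        print(f"${price:>3} stock: relative tick {TICK / price * 10_000:5.2f} bps, "
              f"${gross:.2f} gross on ${capital:,.0f} of inventory "
              f"({gross / capital:.4%} per round trip)")

The per-share profit is a penny in every case; what changes is the return on the capital the dealer commits, which is exactly why, on Angel's account, firms split their stock to keep the price (and hence the relative tick) in a range where market-making stays attractive.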
Lots of discussion, no easy solution, no obvious consensus: the debate continues.

By the way, Felix Salmon has resumed writing at his personal website. I still wonder why Reuters shut its blogs down...
