Guess what, it's algo day again!
The New York Times headline shrieks: Flood of Errant Trades Is a Black Eye for Wall Street
Traders on Wednesday said that a rogue algorithm repeatedly bought and sold millions of shares of companies like RadioShack, Best Buy, Bank of America and American Airlines, sending trading volume surging. While the trading firm involved blamed a “technology issue,” the company and regulators were still trying to understand what went wrong.
Nanex has charts and graphs: What Really Happened, or How to Test Your New Market Making Software and Avoid Detection
the same firm may have been on both sides of each trade and were simply testing out new market making software. After all, the NYSE started their new Retail Liquidity Price Improvement Program the very same day. And many of the trade executions are inside the bid/ask; something very unusual for trades and quotes from NYSE. Also, virtually all these trades are for 100 shares - the minimum number required to be reported in the system. And the timing of the trades is very evenly spaced, too evenly spaced. And many examples show a buy and sell appearing at virtually the same time, so close that the second trade has the very next exchange sequence number.
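The tells Nanex describes (minimum 100-share lots, suspiciously even spacing, paired buys and sells) are the kind of thing you can screen for mechanically. Here is a toy sketch of that idea, not Nanex's actual method, assuming trades arrive as simple `(timestamp_seconds, shares)` tuples:

```python
from statistics import mean, pstdev

def looks_machine_generated(trades, gap_tolerance=0.05):
    """Flag a time-ordered stream of (timestamp_seconds, shares) trades
    as suspicious if every trade is the 100-share minimum lot and the
    inter-trade gaps are nearly constant (evenly spaced)."""
    if len(trades) < 3:
        return False
    all_min_lots = all(shares == 100 for _, shares in trades)
    gaps = [b[0] - a[0] for a, b in zip(trades, trades[1:])]
    # Coefficient of variation of the gaps: near zero means the trades
    # are evenly spaced -- "too evenly spaced" for human order flow.
    cv = pstdev(gaps) / mean(gaps) if mean(gaps) > 0 else 0.0
    return all_min_lots and cv < gap_tolerance

# Forty 100-share trades, one every 25 ms, trip the heuristic:
suspect = [(i * 0.025, 100) for i in range(40)]
print(looks_machine_generated(suspect))  # True
```

A real surveillance system would look at far more than this (sequence numbers, both sides of each trade, quote context), but the principle is the same: regularity that no human order flow produces.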
The New York Times says that Knight Capital have confirmed that they were indeed implementing new trading software, and that for a period of about 45 minutes, they were losing 10 million dollars a minute: Knight Capital Says Trading Mishap Cost It $440 Million
The problem on Wednesday led the firm’s computers to rapidly buy and sell millions of shares in over a hundred stocks for about 45 minutes after the markets opened. Those trades pushed the value of many stocks up, and the company’s losses appear to have occurred when it had to sell the overvalued shares back into the market at a lower price.
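The two figures quoted are consistent with each other, as a quick back-of-the-envelope check shows:

```python
# Sanity-check the reported numbers: a $440 million loss spread over
# roughly 45 minutes works out to about the "$10 million a minute"
# figure in the earlier report.
loss_total = 440_000_000
minutes = 45
per_minute = loss_total / minutes
print(f"${per_minute:,.0f} per minute")  # ≈ $9.8 million per minute
```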
The New York Stock Exchange apparently cancelled a number of invalid trades. It's still not clear to me what makes a trade invalid, or how the exchange decides to uphold or cancel a particular trade.
Felix Salmon says that part of the problem is that the traders don't understand the behavior of the code they write: When large-scale complex IT systems break.
The fact is that a lot of the stock-trading world, at this point, especially when it comes to high-frequency algobots, operates on a level which is simply beyond intuition. Pattern-detecting algos detect patterns that the human mind can’t see, and they learn from them, and they trade on them, and some of them work, and some of them don’t, and no one really has a clue why. What’s more, as we saw today, the degree of control that humans have over these algos is much more tenuous than the HFT shops would have you believe. Knight is as good as it gets, in the HFT space: if they can blow up this badly, anybody can.
This is where the story gets, as Harrison put it, weird. He explains: “When we got everything set up in New York, the trades were faster, just as we expected. We saved thirty-five milliseconds by moving everything east. All of that went exactly as we planned.”
“But all of a sudden, our trading costs were higher. We were paying more to buy shares, and we were receiving less when we sold. The trading speeds were faster, but the execution was inferior.”
I sympathise with all of this. Every day, I work with complex, intricate, sophisticated algorithms whose precise behavior I don't understand. I don't know any short-cuts: you have to be the sort of person who enjoys working with systems like these; you have to be disciplined, patient, thorough, and meticulous; you have to use every tool available to you (design and code reviews, test harnesses, monitoring tools, diagnostic logs, simulators, assertions in the code); and, most importantly, you have to develop and sustain a team of people who all can work together, who all are interested in building software that works, and who all bring different talents and skills to the problem.
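To make the "assertions in the code" point concrete: one thing such a system can do is refuse to trust itself. Here is a hedged illustration, a hypothetical risk guard (not Knight's actual safeguards, which evidently failed or were absent) that halts order flow when realized losses cross a hard limit:

```python
class RiskGuard:
    """A hypothetical kill-switch: track realized P&L and stop
    trading when losses exceed a hard limit, rather than trusting
    the strategy code to behave."""

    def __init__(self, max_loss):
        self.max_loss = max_loss      # largest tolerable loss, in dollars
        self.realized_pnl = 0.0
        self.halted = False

    def record_fill(self, pnl):
        """Record the profit or loss of one executed trade."""
        self.realized_pnl += pnl
        if self.realized_pnl < -self.max_loss:
            self.halted = True        # stop sending orders; page a human

    def may_trade(self):
        return not self.halted

guard = RiskGuard(max_loss=1_000_000)
guard.record_fill(-1_500_000)
print(guard.may_trade())  # False -- the guard has tripped
```

The guard itself is trivial; the hard part, as the paragraph above says, is the discipline of building, reviewing, testing, and monitoring it so that it actually fires when the forty-five bad minutes arrive.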
And even then, it takes time, money, sustained effort, and a fair bit of luck.