Opinion


When large-scale complex IT systems break

By Felix Salmon
August 1, 2012

It’s rogue algo day in the markets today, which sounds rather as though the plot of The Fear Index has just become real, especially since the firm at the center of it all is called The Dark Knight, or something like that. At heart, however, is something entirely unsurprising: weird things happen when you get deep into the weeds of high-frequency trading, a highly complex system which breaks in entirely unpredictable ways.

In fact, it’s weirder than that: HFT doesn’t just break in unpredictable ways, but works in unpredictable ways, too. Barry Ritholtz has an excerpt from Frank Partnoy’s new book, Wait, all about an HFT shop in California called UNX:

By the end of 2007, UNX was at the top of the list. The Plexus Group rankings of the leading trading firms hadn’t even mentioned UNX a year earlier. Now UNX was at the top, in nearly every relevant category…

Harrison understood that geography was causing delay: even at the speed of light, it was taking UNX’s orders a relatively long time to move across the country.

He studied UNX’s transaction speeds and noticed that it took about sixty-five milliseconds from when trades entered UNX’s computers until they were completed in New York. About half of that time was coast-to-coast travel. Closer meant faster. And faster meant better. So Harrison packed up UNX’s computers, shipped them to New York, and then turned them back on.

This is where the story gets, as Harrison put it, weird. He explains: “When we got everything set up in New York, the trades were faster, just as we expected. We saved thirty-five milliseconds by moving everything east. All of that went exactly as we planned.”

“But all of a sudden, our trading costs were higher. We were paying more to buy shares, and we were receiving less when we sold. The trading speeds were faster, but the execution was inferior. It was one of the strangest things I’d ever seen. We spent a huge amount of time confirming the results, testing and testing, but they held across the board. No matter what we tried, faster was worse.”

“Finally, we gave up and decided to slow down our computers a little bit, just to see what would happen. We delayed their operation. And when we went back up to sixty-five milliseconds of trade time, we went back to the top of the charts. It was really bizarre.”
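
A rough sanity check on those numbers (the route length and fiber speed below are my own assumptions, not figures from the book): signals in optical fiber travel at roughly two-thirds the speed of light, so a coast-to-coast round trip alone eats up a few tens of milliseconds.

    # Back-of-envelope check on coast-to-coast propagation delay; assumed figures.
    SPEED_IN_FIBER_KM_PER_S = 200_000   # roughly (2/3) * 300,000 km/s
    ROUTE_KM = 4_000                    # assumed length of an LA-to-New-York fiber path

    one_way_ms = ROUTE_KM / SPEED_IN_FIBER_KM_PER_S * 1_000
    round_trip_ms = 2 * one_way_ms
    print(f"one way: {one_way_ms:.0f} ms, round trip: {round_trip_ms:.0f} ms")
    # -> one way: 20 ms, round trip: 40 ms, in the same ballpark as the
    #    thirty-five milliseconds Harrison says UNX saved by moving east.

In other words, physics alone accounts for most of the speed-up Harrison measured; the puzzle is why getting it made execution worse.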

Partnoy has a theory about what’s going on here — something about “optimizing delay”. But that sounds to me more like ex-post rationalization than anything which makes much intuitive sense. The fact is that a lot of the stock-trading world, at this point, especially when it comes to high-frequency algobots, operates on a level which is simply beyond intuition. Pattern-detecting algos detect patterns that the human mind can’t see, and they learn from them, and they trade on them, and some of them work, and some of them don’t, and no one really has a clue why. What’s more, as we saw today, the degree of control that humans have over these algos is much more tenuous than the HFT shops would have you believe. Knight is as good as it gets, in the HFT space: if they can blow up this badly, anybody can.

I frankly find it very hard to believe that all this trading is creating real value, as opposed to simply creating ever-larger tail risk. Bid-offer spreads are low, and there’s a lot of liquidity available on a day-to-day basis, but it’s very hard to put a dollar value on that liquidity. Let’s say we implemented a financial-transactions tax, or moved to a stock market where there was a mini-auction for every stock once per second: I doubt that would cause measurable harm to investors (as opposed to traders). And it would surely make the stock market as a whole less brittle.
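
To make the mini-auction idea a bit more concrete, here is a minimal sketch of how a once-per-second call auction could uncross the orders that accumulate during each interval. Everything here, from the Order class to the midpoint pricing rule, is an illustrative assumption, not a description of how any real exchange works.

    # Illustrative sketch of a periodic single-price ("call") auction.
    from dataclasses import dataclass

    @dataclass
    class Order:
        side: str   # "buy" or "sell"
        price: int  # limit price in cents
        qty: int    # shares

    def uncross(orders):
        """Match the highest bids against the lowest asks; clear every fill at one price."""
        bids = sorted((o for o in orders if o.side == "buy"), key=lambda o: -o.price)
        asks = sorted((o for o in orders if o.side == "sell"), key=lambda o: o.price)
        i = j = volume = 0
        bid_left = bids[0].qty if bids else 0
        ask_left = asks[0].qty if asks else 0
        marginal_bid = marginal_ask = None
        while i < len(bids) and j < len(asks) and bids[i].price >= asks[j].price:
            fill = min(bid_left, ask_left)
            volume += fill
            marginal_bid, marginal_ask = bids[i].price, asks[j].price
            bid_left -= fill
            ask_left -= fill
            if bid_left == 0:
                i += 1
                bid_left = bids[i].qty if i < len(bids) else 0
            if ask_left == 0:
                j += 1
                ask_left = asks[j].qty if j < len(asks) else 0
        if volume == 0:
            return None, 0
        return (marginal_bid + marginal_ask) / 2, volume  # one simple pricing rule

    # One second's worth of orders, cleared in a single batch:
    batch = [Order("buy", 1005, 300), Order("buy", 1001, 200), Order("sell", 1000, 400)]
    print(uncross(batch))  # -> (1000.5, 400): 400 shares trade at 10.005 dollars

The mechanism itself is trivial; whether batching away the millisecond race would cost investors anything measurable is the empirical question.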

It’s worth recalling what Dave Cliff and Linda Northrop wrote last year:

The concerns expressed here about modern computer-based trading in the global financial markets are really just a detailed instance of a more general story: it seems likely, or at least plausible, that major advanced economies are becoming increasingly reliant on large-scale complex IT systems (LSCITS): the complexity of these LSCITS is increasing rapidly; their socio-economic criticality is also increasing rapidly; our ability to manage them, and to predict their failures before it is too late, may not be keeping up. That is, we may be becoming critically dependent on LSCITS that we simply do not understand and hence are simply not capable of managing.

Today’s actions, I think, demonstrate that we’ve already reached that point. The question is whether we have any desire to do anything about it. And for the time being, the answer seems to be that no, we don’t.

Comments

Salmon wrote: “Today’s actions, I think, demonstrate that we’ve already reached that point.”

We reached that point a long time ago. Remember the flash crash?

Computers are much faster than any living creature, but they run software created by humans who often make mistakes. Failure is not only an option, it is a common occurrence.

And Salmon failed to take the next step.

Why are gambling sites in the USA prohibited from allowing people to bet with real money, while banksters are allowed to gamble with billions if not trillions?

Either allow all forms of Internet gambling or return to the days when trades needed to be handled by humans. I vote for the latter.

Posted by baroque-quest
 

You might want to look at this interesting post on the Flash Crash as studied by the Lawrence Berkeley National Lab here: http://oecdinsights.org/2012/06/28/what-can-nascar-teach-nasdaq-about-avoiding-crashes/ which also has links to a scholarly article on the same subject. I contributed a companion piece on economic models, which comments on the nature of algorithmic trading at http://oecdinsights.org/2012/06/27/going-with-the-flow-can-analog-simulations-make-economics-an-experimental-science/ and the lead article in the series, celebrating Turing, is also of interest.

This all underscores the question as to the social value of such markets and their unstable hyper-liquidity.

Posted by JRHulls
 

Distributed systems are almost impossible to analyze and can behave in chaotic fashions. What’s new?

I’m a little less worried about this than Felix, as I don’t see how this will bring down civilization. (They ought to have allowed the Flash Crash trades to stand.) Hang your hat on unconstrained algorithms and you deserve to lose your shirt from time to time.

Posted by TFF
 

TFF wrote: “Distributed systems are almost impossible to analyze and can behave in chaotic fashions.”

Nonsense.

If Newegg, a favorite supplier for IT people, were to add servers in a new city to meet demand, prices would not rise or fall. The only effect would be that availability would be a little better and response times would be a little less. It is a simple concept known as load balancing.

Yes, the two examples are not completely comparable, but competent people are expected to test their systems before putting them online. If systems cannot be thoroughly tested due to the nature of the business, perhaps the business should not exist.

The problem with high-speed traders is that they use extremely complex algorithms that depend on a myriad of factors. The speed of the trade is paramount. The flash crash and this newest episode have proven that this industry is dangerous.

I agree with TFF regarding his statement “Hang your hat on unconstrained algorithms and you deserve to lose your shirt from time to time.” All businesses like the one Salmon described should be shuttered and their equipment given to schools.

Posted by baroque-quest
 

Felix, you say “Let’s say we implemented a financial-transactions tax, or moved to a stock market where there was a mini-auction for every stock once per second: I doubt that would cause measurable harm to investors (as opposed to traders). And it would surely make the stock market as a whole less brittle.” Do you have any concrete data to back up these feelings? Or is it just pure speculation?

Posted by timothydh
 

“If systems cannot be thoroughly tested due to the nature of the business”

Doesn’t that apply in this case?

As for Newegg’s servers, their task is much simpler — distributing static information that is not time sensitive. It isn’t a big deal if it takes a couple minutes for an update to push through the network (and it might happen more quickly than that).

Might a market be described as a dynamic consensus? A much knottier problem, and one whose behavior depends on all of the participants. Might be able to program something robust if you had a single programming team working with ALL the HFT machines, but they are each doing their own thing (and reacting to code they don’t control). I don’t see how you can possibly pretend to test that!

Posted by TFF
 

A financial transactions tax is indeed the perfect remedy for this and desirable for other reasons as well.

Posted by SpringTexan
 

Looks like Knight is now going to go bankrupt or be sold. With such a visible downside, this will put a real dent in the number of people wanting to invest in HFT companies.

Posted by JustinCormack
 

I do like the idea of a financial transactions tax, though I’m unclear on how that would impact “market makers”? Would the transactions be rewritten to directly match buyers and sellers? Would Official Market Makers be exempt?

Amazing that Knight managed to wipe out 3-4 years of earnings and a third of their equity capital with a single mistake. Somebody won’t be getting a bonus this Christmas! The dollar amount isn’t quite up to the London Whale, but the firm is much smaller.

Posted by TFF
 

I don’t understand at all why people like the financial transaction tax. How is trading once per hour more socially undesirable than trading once per month? Why would we try to regulate markets with such a blunt instrument?

Instead, let’s make people responsible for their trades. If you program your computer such that it sells IBM at 0.01, well, looks like you should have programmed your computer better and are going to have to come up with some money. Secondly, if your behavior causes exchanges to malfunction and be unable to deliver accurate quotes (as in flash crash), let’s target that behavior and ban it. But on a day-to-day basis, even high-frequency markets can work fine as long as we all appreciate that stocks are going to bounce around and we aren’t entitled to have them smoothly compound every day at a 10% annual rate.

Posted by najdorf
 

@najdorf, trading once per hour probably doesn’t create any serious issues. But when you are trading dozens of times a MINUTE, the speed of the transactions clearly outpaces any possibility of human control.

I like the idea of making people responsible for their trades (the Flash Crash trades should never have been canceled), but when these issues crop up several times a year I have to question your assertion that “even high-frequency markets can work fine”. Trades that bring down firms add to the systemic instability.

And personally I love volatility. It is very profitable for somebody who trades on fundamentals. Gotta wonder about the chumps who are taking the other side of those trades, though.

Posted by TFF
 

I wrote: “If systems cannot be thoroughly tested due to the nature of the business”
TFF wrote: “Doesn’t that apply in this case?”

I think we are in agreement. I do not believe that computer trading should exist, period. These clowns have proven that the nature of the industry is indeterminate. It is not remotely possible to test all of the variables, not to mention that variables continue to be added.

But then again, I believe that hedge funds and just about every other company on Wall Street are parasites.

You are correct that my Newegg example was not nearly as complicated as computer trading. A better example would have been Amazon, which has become fairly complex, but things are still sometimes broken. If people cannot buy on Amazon, no big deal, but if the markets crash, big deal.

Posted by baroque-quest
 

TFF wrote: “I like the idea of making people responsible for their trades”

I’ll go further. I want a financial transactions tax, but that is not enough. If a company crashes the market, I want management to be held personally responsible and not the slap-on-the-wrist nonsense that we see from the SEC. The corporate veil should not just be pierced, but burned away. Crashing the market costs a bundle and I want the assets of the company AND management to be used to make people whole. If that forces management into chapter 7 bankruptcy, too bad, so sad.

Posted by baroque-quest
 

Complex systems are bound to become unstable. Interconnectivity creates conditions for “Dragon Kings”, Black Swans, Tail Events, whatever you want to call them. The phenomenon is not confined to the IT world. We’ve reached criticality and complexity levels that breed instability.
http://thechinonomist.blogspot.com/2012/04/as-delicate-as-butterfly.html

Posted by jCarl
 

My dissertation research was on intelligent packet routing algorithms for large-scale networks. Because of that experience, and because of several famous computing failures in recent history, I have become a fan of the human in the loop.

Posted by Curmudgeon
 

This and other topics that are relevant for speed traders and institutional investors will be discussed at High-Frequency Trading Leaders Forum 2013 London, next Thursday March 21.

Posted by EllieKim
 
