Regulation and the day the machines took over – The Scott McCleskey Report

By Guest Contributor
October 13, 2010

HIGHFREQUENCY/By Scott McCleskey, Complinet

It took five months, a PhD in physics, a Nobel Prize winner and a staff of quants, but the SEC and CFTC have now figured out what happened to the markets during the “flash crash” in May. Given the well-orchestrated string of sneak peeks the SEC had offered before the publication of the joint report, the findings weren’t particularly surprising. Nevertheless, they are enlightening, both for what they tell us about the state of the markets and for what they tell us about the assumptions we have made in regulating them. The upshot: markets aren’t efficient, and rulemakers should stop acting as if they are.

This is important because regulatory reform is now going in two directions at once. Many of the Dodd–Frank provisions continue the deregulatory approach of simply improving “transparency” by getting more data into the market. This approach implicitly assumes that markets work efficiently in the way textbooks have described them since the days of Adam Smith, and that all markets really need is more information. While it’s true that you can’t regulate a market that isn’t transparent, it’s foolish to believe that the $600trn over-the-counter derivatives market, for instance, will spontaneously regulate itself once its participants begin reporting their activities.

The second approach to regulatory reform has been to identify and then ban or restrict practices inimical to a stable market. This approach is reflected in SEC and CFTC moves to ban flash orders, place tighter curbs on short selling, and potentially limit the automated execution of large orders. Unlike the passive approach of regulation-by-transparency, this approach is active and assumes that markets are inefficient.

In spite of real-world evidence that markets are in fact inefficient, our existing regulatory environment assumes otherwise, placing more emphasis on a two-hundred-year-old theory than on empirical evidence. While the principles of supply and demand still apply generally, they can’t be applied without modification to modern markets. If they could, the rush toward computerized trading would be a great thing. More trades would mean more liquidity, meaning more interaction and better price discovery. High-frequency trading (HFT) would make the markets approach perfect efficiency. But we’ve learned that markets don’t always benefit from a tidal wave of information. Yes, HFT certainly provides liquidity – but it also soaks it up as the algorithms duke it out, executing thousands of trades in a few seconds. The net effect is more volume but not more liquidity, since the trades are simply batting orders around in a circle – as happened on May 6.
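To see why churning volume is not the same thing as supplying liquidity, consider a toy simulation of that hot-potato dynamic (my own illustration in Python, with made-up account names and sizes, not anything taken from the joint report): a single block of contracts gets passed back and forth among a few hypothetical HFT accounts, so reported volume multiplies with every pass while the net risk the group actually absorbs never grows.

```python
import random

# Toy sketch of the "hot potato" effect: a fixed block of contracts is passed
# among a handful of hypothetical HFT accounts. Traded volume climbs with every
# pass, but the net inventory absorbed by the group never grows.

random.seed(6)

accounts = {name: 0 for name in ["HFT_A", "HFT_B", "HFT_C", "HFT_D"]}
block = 3000            # contracts initially sold into the group by a large seller
volume = block          # the initial sale counts as traded volume
accounts["HFT_A"] += block

for _ in range(20):     # twenty rapid-fire passes of the same inventory
    seller = max(accounts, key=accounts.get)              # longest account sheds risk
    buyer = random.choice([a for a in accounts if a != seller])
    size = accounts[seller]                               # pass the whole position along
    accounts[seller] -= size
    accounts[buyer] += size
    volume += size

net_absorbed = sum(accounts.values())   # inventory the group actually holds
print(f"traded volume:        {volume} contracts")
print(f"net inventory change: {net_absorbed} contracts")
# Volume ends up more than 20 times the original block, yet only the original
# 3,000 contracts of risk were ever absorbed: plenty of trading, no new liquidity.
```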

So the lesson of the flash crash is that we have come to a point where increasing the amount and speed of data has diminishing or even negative returns on market efficiency. It is a market in which computer logic has taken over from human judgment. And the computers that have taken over the market aren’t the coldly omniscient machines that take over the world in science fiction. Even if their programmers have brains the size of watermelons, trading algorithms are vulnerable to the assumptions programmed into them, including assumptions about what the other algorithms are assuming. No human can keep up with this crushing volume of data; the people are just along for the ride. Compound this with a decentralized market structure that fragments liquidity but transmits risk, complex instruments and opaque counterparty entanglements, and we reach a point where the natural state of markets is controlled chaos. No one can accurately judge their own exposure, much less that of their potential counterparties, setting the stage for another system-wide run on the bank like the one that followed the fall of Lehman.

Such a market cannot be regulated with the same approach, tools and resources that were sufficient a few years ago. Regulations need to recognize the markets as they really exist: full of noise and chaos, not order and efficiency. That means the end of outsourcing regulation to theoretical market forces. To be effective in the markets of this century, regulation must be active in identifying and restricting practices that tend to destabilize the markets, as well as innovations designed merely to game the system. Regulation must now be based on an Inefficient Market Hypothesis.

Regulators and exchanges must also be armed to the teeth with the technology required to keep up with today’s hyper-fast markets. They will never be able to exercise their oversight responsibilities if they don’t come to work each day with the same level of technology as those whom they are supposed to supervise; they would be like traffic cops standing in the middle of the track at the Indy 500. Exchanges need surveillance technologies that operate at the same speed as the trading programs, that communicate with each other, and that are able to call “time out” better than they did on May 6. Regulators need to establish departments with the technology to digest and analyze the incoming flood of newly transparent data and make assessments in real time, not five months later. IT must be a core function of the regulators, not a support function. The technology and talent to do these things already exist, but they can’t be bought with a Tea Party-size budget. Congress has to hold its nose and write some big checks to the regulators, or (gulp) the regulators must impose fees on high-frequency trading to help offset the cost of policing it. Lastly, the regulators will need to find a way to retain the talent they recruit to look after this function.
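For a sense of what calling “time out” can look like in practice, here is a minimal sketch of a single-stock trading-pause check, loosely modeled on the ten-percent-in-five-minutes pause rules adopted after May 6; the thresholds, the TickWindow class and the sample prices are illustrative assumptions, not any exchange’s actual surveillance system.

```python
from collections import deque

WINDOW_SECONDS = 5 * 60   # rolling five-minute window
MOVE_THRESHOLD = 0.10     # a 10% move inside the window triggers a pause


class TickWindow:
    """Rolling window of (timestamp, price) ticks for one security."""

    def __init__(self):
        self.ticks = deque()

    def add(self, timestamp: float, price: float) -> bool:
        """Record a tick; return True if trading should be paused."""
        self.ticks.append((timestamp, price))
        # Drop ticks that have aged out of the rolling window.
        while self.ticks and timestamp - self.ticks[0][0] > WINDOW_SECONDS:
            self.ticks.popleft()
        prices = [p for _, p in self.ticks]
        high, low = max(prices), min(prices)
        return (high - low) / high >= MOVE_THRESHOLD


# Hypothetical tick stream: the price slides from 40.00 to 35.50 within 150 seconds,
# an 11% drop, so the check fires on the last tick.
window = TickWindow()
for t, price in [(0, 40.0), (60, 39.5), (120, 38.0), (150, 35.5)]:
    if window.add(t, price):
        print(f"t={t}s price={price}: pause trading")
        break
```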

In the end, the flash crash report identified as culprits the very characteristics that distinguish modern markets from those of our Twentieth Century ancestors, namely the outsourcing of trading decisions and market discipline to technology. If we think we can regulate these markets with Twentieth Century rules reflecting Eighteenth Century theory, we will come to rue the day the machines took over.

Scott McCleskey is managing editor, North America, at Complinet and is the author of When Free Markets Fail: Saving the Market When It Can’t Save Itself (John Wiley and Sons). The views he expresses in this column are his own and do not necessarily reflect those of Complinet or its parent, Thomson Reuters Inc.

Complinet, part of Thomson Reuters, is a leading provider of connected risk and compliance information and on-line solutions to the global financial services community. Established in 1997, Complinet serves over 100,000 industry professionals in 80+ countries. Our connected approach provides one single place to get all the relevant regulatory news, analysis, rules and developments from the region to support firms in highly regulated industries.
