When dumb bank regulation is a good idea

By Felix Salmon
April 30, 2009

Justin Fox has a great post up on models of regulation, linking a comment from Gary Becker (“When you give a lot of discretion to regulators, they don’t use the tools that are given to them”) to a theory from Matt Yglesias that when it comes to regulation, it’s important not to try to be particularly clever or sophisticated. Where there’s a serious systemic risk, says Yglesias, we should “lean in with a heavy hand”: a satisficing solution which makes no bones about being suboptimal, but which rests on the insight that when you’re pushing the envelope of optimality, a regulatory oversight or mistake can be vastly more damaging than when crude and simple rules are in place.

In the perennial debate between rules-based and principles-based regulation, this is an argument for the former, even though I’m generally a fan of the latter. But Becker’s right: principles mean discretion, and discretion means danger.

What I would do, then, is implement a largely discretionary principles-based approach to bank regulation, but pair it with one or two heavy-handed rules: a cap of $300 billion on total assets, say, along with increasingly stringent tier-1 capital requirements the larger a bank gets, based very simply on total assets rather than on clever Basel II risk weightings. The weight of avoiding huge systemic risks would then be borne largely by the big, dumb rules, leaving the rest of the regulatory function to deal with smaller-scale issues on a more flexible, case-by-case, and intelligent basis. Sure, there would still be the risk of regulators getting things wrong or being blinded by science, but the downside would be much smaller.
