In April 2008 Markopolos tried again, with an e-mail message to Jonathan S. Sokobin, the new head of the commission's office of risk assessment. "Attached is a submission I've made to the S.E.C. three times in Boston," he wrote. "Each time Boston sent this to New York. Meagan Cheung, branch chief, in New York actually investigated this but with no result that I am aware of. In my conversations with her, I did not believe that she had the derivatives or mathematical background to understand the violations."
The NYT authors are all over this:
How does this happen? How can the person in charge of assessing Wall Street firms not have the tools to understand them? Is the S.E.C. that inept? Perhaps, but the problem inside the commission is far worse — because inept people can be replaced. The problem is systemic. The new director of risk assessment was no more likely to grasp the risk of Bernard Madoff than the old director of risk assessment because the new guy’s thoughts and beliefs were guided by the same incentives: the need to curry favor with the politically influential and the desire to keep sweet the Wall Street elite.
Maybe their analysis is right - but personally, I'm not sure that wishful thinking and incompetence are the whole story. So much of life has become so complex that it's poorly understood, and we seem to mostly accept this. Case in point: I have a PhD in computer science and about 25 years of experience, and I've done everything from soldering together microprocessors to writing compilers to proving impossibility theorems. Yesterday I spent 45 minutes fixing the wireless on my laptop - and I don't understand why what I did worked. I could in principle, I'm sure - but that would take even more time, which I'd rather spend doing other stuff.
So I can readily imagine the director of risk assessment saying: gosh, this is complicated. I don't understand it very well. Some experts say it's a problem, some say it's not...hmm, what should I do? What should he do? If you're in this situation every week, or every month, how do you decide what to dig into, and what to ignore? How do you tell the difference between what you think is true and what you hope is true? How many "experts" are in this situation every week, or every day?
Today's thought is: how should you handle complexity of this sort? Because there is a right way and a wrong way. If I were to boil the right way down to a few words, they would be:
- Study the data. Always start with what you know - the data. Plot it, graph it, run your favorite models on it, look at the outliers. After you've hacked at it for a while, you'll start to get a feel for what it's like. Madoff was an outlier in several respects - rate of return for the last few years, but also organizationally. If you're interested in modeling the market, you'd want to discount him as unrepresentative - if you're interested in detecting fraud, outliers are exactly the things you want to watch most closely (see the sketch after this list).
- Do the math. Complexity is your enemy, and good math is your best friend and most powerful weapon against it. (There is of course evil math, which makes the simple complex instead of making the complex simple, but hopefully you're smart enough to recognize the difference.)
- Ask questions. When you don't understand something, ask questions until you do. If you do understand something, think it through a little further, until you find a question you can't answer - then ask that one. Usually you'll find either that you understand the thing a little better afterward...or sometimes, that nobody really understands it.
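To make the first two points concrete, here's a minimal sketch in Python - the fund names, return numbers, and flag thresholds are all invented for illustration, not taken from any real data. The idea is just to score a handful of monthly-return series and surface the outliers: a Madoff-like series of steady gains with almost no losing months jumps out immediately, and that's exactly the series you'd want to dig into.

```python
import numpy as np

# Invented monthly-return series for a few funds -- none of these numbers
# are real. A Madoff-like series is suspiciously smooth: a steady ~1% a
# month with almost no losing months.
rng = np.random.default_rng(seed=0)
funds = {
    "fund_a": rng.normal(0.006, 0.040, 120),   # ordinary fund
    "fund_b": rng.normal(0.004, 0.050, 120),   # ordinary fund
    "fund_c": rng.normal(0.010, 0.002, 120),   # implausibly smooth
}

for name, r in funds.items():
    mean, vol = r.mean(), r.std()
    ratio = mean / vol            # return per unit of volatility
    down = (r < 0).mean()         # fraction of losing months
    # Crude, arbitrary thresholds -- the point is only to surface outliers.
    flag = "  <-- outlier, look closer" if ratio > 1.0 or down < 0.05 else ""
    print(f"{name}: mean={mean:.2%}  vol={vol:.2%}  "
          f"ratio={ratio:.2f}  down={down:.0%}{flag}")
```

The thresholds are deliberately crude - that's the point. Even a rough plot or score is enough to tell you which series deserve your questions; the hard part is deciding to look at all.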
Today's deep thought is: how much of modern computer science is about making the complexities in our world simpler and more understandable - and how much is making the world more complex and more opaque? And how complicated can a system be and still be controlled, maintained, and regulated by human beings?