In an article in the Financial Times, Nassim Taleb and Pablo Triana write:
Risk methods that failed dramatically in the real world continue to be taught to students in business schools, where professors never lose tenure for the misapplications of those methods. As we are writing these lines, close to 100,000 MBAs are still learning portfolio theory — it is uniformly on the programme for next semester. An airline company would ground the aircraft and investigate after the crash — universities would put more aircraft in the skies, crash after crash.
Years ago, a cousin of mine was fond of saying something similar. He was majoring in English at UCLA. He didn’t think much of his professors. “What happens when a professor is wrong?” he would ask. “When an engineer is wrong, the bridge falls down. When a doctor is wrong, the patient dies. What happens when an English professor is wrong?” The answer, of course, was “nothing”. Now we will find out what happens when finance professors are wrong.
Thanks to Dave Lull.
Seth,
I made a similar point recently with the eye of a nuclear engineer:
https://nuit-blanche.blogspot.com/2008/10/trust-but-verify.html
In particular, in no other engineering field would one have been allowed to go on for so long without some sort of risk mitigation strategy. As you know, risk is not just the probability of something happening but the product of that probability and the consequences it entails. With that in mind, and given that we already had near collapses before (LTCM), I am at a loss to see how this is not taught and, more importantly, enforced at some level. In the Nuclear Engineering curriculum, we have specific classes dedicated to risk analysis and to the study of all previous accidents. After their studies, folks can go either into design/production/safety analysis or into regulation enforcement. In the world described in this NYTimes article, it looks like everybody goes into “production”. None of the best minds seem to choose enforcement, as there is no incentive in the system for that.
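Igor's definition of risk as the product of probability and consequence can be sketched in a few lines of Python. The numbers below are purely illustrative, not real data:

```python
# A minimal sketch of the risk definition above: risk is not the bare
# probability of an event, but probability times consequence.
# All figures here are hypothetical, chosen only for illustration.

def risk(probability: float, consequence: float) -> float:
    """Expected loss: probability of the event times the cost if it occurs."""
    return probability * consequence

# A frequent, cheap failure...
routine = risk(probability=0.10, consequence=1_000)               # 100.0
# ...versus a rare, catastrophic one (an LTCM-style collapse):
catastrophe = risk(probability=0.001, consequence=1_000_000_000)  # 1,000,000.0

# The rare event dominates, even though it is 100x less likely.
assert catastrophe > routine
```

This is why focusing on the probability alone is misleading: the rare event with enormous consequences carries far more risk than the common one.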
Igor.
That’s a good way to put it, Igor. I read your post and I have a question: In the phrase “trust but verify” what does “trust” mean?
I think “trust” means you don’t assume bad faith, but you have to “verify” because it’s too important to leave to good faith assurances.
James does a good job of defining it.
I think Reagan stole the sentence from Thatcher. In the context I use it, the phrase was coined when the U.S. and the then-U.S.S.R. were checking on compliance with a treaty (maybe SALT II). It took the form of U.S. teams going to the U.S.S.R. to check whether a specific missile was being dismantled, and vice versa. The warheads were not destroyed per se, but the teams were somehow making sure that the missile being dismantled in front of them was not fitted with dummy warheads.
Trust was boosted even further when the foreign team would decide by the flip of a coin which missile to check, and find that it was indeed what it was supposed to be.
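The coin-flip inspection Igor describes is a form of random spot-checking, and its power is easy to quantify: a party faking some fraction of its missiles survives repeated random checks with exponentially shrinking probability. A minimal sketch, with hypothetical fractions and counts:

```python
# Why random spot-checks build trust: if a fraction f of items are fakes
# and each inspection picks one item uniformly at random, a cheater
# escapes n independent inspections with probability (1 - f) ** n.
# The numbers below are hypothetical, chosen only for illustration.

def undetected_probability(fake_fraction: float, inspections: int) -> float:
    """Chance that no fake is ever selected across all inspections."""
    return (1.0 - fake_fraction) ** inspections

# Even with only 10% of missiles faked, 20 coin-flip checks catch the
# cheater with probability of about 88%.
print(round(undetected_probability(0.10, 20), 2))  # 0.12
```

The randomness is the point: because the inspected party cannot predict which missile will be chosen, it cannot stage the inspection, so each clean check is genuine evidence.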
Igor.
Part of the problem is that “experts” have trouble evaluating the state of their knowledge and thus how seriously they should take their claims. Gary Taubes has made it clear, to me at least, that nutritional “experts” claim far more authority than their knowledge warrants. If you read what 19th-century astronomers had to say about the universe, the problem is not that they didn’t know more given their tools, but that they stated what they knew with far too much confidence.
Economics really is the dismal science, grounded totally in correlations and fancy math and theories. No one should take their theories too seriously. Or think of very smart psychoanalysts who thought they could perceive psychological causality and construct the truth of why people have psychological disorders — it was all nonsense.
Part of the problem is social-psychological: people in positions of authority have trouble saying, “The truth is I don’t know, but this is the best I can do right now.” They would lose their authority if they told the truth, both in their own eyes and the eyes of others…
James, would someone who believed only “verify” act differently than someone who believed “trust but verify”? The term “trust” seems to have no concrete consequences.
Seth: Assuming bad faith is likely to get you a lot less cooperation from the people you’re trying to correct. In some cases, e.g. a police investigation, you have the power of law backing you up, but in most others you are at least somewhat reliant upon the behaviour of the people you’re investigating. If you don’t make it clear you trust them, they’re far more likely to become hostile and hinder you. The Wikipedia guideline covers it a bit more, though not as much as I thought it did.
Thanks, James, but I’m still having a hard time visualizing what you are talking about. When Reagan said “trust but verify” what did he do that showed he “trusted”? If you don’t like the Reagan example, choose another example of the use of that phrase.
Seth,
In the case I was mentioning, there were all sorts of ways to cheat the other party. In effect, you could not have the other party’s warheads in your hands or in your labs. In fact, you could not even see them when inspecting the site. You could only see the missile in which they were housed, and the only information you had was what you derived from your own instruments, which you brought with you. In other words, if your detectors were detecting something you thought was a warhead, then it was a warhead, even though you had no absolute proof of it. Furthermore, you were bringing your own instrumentation into that foreign country, and as you know, a detector tells people a lot about what you know and don’t know. As you can see, in a hierarchical layer of secrets and goodwill, a simple yes or no in a written document couldn’t enforce that trust. In the end, with all the verification they allowed the other party to perform, they still had to resort to some type of trust.
Now that I think about it, the argument resembles very much the zero-knowledge proof argument explained by Bernard Chazelle in this Nature paper:
https://www.cs.princeton.edu/~chazelle/pubs/nature07.pdf
In other words, trust comes from the ability to ask several questions to one’s satisfaction without ever knowing the underlying proof.
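That intuition, confidence built from a series of random challenges without the secret ever being revealed, can be mimicked in a toy simulation. This is not Chazelle's construction or any real cryptographic protocol, just a sketch of the challenge-response idea:

```python
import random

# Toy sketch of the zero-knowledge intuition (not a real protocol):
# the verifier poses a series of random yes/no challenges. An honest
# prover, who holds the secret, can always answer correctly; a cheater
# without the secret can only guess each round.

def run_rounds(prover_knows_secret: bool, rounds: int, rng: random.Random) -> bool:
    """Return True if the prover answers every challenge correctly."""
    for _ in range(rounds):
        challenge = rng.randint(0, 1)
        if prover_knows_secret:
            answer = challenge          # the secret lets the prover respond correctly
        else:
            answer = rng.randint(0, 1)  # the cheater can only guess
        if answer != challenge:
            return False
    return True

rng = random.Random(0)
# The honest prover survives every round; a cheater survives n rounds
# with probability 2**-n, yet the verifier never learns the secret.
assert run_rounds(True, 20, rng)
```

Each round halves a cheater's chance of surviving, so after n rounds the verifier's confidence is 1 − 2⁻ⁿ, while nothing about the secret itself has been transmitted. That is the sense in which verification can produce trust without disclosure.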
Reagan used the term, I think, to differentiate it from the generic trust imparted by treaties signing where only accidents would reveal non-compliance by some parties.
Igor.
A bit of humor in the form of an aphorism from the Sufi tradition:
Trust in God, but tie your camel first.
A bit of anecdotal evidence. I went to a talk titled “The Role of Statisticians in Quantitative Finance before and after the Crash”, given on November 5th by an adjunct professor at UC Berkeley. The talk was mostly about how great portfolio theory was. The only mention of the crash was about how lucky the speaker was to have left a local risk management firm just weeks before the crash.
Thanks, Igor
When English professors are wrong, you can get very damaging effects in your culture. Not nothing.
Anthony, my cousin meant there is no corrective feedback when English professors are wrong. But in any case your comment is intriguing. What are the “very damaging effects”?
Well, the study and promotion of literature is a central aspect of a society’s culture. When you have a bunch of English Professors buying into, say, bad psychology, this ripples out into the culture at large, causing dysfunction (i.e., pain, suffering, sadness, anxiety, and so on) where it need not be.
When doctors are wrong, a patient can be harmed. When figures in or related to literature are wrong, a culture or people can be harmed. There is an effect, and so there is potential corrective feedback, but there might not be as much of a sense of responsibility in the latter …
One of the real problems is that the applied statistics crowd and the time and motion guys were so successful, across genres and disciplines. All economists since have wanted to make the same kind of splash, to be experts in all areas like the major textbook writers once were (mostly due to specialized math skills that got them involved in NASA and a lot of other areas).
Interesting, isn’t it, how much of what is taught is really the trend or fad of the month (so to speak) rather than science.