Ecologists understand the exploit/explore distinction. When an animal looks for food, it can either exploit (use previous knowledge of where food is) or explore (try to learn more about where food is). With ants, the difference is visible. A trail of ants to a food source: exploit. A solitary wandering ant: explore. With other animals, the difference is more subtle. You might think that when a rat presses a bar for food, that is pure exploitation. However, my colleagues and I found that when expectation of food was lower, there was more variation — more exploration — in how the rat pressed the bar. In a wide range of domains (genetics, business), less expectation of reward leads to more exploration. In business, this is a common observation. For example, yesterday I read an article about the Washington Post that said its leaders failed to explore enough because they had a false sense of security provided by their Kaplan subsidiary. “Thanks to Kaplan, the Post Company felt less pressure to make hard strategic choices—and less pressure to venture in new directions,” wrote Sarah Ellison.
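To make the relationship concrete, here is a toy sketch in Python. It is not the rat experiment or any published model; the formula and the reward numbers are invented for illustration. The only point it captures is the direction of the effect: the lower the expected reward of the known option, the more often the decision rule explores.

```python
import random

def choose(expected_reward, explore_gain=1.0):
    """Toy decision rule: the lower the expected reward of the known option,
    the higher the chance of exploring. The inverse relationship is the point;
    the exact formula is made up for illustration."""
    p_explore = 1.0 / (1.0 + explore_gain * expected_reward)
    return "explore" if random.random() < p_explore else "exploit"

# A rich known source -> mostly exploit; a poor one -> much more exploration.
for expected in (5.0, 1.0, 0.2):
    picks = [choose(expected) for _ in range(10_000)]
    share = picks.count("explore") / len(picks)
    print(f"expected reward {expected}: explored {share:.0%} of the time")
```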
Striking the right balance between exploitation and exploration is crucial. If an animal exploits too much, it will starve when its supply of food runs out. If it explores too much, it will starve right away. Every instance of collapse in Jared Diamond’s Collapse: How Societies Choose to Fail or Succeed was plausibly due to too much exploitation and too little exploration (a point Diamond, even though he is a biologist, never makes). I’ve posted several times about my discovery that treadmill walking made studying Chinese more pleasant. I believe walking creates a thirst for dry knowledge. My evolutionary explanation is that this pushed prehistoric humans to explore more.
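To see why balance matters, here is a second toy simulation. Again a sketch: the patch size, payoffs, and 10% discovery chance are assumptions I made up, not data from Diamond or anyone else, and the exact numbers mean nothing. The shape of the result is the point: a forager that never explores does fine until its patch runs out and then starves, a forager that explores almost all the time starves almost immediately, and a modest amount of exploration keeps it alive.

```python
import random

def lifespan(p_explore, patch_food=50, energy=20, days=200, seed=0):
    """Toy foraging model. Each day the forager either exploits the current
    patch (+2 energy, patch shrinks by 1) or explores (-1 energy, 10% chance
    of finding a fresh patch). Living costs 1 energy per day."""
    rng = random.Random(seed)
    for day in range(1, days + 1):
        if rng.random() < p_explore:          # explore
            energy -= 1
            if rng.random() < 0.10:
                patch_food = 50               # found a fresh patch
        elif patch_food > 0:                  # exploit the known patch
            patch_food -= 1
            energy += 2
        energy -= 1                           # daily metabolic cost
        if energy <= 0:
            return day                        # starved
    return days                               # survived the whole run

for p in (0.0, 0.05, 0.2, 0.9):
    results = [lifespan(p, seed=s) for s in range(500)]
    print(f"p_explore={p:.2f}  mean days survived={sum(results)/len(results):.0f}")
```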
I have never heard an economist make this point: the need for proper balance between exploit and explore. It is relevant in a large fraction of discussions about how to spend money. For example, yesterday I listened to the latest EconTalk podcast, a debate between Bob Frank and Russ Roberts about whether it would be a good idea for the American government to spend $2 trillion on infrastructure projects (fix bridges, etc.). Frank said it would create jobs, and so on — the usual argument. Roberts said if fixing bridges was such a good idea, why hadn’t this choice already been made? Roberts could have said, but didn’t, that massive government shovel-ready expenditures, such as $2 trillion spent on infrastructure repair, inevitably push the exploit/explore balance toward exploit, which is dangerous. This is an argument against all Keynesian stimulus-type spending. I have heard countless arguments about such spending. I have never heard this one made. If you want examples of how the American economy suffers from a profound lack of useful new ideas, look at health care. As far as I know, there are no recorded instances of a society dying because of too much exploration. The problem is always too much exploitation. People at the top — with a tiny number of exceptions, such as the Basques — overestimate the stability of their position. At the end of The Economy of Cities, Jane Jacobs says that if a spaceship landed on Earth, she would want to know how the civilization that built it avoided overexploitation. When societies exploit too much and explore too little, said Jacobs, problems (in our society, problems such as obesity, autism, autoimmune disease, etc.) stack up unsolved. Today is China’s birthday. Due to overexploitation, I believe China is in even worse economic shape than America. Ron Unz, whom I respect, misses this.
My broad point is that a lot of economic thinking, especially about growth and development, is one-dimensional (measuring primarily growth of previously existing goods and services — exploitation) when it should be two-dimensional (measuring both (a) growth of existing stuff and (b) creation of new goods and services). Exploration (successful exploration) is inevitably tiny compared to exploitation, but it is crucial that there be enough of it. If there is a textbook that makes this point, I haven’t seen it. An example of getting it right is Hugh Sinclair’s excellent new book Confessions of a Microfinance Heretic (copy sent to me by the publisher), which debunks microcredit. Leaving aside the very high interest rates, the use of microcredit loans to buy TVs, and so on, microcredit is still a bad idea because the money is, at best, used for a business that copies an existing business. (The higher the interest rate, the less risk a loan recipient dares take.) When a new business copies an already-existing business, you are taking an existing pie (e.g., demand for milk, if the loan has been used to buy a cow and sell its milk) and dividing it into one more piece. The pie does not get bigger. As Sinclair says, the notion that dividing existing pies into more pieces will “create a poverty-free world” is, uh, not worthy of a Nobel Prize.
Sure, it’s hard to measure growth of useful knowledge. (It is perfectly possible for a company to waste its entire R&D budget.) However, I am quite sure that realism does better than make-believe — and the notion that growth of GDP is a satisfactory metric of economic growth is make-believe. If you’ve ever been sick, or gone to college, and have a sense of history, you will have noticed the profound stagnation in two unavoidable sectors of our economy (health care and education). Sectors that, measured by GDP, are growing really fast.