Cold Shower Report (2)

After learning that cold showers can raise mood, I started taking cold showers. The mood improvement was hard to notice but it was easy to notice that I became more comfortable in the cold. My apartment seemed warmer.

To increase the effect, I increased the water flow (by unclogging blocked shower-head holes) and lowered the water temperature (by running the water for several minutes before starting the shower). The water was obviously colder and its effects larger. Now the showers did raise my mood, for maybe an hour. It was curious how they were unpleasant for only a second.

After a week or so of the colder showers, it became clear, alas, that my weight was increasing. I gained about 2 pounds. There was no obvious explanation for this other than the cold showers. I hadn’t changed my diet in a big way. I hadn’t changed my activities. And there is plenty of evidence that skin temperature controls body fat. For example, a study of three types of exercise (stationary bike, walking, and swimming) in women found that the women who biked and walked lost weight but the women who swam did not, in spite of equal fitness improvement. So I have stopped the cold showers.

Self-Experimentation as Legal Gambling

Listening to Freakonomics Radio on lottery-like savings accounts reminded me of a big reason I self-experiment: it resembles buying a lottery ticket. Whenever you collect data, I believe there is a power-law distribution of benefit: a large chance of little benefit, a small chance of large benefit. (Your sophistication and other things affect the slope.) Almost all data confirms what you already knew — small benefit. A very small fraction of data gives you a new idea — large benefit. Because self-experimentation is about oneself, new ideas can have tangible benefits, just as winning the lottery provides a tangible benefit (money).

Basically I hope for outliers — a sudden jump up or down in something I’m tracking, such as arithmetic speed or sleep duration. This may give me a new idea about what controls that measure. Self-experiments are also valuable because something I’m not deliberately measuring may change. When I started watching faces on TV in the morning to see if it would affect my sleep, my mood, which I wasn’t deliberately measuring, suddenly improved.
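
Here is a minimal sketch, in Python, of the kind of outlier check I have in mind. The numbers, window size, and threshold are made up for illustration; a real analysis of a tracked measure would be more careful.

    # Minimal sketch: flag sudden jumps in a self-tracked measure.
    # The data, window size, and threshold below are invented for illustration.
    sleep_hours = [7.5, 7.2, 7.8, 7.4, 7.6, 7.3, 9.6, 7.5, 7.4]

    def flag_outliers(values, window=5, threshold=2.0):
        """Return indices where a value departs sharply from the recent baseline."""
        flagged = []
        for i in range(window, len(values)):
            recent = values[i - window:i]
            mean = sum(recent) / window
            sd = (sum((v - mean) ** 2 for v in recent) / window) ** 0.5
            if sd > 0 and abs(values[i] - mean) > threshold * sd:
                flagged.append(i)
        return flagged

    print(flag_outliers(sleep_hours))  # [6]: the 9.6-hour night stands out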

It really does feel like playing the lottery for free. To not make a measurement I could easily make feels like walking by a perfectly good lottery ticket lying in the street. Loss aversion sets in.

Walking and Learning in Rats

Yesterday I blogged that walking on a treadmill made studying flashcards enjoyable. I also felt my retention was noticeably better than when I studied sitting or standing in one place.

Thanks to Matt Weber I learned of a rat experiment that supports the idea that I was more retentive. Long-term potentiation (LTP) is a long-lasting (hours) change in synapse properties caused by a certain type of electrical stimulation from electrodes. Leung et al. measured the amount of LTP produced by the electrodes when rats were in one of four states: (a) walking, (b) immobilized, (c) slow-wave sleep, (d) rapid-eye-movement sleep. They found clear LTP in all four states, but the LTP was much larger (50% larger?) when the rats were walking during the stimulation. During the other three states the LTP was about the same.

The walking and immobilization conditions must have differed in many ways. Perhaps immobilization was uncomfortable. Perhaps it required more handling. And so on. Comparing just those two states, you might wonder if (a) walking produces changes that cause things to be remembered better or (b) any of the other walking/immobilization differences made things worse (e.g., the shock of handling reduces learning). The fact that immobilization and the two sleep states produced similar results argues against the second sort of explanation.

Walking Creates A Thirst For Dry Knowledge

A few weeks ago I got a treadmill for my Beijing apartment. Two days ago I was walking on it (I try to walk 1 hr/day) while watching Leverage to make the activity more palatable. But Leverage bored me. It was too simple. So I took out some Chinese flashcards (character on one side, English and pinyin on the other) and started studying them. I was astonished how pleasant it was. An hour of walking and studying went by . . . uh, in a flash. In my entire life I have never had such a pleasant hour studying. The next day it happened again! The experience appears infinitely repeatable. I’ve previously mentioned the man who memorized Paradise Lost while walking on a treadmill.

I’ve noticed before that treadmill walking (by itself boring) and Chinese-character learning (by itself boring) become pleasant when combined. So why was I astonished? Because the increase in enjoyment was larger. The whole activity was really pleasant, like drinking water when thirsty. When an hour was up, I could have kept going. I wanted to do it again. When I noticed it earlier, I was using Anki to learn Chinese characters. Now I am using flashcards in blocks of ten (study 10 until learned, get a new set of 10, study them until learned . . . ). The flashcards provide much more sense of accomplishment and completion, which I think makes the activity more pleasant.
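
For what it's worth, here is a minimal sketch of the blocks-of-ten routine. The card fields and the self-grading prompt are placeholders, not a program I actually use; the point is just the loop: cycle a block of ten until a full pass has zero mistakes, then take the next block.

    import random

    # Sketch of the blocks-of-ten routine: study one block until a full pass
    # has zero mistakes, then move on to the next block of ten.
    def quiz(card):
        """Show the character; the student grades his own answer."""
        input(f"{card['character']} -> pinyin and meaning? ")
        print(f"  answer: {card['pinyin']}, {card['meaning']}")
        return input("  correct? (y/n) ").strip().lower() == "y"

    def study_block(block):
        while True:
            random.shuffle(block)
            mistakes = sum(0 if quiz(card) else 1 for card in block)
            if mistakes == 0:
                return

    def study_in_blocks(cards, block_size=10):
        for start in range(0, len(cards), block_size):
            study_block(cards[start:start + block_size])
            print(f"Block {start // block_size + 1} learned.")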

My progress with Chinese characters has been so slow that during the latest attempt (putting them on my wall) I didn’t even try to learn both the pinyin and the meaning at the same time; I had retreated to just trying to learn the meaning. That was hard enough. I have had about 100 character cards on the walls of my apartment for a month but I’ve only learned the meaning of about half of them. No pinyin at all. In contrast, in two one-hour treadmill sessions I’ve gotten through 60 cards . . . including pinyin. For me, learning pinyin is much harder than learning meaning.

It’s like drinking water when you’re thirsty versus when you’re not thirsty. The walking turns a kind of switch that makes it pleasant to learn dry knowledge, just as lack of water creates thirst. Not only did studying dry materials become much more pleasant, I suspect I also became more efficient — more retentive. I was surprised how fast I managed to reach a criterion of zero mistakes.

I had previously studied flashcards while walking around Tsinghua. This did not produce an oh-my-god experience. I can think of three reasons why the effect is now much stronger:

1. Ordinary walking is distracting. You have to watch where you’re going; there are other people, cars, trees, and so on. Distraction reduces learning. If the distractions are boring — and they usually are — the experience becomes less pleasant.

2. Ordinary walking provides more information than treadmill walking (which provides no information at all — you’re staring at a wall). The non-flashcard info reduces the desire to learn what’s on the flashcards.

3. On these Tsinghua walks I had about 100 flashcards, which I cycled through. Using sets of 10, as I said, provides more sense of accomplishment.

I’ve also had about 20 Chinese-speaking lessons while walking around. The walking made the lessons more pleasant, yes, but it wasn’t nearly as enjoyable as the treadmill/flashcard combination. And because lessons with a tutor are intrinsically more enjoyable than studying flashcards, the increase in enjoyment was less dramatic.

As I said earlier, I think there’s an evolutionary reason for this effect: the thirst for knowledge (= novelty) created by walking pushed us to explore and learn about our surroundings. One interesting feature of my discovery about the treadmill and flashcards is that it may take better advantage of this mechanism than ordinary Stone-Age life did — better in the sense that more pleasure per minute can be derived. In the Stone Age, novelty (new dry knowledge) was hard to come by. You could only walk so fast. After a while, it was hard to walk far enough away to be in a new place. Whereas I can easily switch from flashcards I’ve learned to new ones. An example of a supernormal stimulus.

Different Effects of Omega-3 and Omega-6 on Heart Disease

You have probably read hundreds of recommendations to eat more polyunsaturated fatty acids (PUFAs), which in practice means omega-6 and omega-3. If you shop at Whole Foods, you may see Udo’s Blend, a blend of PUFAs that includes both omega-3 and omega-6, as if someone isn’t getting enough omega-6. It is unquestionable that omega-3 is beneficial, but there is plenty of evidence that omega-6 is harmful, starting with the Israeli Paradox. Why are they lumped together?

A just-published paper in the British Journal of Nutrition makes several new points about the relation between PUFAs and heart disease. Its main point is a new look at experiments in which one group was given more PUFAs than another group. Those experiments — there are only about eight — can be divided into two groups: (a) experiments in which the treated group was given both omega-3 and omega-6 and (b) experiments in which the treated group was given only omega-6. The two groups of experiments seem to have different results. In the “both” experiments the treated group seems to benefit; in the “only omega-6” experiments, the treated group seems to be worse off. This suggests that omega-3 and omega-6 have different effects on heart disease. They have been lumped together because experiments have lumped them together (varied both at the same time).

Experiments that try to measure the effect of PUFAs usually say they are replacing saturated fats: more PUFAs, less butter. The paper points out that studies of the effect of PUFAs have at least sometimes confounded reduction in saturated fats with reduction in trans fats. Benefits of the change may be due to the reduction in trans fats, not the reduction in saturated fats.

The paper also makes several good points about the Finnish study, a classic in the fat/heart-disease literature. Supposedly the Finnish study showed that PUFAs (replacing saturated fats) reduce heart disease. It had hundreds of subjects, but they were not randomized individually. The subjects were divided by hospital: everyone in one hospital got one diet, everyone in a second hospital got a different diet. This made confounding easy (i.e., the treatment wasn’t the only difference between the groups). Indeed, the hospitals differed greatly in consumption of a certain dangerous medication and of margarine. (Margarine is high in trans fats.)

Perhaps the first author, Christopher Ramsden, who works at NIH, is responsible for the high quality of this paper.

Thanks to Susan Allport.

The Decline Effect and Kitty Kelley

A few posts ago I commented on Jonah Lehrer’s article about replication difficulties, which Lehrer called the decline effect. I concluded it was an indication of how poorly science (truth-seeking) and profession (making a living) fit together. Scientists are always under pressure to do what’s good for their career rather than find and report the truth.

Journalism is another kind of truth-seeking. It has the same problem. Journalists are always under pressure to do what’s good for their career rather than find and report the truth. In an essay about unauthorized biographies, Kitty Kelley makes this point:

[Michael] Hastings said that reporters like [Lara] Logan do not report negative stories about their subjects in order to assure continued access. No reporter would admit to tilting a story toward favorable coverage to keep entrée, but they do, and that is one of the dirty little secrets of journalism today.

Just as no reporter admits this, I have never heard a scientist admit it, with two exceptions: 1. The inventor of the aquatic ape theory of human evolution (Alister Hardy) said he stopped talking about it to avoid hurting his career. It fell to a non-scientist (Elaine Morgan) to develop it. 2. In that famous graduation speech, Richard Feynman pointed out how the first determination of the charge on an electron used the wrong value for the viscosity of air and later determinations, which did not involve that viscosity, tended to confirm the mistaken value. Unfortunately, Feynman went on to say: “We’ve learned those tricks nowadays, and now we don’t have that kind of a disease.” As if human nature had changed.

I conclude that both science and journalism will work best with systems in which amateurs and professionals both have substantial power. Kelley doesn’t mention that authorized biographers have important truth-seeking advantages over unauthorized ones (e.g., access to old letters).

Dr. Charles Nemeroff “Writes” A Textbook

The stench was too great. I learned from this article that Charles “Disgraced” Nemeroff, once one of the most respected psychiatry professors in America, has moved from Emory University (where he badly deceived university officials) to the University of Miami. The article tells of more Nemeroff dishonesty: He put his name on a textbook he didn’t write. This letter shows how the book was written. The words in the book came from a company named Scientific Therapeutics Information, whose fee was paid by GlaxoSmithKline. Scientific Therapeutics won’t answer questions about what it did. Nemeroff says he and his co-author “conceptualized this book, wrote the original outline and worked on all of the content.” Worked on, huh? Leslie Iversen, an Oxford professor of pharmacology, may have “worked on” the passages he plagiarized (a few words were changed) harder than Nemeroff and his co-author “worked on” their book. The New York Times added a correction to the article worthy of Wittgenstein: “While documents show that SmithKline (now known as GlaxoSmithKline) hired a writing company for the book, they do not indicate that the [writing] company wrote the book.”

In twenty years perhaps Nemeroff will forget that he “wrote” this book, just as the first President Bush forgot about a book he “wrote”.

Thanks to Alex Chernavsky.

The Decline Effect

A new article in The New Yorker by Jonah Lehrer is about declines in the size of experimental or quasi-experimental effects over time. For example, Jonathan Schooler, an experimental psychologist, found that if subjects are asked to describe a face soon after seeing it, their later memory for the face is worse. As Schooler continued to study the effect, it appeared to get weaker. The article also describes examples from drug trials (an anti-psychotic drug appeared to become weaker over 15 years) and ecology (the effect of male symmetry on mating success got weaker over the years).

It’s nice to see an ambitious unconventional article. I blogged a few weeks ago about difficulties replicating the too-many-choices effect. Difficulty of replication and the decline effect are the same thing. I could do what Jared Diamond does in Collapse: give a list of five or six reasons why this happens. (Judging by this paper, the effect, although real, is much weaker than you’d guess from Lehrer’s article.) For example, the initial report has much more flexibility of data analysis than later reports. Flexibility of analysis allows researchers to increase the size of effects.
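
To make the flexibility point concrete, here is a toy simulation — just a sketch of my own, not anything from Lehrer's article or the papers it cites. Even when there is no true effect, reporting the most favorable of several analyses (here modeled as several independent outcome measures) inflates the apparent effect relative to a single pre-specified analysis.

    import random
    import statistics

    # Toy simulation: with no true effect, picking the most favorable of
    # several analyses (here, several independent outcome measures) inflates
    # the apparent effect relative to a single pre-specified analysis.
    random.seed(0)

    def one_study(n=30, analyses=5):
        effects = []
        for _ in range(analyses):
            treated = [random.gauss(0, 1) for _ in range(n)]
            control = [random.gauss(0, 1) for _ in range(n)]
            effects.append(statistics.mean(treated) - statistics.mean(control))
        return effects[0], max(effects)

    first, best = zip(*(one_study() for _ in range(2000)))
    print(f"pre-specified analysis: mean effect {statistics.mean(first):+.3f}")
    print(f"best of five analyses:  mean effect {statistics.mean(best):+.3f}")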

A long list of reasons would miss a larger point (as Diamond does). A larger point is this: Science (search for truth) and profession (making a living) are not a good fit. In a dozen ways, the demands of a scientist’s job get in the way of finding and reporting truth. You need to publish, get a grant, please your colleagues, and so on. Nobody pays you for finding the truth. If that is a goal, it is several goals from the top of the list. Most jobs have customers. If a wheelwright made a bad wheel, it broke. Perhaps he had to replace it or got a bad reputation. There was fast powerful feedback. In science, feedback is long-delayed or absent. Only long after you have been promoted may it become clear anything was wrong with the papers behind your promotion. The main customers for science are other scientists. The pressure to have low standards — and thus appear better to promotion committees and non-scientists — is irresistible. Whereas if Wheelwright Y makes better wheels than Wheelwright X, customers may notice and Wheelwright Y may benefit.

There are things about making science a job that push scientists toward the truth as well, such as more money and time. When science is a job, a lot more research gets done. Fine. But how strong are the forces against finding truth? I was never surprised by the replication difficulties Lehrer writes about. I had heard plenty of examples, knew there were many reasons it happened. But I was stunned by the results of my self-experimentation. I kept finding stuff (e.g., breakfast disturbs sleep, butter improves brain function) that contradicted the official line (breakfast is the most important meal of the day, butter is dangerous). Obviously I had a better tool (self-experimentation) for finding things out. The shock was how many things that had supposedly been found out were wrong. Slowly I realized how much pressure career demands place on scientists. It is no coincidence that the person most responsible for debunking man-made global warming, Stephen McIntyre, is not a professional climatologist (or a professional scientist in any other area). Unlike them, he can say whatever he wants.

Thanks to Peter Couvares.

More: In his blog, failing to see the forest for the trees, Lehrer says we must still believe in climate change (presumably man-made): climate change and evolution by natural selection “have been verified in thousands of different ways by thousands of different scientists working in many different fields.” Charles Darwin, like McIntyre, was an amateur, and therefore could say whatever he wanted.