The History of Human Chromosome Number Reveals Constraints on Professional Scientists

Why does personal science matter? One reason, as I’ve said many times, is that personal scientists (who do science to help themselves) are free to speak the truth. Sometimes professional scientists (for whom science is a job) are not.

The history of human chromosome number is a good example. Starting in the 1920s, humans were said to have 48 chromosomes. In fact, the correct number is 46. From the soon-to-be-published book The Truth in Small Doses by Clifton Leaf (copy sent me by publisher), which is about cancer research, I learned that in 1955 two Swedish scientists, Tjio and Levan, established the correct number. After their article appeared,

Several researchers wrote [them] to confess that they, too, had spied only forty-six chromosomes but had thrown out the results because they were in conflict with established knowledge.

“In conflict with established knowledge” was a euphemism for “we were worried what would happen to us.”

The Truth in Small Doses begins with this story. Leaf’s point is that cancer researchers have a similar problem: They too cannot tell the truth, which is that progress against cancer has been poor, in spite of billions of dollars spent on research.

6 thoughts on “The History of Human Chromosome Number Reveals Constraints on Professional Scientists”

  1. “progress against cancer has been poor”: with the exception, I understand, of useful progress on some childhood cancers.

    Seth: That’s right. Because those cancers are rare, it doesn’t do much to change the overall record. The other exception is lung cancer. Lung cancer has gone down a lot since it was figured out that it is caused by smoking. Leaf doesn’t mention this.

  2. From Richard Feynman’s graduation speech given at Caltech in 1974 (as quoted on this Wikipedia page):

    We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It’s a little bit off because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.

    Why didn’t they discover the new number was higher right away? It’s a thing that scientists are ashamed of – this history – because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong – and they would look for and find a reason why something might be wrong. When they got a number close to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that.

    Seth: Yes, another good example.

  3. I don’t think there is necessarily always a conspiracy in favor of the status quo. People just tend to cling to whatever they have learned about first. It is expensive to change your mind.

    There is no doubt that “we were worried what would happen to us” is sometimes the right explanation, but it is not always the case. The phenomenon you describe happens even when nobody has a strong incentive to preserve the status quo.

  4. To expand on what Daniel Lemire wrote – Thorstein Veblen coined the term “trained incapacity” to refer to people’s inability to see solutions to a problem when those solutions lie outside their own training or educational background.

    A story, probably apocryphal, was that Veblen bolted a pipe to a table and then dropped a small ball, like a ping-pong ball, into the pipe. On a nearby table Veblen had laid out some tools, as well as normal amenities – water, light snacks, etc. Veblen then invited some engineers to get the ball out of the pipe. The provided hand tools were (deliberately) inappropriate for the task, so the engineers failed.
    Veblen then asked a farmer to get the ball out of the pipe. The farmer promptly took the water pitcher, poured water into the pipe and the ball floated to the top.

    I have no doubt that some scientists are afraid to speak the truth for fear of damaging their reputations, but I also have no doubt that some scientists (and politicians, educators, citizens, business people, etc.) simply cannot see or accept existing reality because of “trained incapacity.”

    All the more reason to do personal science and “crazy” experiments.

    A good, short explanation of the phenomenon by the late, great Herman Kahn can be found here:

    https://www.hudson.org/index.cfm?fuseaction=publication_details&id=2219

    Seth: I agree. That’s another reason to support personal science: To open up the investigation of X (e.g., cancer) to people who use tools other than the tools of professional scientists. The story sounds too practical for Veblen.

  5. The Millikan story is a bit clouded, though, by the fact that he fiddled his numbers anyway (or so I understand).
