If everyone took 2,000 units of vitamin D a day, it could shift the curve from average blood levels in the mid-50s to about 110, which some estimate could add years to our life expectancy. Data derived from randomized clinical trials have convinced some influential experts, such as Harvard's Chair of Nutrition, that we should shoot for this kind of range: levels that about 9 out of 10 people fail to reach, because reaching them may necessitate taking between 1,800 and 4,000 units a day. The Institute of Medicine, however, considered blood levels of 50 to be sufficient, and therefore only recommended 600 to 800 units a day for those with little or no sun exposure, because they were only considering bone health.

But even if you just cared about your bones and not your lifespan, you'd still probably want to shoot for the 75 threshold, because there's evidence from hundreds of autopsies of people who died in car accidents showing osteomalacia, a softening of the bones, in 18 to 39% of those who reached the Institute of Medicine target but failed to make it to 75. There's even been a charge that the Institute of Medicine simply made a mistake in their calculations, and that using their own criteria, they should be recommending thousands of units a day as well.

But the mere absence of soft bones can hardly be considered an adequate definition, either of health or of vitamin D sufficiency. It's like saying you only need 10 milligrams of vitamin C to avoid scurvy. Yeah, but we need way more than that for optimal health. The Institute of Medicine took the position that the burden of proof fell on anyone who claimed benefits for intakes higher than their minimal recommendations, which is a good approach for drugs, for unnatural substances: less is more until proven otherwise. But for nutrients, shouldn't the starting point at least be the natural levels to which our bodies have become finely tuned over millions of years?
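To keep the three competing cutoffs straight, here's a minimal sketch of the comparison. It assumes the blood levels quoted above are in nmol/L and the daily doses in IU, as is typical in this literature; the helper function and its bucket labels are hypothetical, just restating the thresholds named in the text:

```python
# Thresholds discussed above (assumed units: nmol/L for blood levels)
IOM_SUFFICIENT = 50    # Institute of Medicine "sufficient" level (bone health only)
HIGHER_TARGET = 75     # threshold below which autopsies found osteomalacia in 18-39%
NATURAL_LEVEL = 100    # approximate level measured in traditional equatorial populations

def classify_level(nmol_per_liter):
    """Hypothetical helper: bucket a vitamin D blood level
    against the three thresholds discussed in the text."""
    if nmol_per_liter < IOM_SUFFICIENT:
        return "below even the Institute of Medicine target"
    elif nmol_per_liter < HIGHER_TARGET:
        return "meets the IOM target, but below the 75 threshold"
    elif nmol_per_liter < NATURAL_LEVEL:
        return "meets the 75 threshold"
    return "in the range seen in traditional populations"

# The current average, mid-50s, clears only the lowest bar
print(classify_level(55))
```

The point the sketch makes concrete: today's average level passes the bone-health-only standard while missing both of the higher bars discussed.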
The target level of 75 only sounds high compared to average levels today. But in modern times, we practice unnatural activities, like working at a desk job, or sometimes even wearing clothes. We evolved running around naked in equatorial Africa, getting sun all day long. If you measure vitamin D levels in those living traditional lives in the cradle of humanity, a normal vitamin D level would be over 100. So maybe that should be the starting point until proven otherwise, a concept regrettably many guidelines committees seem to have ignored.

Now look, the natural level isn't necessarily the optimal level. Maybe the body would have thrived with less. So you still have to look at what levels correspond to the lowest disease rates. And when you do, the highest levels do indeed seem to correlate with less disease.

You know what always struck me when I was doing pediatrics? That breast-fed babies required vitamin D drops. I mean, shouldn't human breast milk be a perfect food? Of course, for the medical profession, the solution is simple: provide the baby supplements, the drops. But it seems like we shouldn't have to. Right? It should be perfect. But look, you measure human breast milk these days, and it has virtually no vitamin D, and would cause rickets, unless the mom had vitamin D levels up around, you guessed it, the natural level for our species, which of course makes total sense. So it's just an environmental mismatch with the way we live in our modern world.

It helps to think of vitamin D as what it truly is: a hormone, not a vitamin. If you think of it like that, then it would be reasonable to want normal levels. We physicians try to maintain blood pressure and all sorts of parameters within normal limits, so why is so little attention paid to the status of the hormone vitamin D?