contained in the whole foods we’re not eating.^2 As a result,
90 percent of Americans now fall short in obtaining
adequate amounts of at least one vitamin or mineral.^3
To complicate matters, nutrient intake guidelines are set
only to avert deficiencies at the population level. This means that even
when we check all the institutionally recommended boxes,
we may still be handicapping our bodies in serious ways.
The recommended dietary allowance (RDA) of vitamin D, for
example, is meant only to prevent rickets. But vitamin D
(generated when our skin is exposed to the sun’s UVB rays)
is a steroid hormone that affects the functioning of nearly
one thousand genes in the body, many involved in
inflammation, aging, and cognitive function. In fact, a
recent University of Edinburgh analysis found low vitamin
D to be a top driver of dementia incidence among
environmental risk factors.^4 (Some researchers have argued
that, for optimal health, the RDA for vitamin D should be
at least ten times higher than it currently is.)^5
When our bodies sense low nutrient availability, what’s
available will generally be used in processes that ensure our
short-term survival, while long-term health takes a back
seat. That’s the theory initially proposed by noted aging
researcher Bruce Ames. Dubbed the “triage theory” of
aging, it’s sort of like how a government might ration
scarce resources during wartime: immediate needs such as
food and shelter take priority, while long-term investments
like public education become casualties.
In the case of our bodies, loftier repair projects can become
an afterthought to basic survival processes, all while pro-