spending as a percentage of GDP must be kept at or below a given
percentage—regardless of the funds needed ... This is the idea that is
unsustainable.”^39
Skeptics can rightly criticize the poor value Americans receive for their
healthcare dollars compared with most other Western countries (so admits
this surgeon-author). We pay too much for drugs, implants, and
procedures, but in this new era of cost-consciousness and outcomes
tracking, Americans will see an improvement in “getting what we pay for.”
However, there is simply no other place in the world where most
economists, actuarial scientists, policy-makers, and physicians
themselves would rather be cared for when suffering from a heart attack,
cancer, or trauma; still, improved cost-control initiatives will need to be
nurtured.
Understanding the genesis of the FDA and Medicare is essential to
understanding the “perfect storm” behind the explosion of implants. Improved
materials science, the discovery of antibiotics, the supervision of
implants by the FDA, the government-facilitated launch of thousands of
new hospitals following World War II, the invention of health insurance,
and the formation of Medicare all coalesced within a few decades. Patients
needed health insurance to pay for the expensive new operations;
hospitals, physicians, and implant manufacturers needed a reliable flow of
insured patients. In 1965, who could have guessed, in their wildest dreams,
what was about to happen? Of course, Medicare costs have always
exceeded budget projections, but Wilbur Mills and his colleagues
cannot be blamed for not reading the tea leaves when the three-layer
cake was made. Revolutions are tricky things to predict.