Dr. Andrew Weil draws upon thirty years of medical experience to speak to us about our health care, offering a clear picture of what went wrong with the American way of health to create the crisis in which our country is embroiled, as well as a passionate vision of how we can make it right.

Dr. Weil goes to the root of the problem, showing how medical schools fail to give future doctors the education they need, how insurance companies have destroyed our opportunity to get excellent care, and how pharmaceutical companies have come to rule our lives. The solution involves nothing less than the creation of a completely new culture of health and medicine in this country, one that we can each start building today.