While perusing the health news section of a popular magazine recently, I found myself cringing at the headlines. Insurance, radiation, prescription drugs, and myriad gloom-and-doom medical findings: the topics sent shivers down my spine! How, I wondered, is any of this “health” news? Even as our collective consciousness blooms with enthusiasm for more nutritious food, safer homes, and healthier habits, it seems that an alarming number of people are still outsourcing the care of their bodies to the medical industry.
In large part, it’s a matter of convenience. Instead of investing time and effort in restoring one’s natural health, it’s easier to ask a doctor to prescribe one (or more) of the many pills touted as quick cures. I think a significant fear factor is also involved. Over the past century or so, modern medicine has managed to convince us that physicians hold the key to wellness, and that if the average Jane wants access to her own health, she must come crawling into the doctor’s office with her pocketbook open for business.
This isn’t to say that medicine hasn’t played a vitally important role in fostering health in our society. Indeed, it has its place and has helped countless people live healthier, happier lives. But the shame of it is that, like so many other commercial ventures, medicine has become big business, to the point that people have become addicted, both virtually and literally, to the system and its often toxic byproducts. We’re hearing about more lawsuits filed over drugs gone awry. We understand that overuse of antibiotics breeds dangerous bacterial resistance. And, heaven knows, we’ve read the volumes of side effects and warnings that accompany medications. Yet it’s almost as if our expanding education engenders more fear, and less confidence, about our course of action.