
Being from the United States, I view medicine in a certain way: as an essential part of our health and something that deserves constant attention. What I mean is that I believe what the world refers to as Western medicine is the best way to care for our society.

Western medicine, which relies heavily on pharmaceutical drugs and surgery, is hotly debated. While many believe it has definite advantages, others put their faith in the body’s natural healing power and that of the earth.

Voodoo or not?

What is commonly referred to as Eastern medicine involves becoming more in tune with the body and the world and using natural remedies to stave off illness. And while many believe this is the best way to care for the body, others hold that Western medicine is the better option.

In essence, they are two sides of the same coin. Healthcare is an important part of our society and culture, and it differs depending on where in the world you are. One thing we can take away from these two divergent traditions is that both have their merits. Humanity lived without modern medicine for a long time, and the earth was still able to sustain our existence. The only thing we can really know is that both Eastern and Western medicine have their advantages.

