I'm assuming by Western Medicine you mean proven, scientifically valid, peer-reviewed, actual medicine?
What some people like to call "Western medicine" is really just "scientific medicine." Everything else is either untested, or has failed the tests. Evidence-based medicine is the only way to go.
If there were something in "holistic" medicine that was demonstrably effective, it would become part of "Western" medicine and I would go with that.
Holistic medicine is a scam that relies on the placebo effect. No scientific studies have proven it works. The "trials" (which are not even required in many places) are not performed carefully and are biased. The most shocking fact is that the US enforces no requirements, and the FDA has no power to intervene in the holistic drug market unless there is proof of a danger to health. In Canada (where I live), all you have to do to receive approval is find anything that states your product works. According to my pharmacology prof, this could include an article in National Geographic stating that a tribe in Papua uses a particular leaf to treat stomach aches.