Definition of western medicine in British English
western medicine noun [uncountable]
1
the type of medical treatment that is most common in North America and Western European countries, based on the use of drugs and surgery to treat symptoms (=signs of illness). In such countries, other types of medical treatment are called alternative medicine or complementary medicine.
