American Dictionary

Define dictionary terms easily in our powerful online English dictionary.


Definition of dermatology in American English

dermatology
noun [uncountable]
/ˌdɜrməˈtɑlədʒi/
1. the scientific study of skin diseases and the treatment of people who have them
