American Dictionary
Definition of feminism in American English
feminism
noun [uncountable]
/ˈfemɪˌnɪzəm/
1
the belief that women should have the same rights and opportunities as men