American Dictionary
Definition of fascism in American English
fascism
noun [uncountable]
/ˈfæˌʃɪzəm/
1
a far-right political system in which the government is extremely powerful and controls society and the economy completely, allowing no opposition. Fascism was practiced in Italy and Germany in the 1930s and 1940s.