American Dictionary
Definition of the FTC in American English
the FTC
/ˌef ti ˈsi/
1
the Federal Trade Commission: a U.S. government agency that makes companies do business in a fair way