American Dictionary
Definition of informed consent in American English
informed consent noun [uncountable]
1
permission that you give to doctors to do something to you, after they have explained the risks involved
1.1
permission that you give for something to be done that you know could be dangerous for you
