American Dictionary


Definition of informed consent in American English

informed consent noun [uncountable]
1

permission that you give to doctors to do something to you, after they have explained the risks involved

1.1

permission that you give for something to be done that you know could be dangerous for you
