American Dictionary


Definition of reiki in American English

reiki

noun [uncountable]
/ˈreɪki/
1

a medical treatment from Japan in which someone puts their hands on you so that energy can pass through them to you and make a part of your body healthy
