health care
noun
efforts made to maintain or restore physical, mental, or emotional well-being especially by trained and licensed professionals —usually hyphenated when used attributively
primary care
noun
health care provided by a medical professional (such as a general practitioner, pediatrician, or nurse) with whom a patient has initial contact and by whom the patient may be referred to a specialist —often used before another noun —called also primary health care