What does Health Certificate mean? Read on to discover the definition and meaning of the term Health Certificate, to help you better understand the language used in insurance policies.
A health certificate is an official document that describes the health status of an individual. To be valid, it must be signed by a qualified health professional. In the context of insurance, health certificates are used in both life insurance and health insurance, typically as evidence of an applicant's state of health when a policy is taken out.
We hope you now have a better understanding of the meaning of Health Certificate.