What does HIAA mean? Read on to discover the definition & meaning of the term HIAA - to help you better understand the language used in insurance policies.
The Health Insurance Association of America (HIAA) was a trade association of insurance companies that sold health insurance, and it lobbied for reforms on behalf of the industry. In 2003 it merged with the American Association of Health Plans to form America's Health Insurance Plans (AHIP).
We hope you now have a better understanding of the meaning of HIAA.