What Does HIAA Mean?

What does HIAA mean? Read on to discover the definition and meaning of the term HIAA, to help you better understand the language used in insurance policies.

HIAA


The Health Insurance Association of America was a trade association of insurance companies that sold health insurance. It lobbied for reforms in the industry. In 2004 it merged with the American Association of Health Plans to form America's Health Insurance Plans.

We hope you now have a better understanding of the meaning of HIAA.


