What does FDIC mean? Read on to discover the definition and meaning of the term FDIC, to help you better understand the language used in insurance policies.
The Federal Deposit Insurance Corporation (FDIC) is a United States government corporation created in 1933. Its main mission is to maintain the public's confidence in the country's financial system, and its primary role is to insure deposits and protect depositors against bank failure.
We hope you now have a better understanding of the meaning of FDIC.