The Affordable Care Act requires health insurance companies to provide certain levels of coverage with every plan. When it was first implemented, it also required most Americans to carry health insurance, much like drivers are required to carry car insurance. However, as of 2019, there is no longer a federal penalty for going without health insurance. Some states still have their own coverage mandates, so it’s important to check the current requirements where you live.
The Affordable Care Act also established health insurance marketplaces where people can review and compare plans to find the one that is right for them.