Dental insurance, like health insurance, is coverage that protects individuals against the cost of dental care. It usually goes hand in hand with health insurance: most people in the United States who have dental coverage receive it through an employer-sponsored health benefits plan. Individuals and families can also buy dental insurance directly from insurers or through resellers, although these standalone plans tend to be too expensive for most people.