Dentists are doctors who specialize in oral health — the health of the mouth, teeth, and gums. They diagnose and treat problems within the mouth, which can include removing decay, filling cavities, and repairing damaged teeth. Dentists also advise patients on how to better care for their teeth and mouths.