doctor
A medical professional who treats illnesses and helps people heal.
The word doctor has two main meanings as a noun:
- A medical professional trained to diagnose illnesses and help people get better. When you're sick, you visit a doctor who examines you, figures out what's wrong, and recommends treatment. Doctors spend many years learning about the human body, diseases, and medicines. Some doctors are general practitioners who treat common problems, while others specialize: a cardiologist focuses on the heart, an orthopedist on bones and joints, a pediatrician on children.
- Someone who has earned the highest academic degree in their field, called a doctorate (often a PhD). A doctor of physics might research black holes, while a doctor of history might study ancient civilizations. These doctors aren't medical professionals but experts who have completed original research and contributed new knowledge to their field. Universities employ many professors with doctorates.
As a verb, doctor can also mean to tamper with something dishonestly, like doctoring a photograph to change what it shows, or doctoring evidence to mislead people. This sense always carries a negative connotation of sneaky, deceptive alteration.