feminism
The belief that women and men should be treated equally.
Feminism is the belief that women and men should have equal rights, opportunities, and treatment in society. A feminist is someone who supports this idea and works to make it real.
Throughout history, women in many societies faced legal restrictions that men didn't: they couldn't vote, own property, attend universities, or choose many careers. Feminists campaigned to change these laws. In America, the women's suffrage movement fought for decades until women won the right to vote in 1920. Later feminists worked to open more careers to women, from medicine and law to engineering and space exploration.
Feminism recognizes that while laws have changed dramatically, some unfair treatment still exists. For example, if a teacher always calls on boys to answer math questions but only asks girls to help with art projects, that's the kind of assumption feminists challenge. They believe both girls and boys should get equal chances to pursue whatever interests them, whether that's science, sports, writing, or anything else.
The word sometimes causes confusion because people disagree about what it means. Some think it means women should be treated better than men, but that misunderstands the core idea. Feminism means equality: everyone deserves the same opportunities regardless of whether they're male, female, or nonbinary. When a coach gives both the girls' and boys' teams equal practice time and equipment, that coach is applying feminist principles, even if they never use the word.