What is Feminism?

Feminism is the belief in, and support of, the social, economic, political, and educational equality of all genders. Feminists concern themselves not only with specific issues of violence against women, but also with broader issues of education, reproductive health and rights, childcare, economic opportunity and pay equity, and the intersecting issues of gender, race, sexually transmitted disease, gender identity, and sexual orientation in today's society.