posted March 04, 2015 06:05 AM
Almost every single woman I have ever come across in Western societies is a feminist. What bugs me is the misuse of the term. Feminists don't want to be men.
From wiki:
"Feminist movements have and continue to campaign for many women's rights, including the right to vote, to hold public office, to work, to fair wages or equal pay, to own property, to education, to enter contracts, to have equal rights within marriage, and to have maternity leave. Feminists have also worked to promote bodily autonomy and integrity, and to protect women and girls from rape, sexual harassment, and domestic violence."
Every Western woman I know believes that women have the right to vote, the right to go to school and study what they want, the right to own property, the right to inherit, the right to enter into contracts, the right to have bank accounts, and the right not to be limited in their life choices based on their gender.
They also believe that they have the right to determine their own lives and not have them dictated by a male family member.
The only women I've encountered who do not believe those things are some very fundamentalist religious women, and even then...
Feminism is a political view and has nothing to do with whether you want the man of the house to fix things while the woman cooks and cleans, or whether you want to be a sex slave, or vice versa.
Again, feminists do not think that women and men are exactly the same. They think that women should have the same RIGHTS as men.
Are there feminists who think that gender roles as a whole are wrong and that there shouldn't be ANY differences between men and women? Sure. Note the phrase "feminist movements": there are extreme examples. Every theory spans a range of degrees, and there are certainly "fringe" and extreme feminists.
But again, almost EVERY Western woman I've ever met believes in the principles above.