Sex and gender are NOT the same thing: sex refers to biological characteristics, while gender refers to the socially constructed roles, behaviors, and identities attached to them.
The world is organized as a patriarchy, and as a result women are disadvantaged in many areas of social, economic, and political life.
We learn to be masculine or feminine through gender socialization.
A variety of theories help us better understand gender and gender roles.
Feminism is a social movement to eliminate patriarchy and usher in gender equality.
Any questions?