For the record, feminism, by definition, is the belief that men and women should have equal rights and opportunities. It is the theory of the political, economic and social equality of the sexes. I started questioning gender-based assumptions a long time ago.
When I was eight, I was confused about being called 'bossy' because I wanted to direct the plays that we would put on for our parents, but the boys were not. When, at 14, I started to be sexualised by certain elements of the media; when, at 15, my girlfriends started dropping out of their beloved sports teams because they didn't want to appear 'muscle-y'; when, at 18, my male friends were unable to express their feelings, I decided that I was a feminist. And this seems uncomplicated to me. But my recent research has shown me that feminism has become an unpopular word.