Considering that differences among sexes and genders are one of the most fundamental aspects of biology on earth, something nearly every species experiences, it seems crucial that we wrest control of all things gender from the feminist movement. What is the likelihood that this one ideology, which is so politically driven, really holds the be-all and end-all on everything related to gender, or that 'the patriarchy' invented gender roles in order to oppress women? The feminist monopoly must end.