Is the reality that females really are the better looking of the two genders, or is it more complicated than that? Does it have more to do with the fact that we’ve historically lived in patriarchal societies where horny, testosterone-driven men have made sexual objects of women? Perhaps, as a result, even women grow up believing they are the fairer sex, a belief further reinforced by the fact that they can use it to their advantage over men.
But does any of that really mean they are the fairer sex?
How does this pan out in the modern world, where women are becoming more take-charge and aggressive in certain ways, especially as they get older? Will we end up seeing more objectification of the male body as a result?
I think back to ancient cultures like Greece and Rome, where the naked male body was sculpted and painted and bisexuality was acceptable. I wonder: were women considered the fairer sex back then, or were things more even compared to the lopsided balance of today?