Originally posted by: Bechain_Bulbul
Yes, feminism by definition, on paper, is advocacy for the equality of the sexes, but in reality, what we see in society nowadays is radical feminism. Its adherents believe that since women were once dominated by men, women now have the right to dominate and oppress men in the same way.
For example, in Bollywood, Hollywood, OTT, and TV, the majority of content made in the name of women's empowerment, or of women-centric shows, depicts modern women as drinking, smoking, cursing, committing crimes, and f**king multiple men, just to portray them as being as cool as, or equivalent to, men. This kind of behaviour is good for neither men nor women. Nobody talks about actually empowering women by showing them as career women: independent, strong-willed, and straightforward.
Feminism, radical or otherwise, on paper or in practice, is about the EQUALITY of the genders and not the superiority of one gender over another. People who think otherwise don't understand the term and should not be called feminists. It has become common practice these days to make feminism out to be something negative, something it is not.
Coming to movies and shows, there are many that ACTUALLY depict what true women's empowerment is. And even in the kind of movies you mentioned, it is the women's CHOICE to do such things, just as it is men's choice, even if those choices have nothing to do with empowerment.
The thing is, feminism is not about the superiority of one gender over the other. Feminists don't aim to suppress or dominate men in any way; all they want is equal rights for women. If people can't understand this very simple point, that is their problem. Feminism stands against patriarchy, NOT men. So let's not spread misconceptions that feminists are against men or are trying to dominate them.