What do you feel when you hear the word ‘feminism’? At some point, ‘feminism’ turned into a negative word. It seems that women’s rights have improved a lot compared to the old days, and some people say that protecting women’s rights has now resulted in reverse discrimination against men. But is that really true? My friends have started getting jobs, and women are paid less than men even when both are new employees, and women have fewer opportunities to be promoted. From a company’s perspective, they don’t want to invest in women employees, since women are seen as more likely to leave when they get pregnant. But I still think that’s unfair. Opportunities should go to those who work hard, stay long, and commit to their company, whether they are women or men.
Back to our issue. In Saudi Arabia, there seems to be a movement to improve women’s rights. People who are against women driving cars say that men are supposed to protect and take care of women: driving is dangerous, so by not letting women drive, men are taking care of their women. However, I strongly believe that women should be allowed to decide for themselves. Claiming to protect women while ignoring what they actually want is not really taking care of them.
I personally think feminism has come to be seen as a bad word among some men because they feel threatened, believing that giving women more rights means the privileges they hold will be taken away. And I think the same thing is happening in Saudi Arabia. The cultural idea is that men should protect their women, but the authority men get from that idea may be something they don’t want to give up.