Is There Still a Need for Feminism in Modern Society?
I have read many blogs on the subject and have attempted to discuss it with my friends and family, but some people have a warped image of feminism, and some believe it is irrelevant.
Yet 'slut shaming' is practically expected, and the word 'girl' is a common derogatory adjective. My friends and I are sexually harassed in the street. When I say I would much rather have a promising career and an inner-city flat than children, I am often met with confused looks; when I express my desire for a PhD, the phrase "that'll take up some prime baby-making years" has come up multiple times. My friends and I are scared to venture out into the streets at night for fear of being attacked, and then, in a world of rape culture, of not being taken seriously or of being insulted in the process of reporting it. Misogyny is everywhere, from ignorant remarks to social expectations.
The website rookiemag.com has some brilliant articles on feminism, written first-hand by young girls and some guys; I have tagged a talk given by a regular writer for the website. The blog whoneedsfeminism.tumblr.com also has points from both genders.
I can understand where the opposing side is coming from: in Western society, many of the restrictions once placed on women have been demolished. I just want to know your thoughts on the subject.