
What Feminists Really Want for Men

Amber Fraley
3 min read · Feb 12, 2022
Image by Claudio_Scott from Pixabay

It’s a shame some people want to twist what feminists actually want into some ridiculous, ball-snipping dystopian nightmare. We get called feminazis and worse, which is hilarious, since the vast majority of us like men and live with men, many of us have male children, and we genuinely want the men in our lives to enjoy the benefits of true equality.

Are there a few hardcore feminists out there who’d like to see men on all fours wearing dog collars? Sure. But they are few and far between, and as far as we know, there has never been a female-dominated society on the face of the planet. (I’ve never understood this irrational fear. Men own most of the guns in the world and commit most of the violent crimes. The thought of women going ham and somehow ridding the world of men is truly a paranoid delusion.)

While woman-haters, male and female, insist feminists want world domination or male subjugation or some other ludicrous notion, the truth is far more boring:

Feminists want men to have paid time off to spend with their newborns.

We want you to have more time to spend with your family.

We want you to be able to work for the most qualified boss instead of having to kiss the ass of some incompetent white guy who got where he is not because he’s qualified, but because he


Written by Amber Fraley

Writing about abortion rights, mental illness, trauma, narcissistic abuse & survival, politics. Journalist, novelist, wife, mom, Kansan, repro rights activist.
