It can really feel like an injustice at times, like a burden; it's probably one of the main things I've always hated about being born male. I've always hated this gender role so damn much. It makes me feel like fighting someone or physically hurting someone out of frustration, anger, and rage that this expectation always, or usually, falls on the guy's shoulders: we always have to be the ones to initiate a relationship. I've always hated how life and society expect guys to be the confident ones. If a guy tells me to "man up" or "grow a pair," it makes me feel like I could even kill him. Of course I won't do that, because I'm aware of the legal consequences involved; it's just that I hate how us guys are expected to be the confident, assertive ones. Unfortunately, I don't see this gender norm ever going away.
Most Helpful Guy
Ya, tell me about it. I've always hated it as well.