Do women in western nations have more rights than men?

Anonymous
This is not intended to be a debate on the wage gap or other social and financial inequalities between men and women. Instead, I would like to steer the conversation towards our rights as human beings. There is no law that says women cannot receive the same pay as men, but there is a law that requires male conscription or, at minimum, eligibility for the military draft.

Men also have no right to the life (or the continuity of the biological processes that lead to life, depending on where you land in that other debate) of their offspring. Abortion is the sole right of the woman, yet men are then forced to provide child support even if they did not want or consent to having the child (like women, they consent only to sex, not to parenthood).

Women also have the right to genital integrity from birth in (I believe) all western nations. Men, however, are subject to circumcision, particularly in America.

I am not saying that women don't deserve these rights, or that there aren't valid reasons behind them.

I am saying that women have more rights than men.

But do you think women have more rights? Do you think things need to be more equal for men, and if so, how would you do that?