Am I right in thinking that if we men don't give women rights, they won't be able to do anything?

Now, I don't think we should take anyone's rights away, and we shouldn't send our society back to the Middle Ages. But whatever rights women have, they have because good men have 'granted' those rights to them. If all men became bad and treated women badly, women wouldn't be able to do a thing.

Most powerful intelligence agencies are male-dominated.
