To assume American football 🏈 is a male-dominated sport, is that a fact or a stereotype?

I’m not saying women aren’t allowed to play sports. I just don’t think men and women should ever play contact sports together. The reason is, I believe men are more aggressive and stronger than women. I’m not trying to sound sexist; I believe it’s a fact.