This is another post I originally intended to be just a question, but the details ran too long, so once again I decided to put it in a Take instead.
I'm not "hating" on white women; it's just ironic to me. When you see the marches, all the books and articles written by feminists, and the women on news programs, the vast majority of them are white. Even the majority of white women journalists have feminist allegiances.
It's simply ironic to me because white women have been the least oppressed of any female demographic. They have actually been given a lot of favor above other races of women, and even above some men. White women were voting in peace long before blacks, even though blacks were technically given the right to vote first - something white feminists actually resented and scorned blacks for. White women have held more seats in Congress over the years than either black men or black women. White women have also advanced to company president and CEO positions sooner and more often than blacks or other races. White women had an easier time getting into colleges than blacks and other races. And white women are the least convicted demographic when it comes to crime, as many are also let off the hook more often.
Yes, we know there are some black feminists. Yes, we do see feminism in other nations, and an increase in it in countries like India and in the Middle East, although it seems mostly prevalent in younger circles, especially at universities. Yet overall, women in other cultures - including black women - who have faced a lot more difficulty than white women, have not felt that feminism is all that necessary. They definitely think women should be treated better and given more regard, but still don't think feminism should be an actual thing.
Years ago, in the early 2000s, when I was a young guy writing my own "gender book" and studying both feminism and the 'men's rights movement,' I found out a lot of baloney about feminism. Make no mistake, I'm no fan of the MRA either - which I wrote about here some time back: How Men's Rights and the Men's Rights Movement Are A Lie - but I learned a lot of bullshit about white feminism in particular. So it always baffles me that the group that has faced the least oppression and difficulty feels the need to cry about oppression the most. Why is this? Is it still tied to some form of white privilege? Or is it simply a mindset of princess entitlement, period? It also seems that the issue of casual sex is a big deal to them. Is it because white feminists know they've already achieved a lot and don't really have anything else to complain about? Or is it because they simply have a selfish desire for the right to have casual sex?
So I do ask, as the title says: why does feminism seem to matter most to white women?