I see that American women always tell boys that girls are better than them, and try to feminize boys as much as they can. If I were a child in America in this era, I would try my best to avoid people at all costs to protect myself, because women are always targeting boys. Am I wrong?
Society (read: women) generally feels disdain for males today. That much is clear.