Some Christian friends introduced me to Christianity. I kind of liked it and found it interesting, but after studying religion a little more, I discovered that the Bible says women should be submissive to men. I don't like that, and I won't be submissive to any man.
I decided not to become a Christian because of this.
Why does the Bible say this?