I am Christian, and I was never super religious, but I really am trying to get closer to God, so I started reading the Bible. Ironically, it just made me mad! The way women are seen as objects and as less important than men really got to me. I understand that this was a long time ago, way before things started becoming more equal, but it all made me so mad that I stopped reading. Does God really expect us to act like men are better? Because that is just not happening with me. There is no way I'd ever cater to a man or let him tell me what to do or how to act. Is that a sin, though?
Most Helpful Guy
Well, the Bible was written a long time ago, and religion comes from a long time ago. We have to take the historical context into consideration: when those books were written, that was how society worked, so it doesn't surprise me at all that the Bible says these things. However, times have changed, and today this is unacceptable, so in my opinion you should "adapt" your religion to the current historical context.