I understood what the goal of it was during the 1900s, but I don't understand it now. I don't see why feminism is needed in America and many other powerhouse countries. Maybe it's needed in countries that are poor, very religious, or something like that. What do women want now other than power, it seems? What could a woman possibly want in America that she couldn't gain from working?