If women aren't taught how to invest in a man, why should men invest in women?

There was a time when, growing up as a young woman, you were taught how to take care of a home and cater to your man, and the man was taught that it was his duty to protect and provide for his family.

Nowadays, modern women work and want to take on the same role as a man, but they lack the skills to make a house a home. So isn't it inevitable that we just have a hookup culture and marriage on the decline?