Why do people insist on saying women working is only a modern idea (with feminists at fault) when historically women have worked in one way or another?

Historically, women have always worked in one way or another.

Farmers' wives cured hides, baked pastries and sold produce at market.
Sailors' wives made & sold necklaces and other items from whatever their husbands brought back.
Miners' wives made & sold curtains, bedsheets and many other items.
And there were many other roles.

The above has been going on for centuries. Herbalists & wise women - who compensated for the lack of readily available doctors - have been around for a very long time.

During World War I & World War II, countless women took on factory work and other "man" jobs to compensate for the shortage of men - in the USA, Canada and the UK, for example.

These are all historical facts.

This is taught, to varying degrees, throughout high school & college - so, very bluntly put, people who don't know it didn't complete even basic schooling, in America at least.

So many blame feminism for a "change" in women working because they lack basic education & simple intelligence. What feminism changed was equal opportunity to apply for education & jobs... nothing else... feminism didn't magically mean women could now work.

Feminism, after all, was started by "gold diggers" who were bored of their do-little lives married to wealthy or well-to-do men.

There is not one poor woman in the original feminist ranks. Google it, educate yourself.

So to such women, working was of course a novelty, whereas their poorer neighbors' wives, sisters & daughters had always been working.
