I've noticed that people in general (but mostly women - maybe because I usually notice women's clothing more) wear brighter colors during the summer, while winter is completely colorless, clothing- and makeup-wise.
Is this because of the fashion industry - putting out more colors during the summer months, while sticking to blacks and navies during the winter?
Or is it the mood you're in during that time?
Most Helpful Girl
I think a lot of it is fashion-dictated, where the norm is dark, muted colors that go with the fall palette of orange, red, and gold leaves and the red/green combo of Christmas.
But then, I have also noticed that people who live in Mediterranean or tropical climates tend to wear bright colors. Maybe it's because they have a year-round display of color from the flowers, the sea, etc., and this somehow influences them.
Maybe, along with the conscious fashion dictates, there is a subconscious factor where we respond to the colors in our environment.