During the summer, I swear I look better... Not just because of the revealing clothes; I think my skin itself looks better somehow. During the fall and winter, it seems like I look worse.
Does anybody else experience the same thing? Do colder seasons change your physical appearance?
Most Helpful Girl
Perhaps in the autumn and winter, you are wearing the wrong colors...