Vitamin D deficiency leads to all sorts of illnesses. In modern society, when many of us white people work in an office out of the sun, it becomes necessary for us to get out and get some sunshine just to stay healthy. While I live close to the beach, I don't go there to sunbathe. I prefer to go run on the beach or work in my garden. Being out in nature and being active relaxes me and puts a smile on my face. Getting some sun makes me feel a lot better, and having a base tan helps protect me from getting a sunburn the next time I'm out in the sun.
Most Helpful Opinions
Yeah, it makes it look like you're rich and have the time to go to the beach.
Yes, it is the fashion for many people. It all started with Coco Chanel in the early 20th century when she accidentally came back from a trip with a tan. Before then it was unthinkable for a high class European to acquire a tan, as it would imply they were toiling in the fields like the commoners. Now the media often portrays it as the beauty ideal.
Some people think it makes them look healthier, while others tan because it accentuates their physical features. It's also an easy way to show they've been on holiday. And it's not just white people who tan; you'd be surprised at how many people from other ethnic groups like to tan too.
I'm a redhead, and I go to the beach with friends a lot. They always lie down to tan, so I just don't want to be left out. I put a ton of sunblock on and pray that I don't get a sunburn.
Sometimes the warmth is nice ;)
Peace
Margaret
Tanned white skin has become a beauty ideal, that's why.
What Girls & Guys Said
They like the way it looks, and they don't get sunburned as easily afterwards.
Funny isn't it? White people want to be dark and minorities want to be light. Crazy world we live in.
They want their skin to be more tanned.
A bit of tan looks healthier