This Take is an attempt to get to the bottom of whether there is a cultural divide in how people view the human body between the English-speaking world (the UK, Australia, Canada, New Zealand, South Africa and the USA) on the one hand, and Europe, Latin America and so on on the other.
I'm making it specific to this because toplessness, nude beaches etc. are not mainstream and show no signs of ever becoming mainstream, whereas thong bikinis arguably are, although they remain controversial with some people and in some places.
Leaving aside the basic fact that not everybody is comfortable with the idea of showing off, there seems to be a big cultural divide within the Western world on this issue.
On three occasions earlier this year, I was at beaches around the Mediterranean when it was warm enough to swim but out of season. There were very few foreigners there; these beaches had been recommended by locals in the know.
Because my visits were mostly during the working week, the beachgoers were mainly female, and it was pleasing to see the number of women who had opted for thong bikinis.
There was a wide range of ages, from late teens to early 40s, and most, if not all, of these women had children with them.
It struck me as quite different culturally from what you'd expect to see on a beach mainly frequented by English speakers. As I mentioned earlier, virtually all of these women were local.
Do you think this is harmful to their kids, or is it something we should also adopt: a more carefree attitude to the human body, particularly one that has had children? Do you think it would be liberating if our society weren't so judgmental about this kind of thing?
And do you believe this is something only a woman with a certain type of mindset can wear? If so, what is that mindset?