What would you think if you saw someone wearing a thong bikini at the beach? Would you think it was trashy, or would you not even care? I think it's pretty common in other parts of the world for women to wear thong bikinis, so why not in the U.S.? Is it just too revealing for most people?
Most Helpful Guy
I think America just has a warped view of sexuality, and as a result, something like this is considered too revealing.
It comes across best in TV and movies, where it's perfectly fine to watch someone get their head caved in with a baseball bat or shot in the face, yet you can't see penises or vaginas without disclaimers and/or R ratings.