Having said that, I'm not the only guy who wonders why a lot of girls prefer wearing bikinis. Most women I know wouldn't want a man to see them in their underwear unless it's someone they've chosen to be intimate with. Yet, many women seem perfectly comfortable (if not happy) showing their skin in public at the beach or pool.
Not to mention the fact that some bikini styles are more revealing than others. I've seen girls at the beach wearing what is essentially a tiny bra and a G-string. It doesn't seem remotely practical, and it shows quite a bit more skin than many kinds of women's underwear.
So I mean this as an honest question. Ladies, why do you wear bikinis (if you do)? How do they make you feel?