Growing up, I was always told to keep my hands and nails nice because one day, when I started dating, a guy might look at them as a sign of what type of woman I am. I should also admit that I grew up in a pretty old-fashioned Christian family. My uncle often told me to keep my nails trimmed and my hands well oiled and soft, because guys hate hands with calluses lol. As an adult, I go nuts making sure my hands are nice and my nails are always groomed and polished, and my boyfriend barely ever notices unless he's holding my hand or I say "look at my new manicure" lol. Is this true, guys?