What's something your parents never taught you that you wish they had?


Most Helpful Guy

  • They never told me that women, especially here in the US, generally treat sex as something they can use to get what they want from guys, or to get promotions at work, etc.

    Very few parents, I think, tell their kids the truth about this. Some mothers will, but only in the context of TEACHING their daughters how best to do this.


What Guys Said 6

  • That just because you get a girl pregnant doesn't mean you have to marry her

  • Playing guitar. I had to teach myself! D:

  • Financial management

  • That females just want to hear their own opinion, just in a deeper voice 😂😝
    Sorry girls, I know it's not a funny joke

  • To care less about what other people think of you.

  • How to handle a relationship


What Girls Said 3

  • She didn't teach me enough about racism. It was never mentioned in our household, and there was no overt racism where I grew up. Then a few years ago, I moved to a predominantly white area and was spat at and faced verbal and physical abuse within the first few days lol. My best friend, who is white, had moved with me, and neither of us really understood what was going on. I had to learn through terrible experiences over the next few years, and I definitely feel like my mother should've sat me down and prepared me for it. But I know now.

    • Wow that's really awful. Did someone really spit at you because of your skin color? What a piece of crap...

    • No, not someONE, but many people. Well, in every instance they have been men, but women can be equally nasty in various other ways.

  • Trust your gut feeling. I always ignored it because my parents always taught me to think logically.

  • We never took vacation, so now I don't get everyone's obsession with it.
