If someone hurt you, do you tell them?
If you're sore about something someone did to you, do you feel like you have to tell them in order to have closure? Do you feel it's more important to tell them how you feel, or to work through everything yourself without telling them? Sorry if this is a repeat question; it wouldn't surprise me if it was. I'm just curious because I am so angry at someone, but I'm hesitating to express my anger toward him even though he's important to me. It won't change everything that's happened, but there are so many things I wish I had said, and now it feels like it's going to open the floodgates if I can't tell him and come to some resolution about it. I don't think he would have a good response either.
What Girls Said 1
I always want to, but I don't. I want to come across as the strong woman that I am, so telling someone they hurt me is really hard for me. That said, telling someone they hurt you should never be used in a manipulative way, such as trying to make the person feel bad. There should never be anger. Try to wait to express your hurt feelings until you have some distance. It's much more effective.