If someone hurt you, do you tell them?
If you're sore about something someone did to you, do you feel like you have to tell them in order to have closure? Do you feel like it's more important to tell them how you feel or to work through everything yourself without telling them?
Sorry if this is a repeat question, it wouldn't surprise me if it was. I'm just curious because I am so angry at someone, but I'm hesitating to express my anger toward him even though he's important to me. Telling him won't change everything that's happened, but there are so many things I wish I had said, and now it feels like the floodgates will burst if I can't tell him and come to some resolution about it. I don't think he would respond well either.
What's Your Opinion?