Have you ever had someone tell you they love you and actually believe they do?

Not just romantically: in any relationship, I personally have never had someone love me and actually mean it. The phrase means nothing to me anymore; it's just something hollow and meaningless to be tossed around. In my opinion, words are cheap, so don't tell me you love me, prove it. Actions are stronger than words. How dare you tell me you love me while you actively hurt me, as though those words alone should fix the damage you caused.
