Why, in America specifically, are Christians the only religious group that really tries to force their religion down your throat?

I say the US specifically because the US is so diverse that you really see a bit of what the whole world has to offer. When it comes to religion, I can't think of anyone who tries as hard to force their religion onto others as Christians do (and ironically, they wonder why they're so disliked). I've never had a Muslim quote scripture from the Qur'an at me, nor have Jews done so with the Torah, despite believing in the same god. So what makes Christians so rude and militant about theirs? I'm curious whether it has to do with things like how the kings and queens of England would kill people for not being a certain religion, so it's just always been the norm for Christians to view themselves as having a right to do what they do.