The other day I was on YouTube watching some videos about Germany. I was bored, so I started reading the comment section, where people were saying America sucks because we don’t have universal healthcare. You always hear this complaint, especially from younger people, and you hear it a lot right now because of the elections. I really don’t understand why people make such a big deal about this, because having insurance doesn’t seem difficult. I’ve had insurance all my life: when I was a child, I was on my parents’ insurance, and as an adult, my employer has always provided it. I feel like you have to try really hard to not be insured.