I understand they want more Americans to get the vaccine but is mandating it really necessary?

I am a university student, and I found out today that I will not be allowed on campus unless I get the vaccine. I am not an anti-vaxxer or anti-medicine; I just don't feel comfortable getting it yet. I need more time. I find this unfair, and it infringes on our choices. Some colleges, like Brown, are not offering remote classes, so students who decline the vaccine are effectively denied attendance and must take a semester off. All of this because they don't want the vaccine. Ridiculous. Thoughts?