Why do people have so much faith in vitamins and supplements?

Vitamin supplements have rarely been shown in studies to do much for healthy people, so why do people take them? Even more puzzling, why do people buy herbal supplements from stores like GNC when those products don't have to be tested or approved by the FDA before they're sold, since they're classified as supplements rather than drugs? I feel like people have blind hope in this stuff and buy into whatever companies and the media say.