Why do people who went to college often use that as a way to make others believe they know more than they do?

Like if you're debating with someone who went to school for something specific, like psychology, and you're talking about depression. You're saying depression doesn't have to be permanent, and the person who studied psychology is saying it is permanent because that's what their books and studies in school told them, which, to them, makes it a fact. They're basically disregarding real-life evidence of people who have overcome depression.
