Are hospitals honest and safe anymore?

I mean... like, recently I heard some dark claims about what goes down at hospitals, where doctors supposedly have a target (set by hospital management) to reach, so they'll prescribe tests for patients even when they're not needed.

The patient is the one who suffers and has to pay the bills for it.

Also, in the field of medicine, if someone finds a cure for cancer (just an example), the BIG PLAYERS who treat cancer and other diseases as a business supposedly won't let that cure be released or known to the world. Instead, they'd kill the person who found it and bury all their research, just so these so-called BIG PLAYERS can keep their business going.

Such things make me so darn scared to even trust a doctor at a hospital.

What do you guys think?