Jobs feel pointless now?

I’ve really been feeling this way since Covid. Lots of people got laid off, a lot of people were forced to comply with vaccine mandates to keep their jobs, and it’s like all we really do at our jobs, when it comes down to it, is work to fulfill some kind of government agenda.

In my industry, all that intervention has done is drive companies to the breaking point. I’m not getting paid enough once you factor in inflation… It’s like maybe things would be better without heavy government involvement, but it has to have its hands in everything, and it’s really making things difficult.

It’s like we can’t JUST have it good… there always has to be a wrench thrown in at some point to screw things up. I think that’s one thing these last few years have really cemented in people’s brains: we aren’t the free market we’ve been told we are. I’m sick of it… like, make things make sense. 🤷‍♂️

EDIT: and now, right before I hit submit, I see it says ‘Education and Career’… of course, all the higher-ups at work want you to do is go back to school, take more classes, get a better degree… and it’s like, NO. All this company does is get itself into trouble by bending to the government and lying to make itself look good, and you expect ME to be dumb enough not to see through it and think that’s okay? Nah bro, I’m good… maybe run your company better first and then I’d take your advice.

You CAN’T trust these people, man!
