Are men taken more seriously in the workplace and when they speak or is it just all in my head?
As someone who grew up female, I was told to cross my legs when I sat, to keep my legs together so I wouldn't take up too much space (even when pressing them together felt uncomfortable because of the constricted blood flow). I was forced to go for eyebrow threading/waxing starting at just 14. I just wanted to be a kid and chill. Guys in school would harass me and block me in the hallway, and the girls would pick on me for not having my hair perfect or my eyebrows done.
Now, in the workplace, I feel men are just treated better overall. They aren't expected to take meeting notes, set up meetings, plan office parties, or facilitate things. They're given meaningful, interesting work. I know a job is a job and isn't supposed to be fun; it's work, and everyone needs to work hard. I just feel that women are handed more of the "office housework."