OK, so we all know that reality shows are a huge hit in America. I feel like shows like The Bachelorette and Real Housewives portray a twisted image of how life, dating, and marriage should be. Am I wrong to assume that A LOT of women think these are realistic ways to live?
Most Helpful Guy
I think reality shows are ruining all of us and our kids. I grew up on shows like Family Matters ("Did I do that?" lol), The Cosby Show, Full House (don't judge me), etc. Guess what, they all revolved around family values and having solid relationships with friends and family.
Now I'm watching chicks chase men for money or fame. What happened to good old-fashioned love? What a shame.