Unsurprisingly, the mainstream media's only interest in any of this seems to be the worry that women may become less likely to come forward with accusations against men. Not a word about male victims of female abuse perhaps feeling more comfortable seeking help, even though studies show that men are often the victims of domestic abuse at the hands of women, and that most of them don't feel comfortable doing anything about it.
Are we as a society finally beginning to turn the corner on our overwhelmingly gynocentric view of gender issues?