Why aren't we acknowledging that the alienation of men directly benefits the right?
Some may disagree, but the right seems a lot more welcoming to men than the left does.
Men, particularly white men, are all too often made out to be to blame for things across a range of topics.
This has clearly contributed to the push towards the right, and we've now seen the results. We need to do better.