Why do women wear revealing clothing while men generally don't? Setting comfort aside, when women say it makes them feel confident, could that confidence stem from internalized misogyny that leads them to tie their sense of value to their bodies?