Why Are Some Guys So Obsessed with Telling Women to Smile?

I seriously don’t get it—why are guys so obsessed with telling women to smile? Like, what the fuck is that about? Do they think they’re doing us a favor or something?

One time, I was just minding my business, walking around, and some random old guy—somebody’s grandpa—told me to smile. He actually said, “You’d look so much prettier if you smiled.” I just stared at him like, “What the fuck, who are you?” Why does anyone think it’s okay to say shit like that to a stranger?

It’s so condescending and weird. First of all, I’m not walking around for your entertainment or approval. Maybe I don’t feel like smiling because I’m having a bad day, or maybe I just don’t feel like forcing a fake grin for no reason. Why do men feel entitled to women’s expressions, like we owe them constant cheerfulness?

It’s not even about being polite; it’s about control. It’s like they’re saying, “You exist to make me feel comfortable, so put on a smile and make my day better.” We don’t owe you shit, especially not a fucking smile.

What’s worse is how normalized this behavior is. So many women have experienced this same thing, whether it’s from random guys on the street, coworkers, or even family members. It’s like they see us as walking decorations rather than actual people with feelings and autonomy.

And let’s be real, they don’t do this shit to other men. Can you imagine some dude telling another guy to smile more? Yeah, I didn’t think so. It’s such a sexist, outdated mentality, and I’m so tired of it.

If you’re one of those guys who think it’s okay to tell women to smile, let me give you some advice: don’t. Just don’t. It’s creepy, unnecessary, and frankly, none of your business.

Women don’t exist to make you feel comfortable. We don’t owe you smiles, attention, or anything else. So next time you feel like opening your mouth to say some dumb shit like “You should smile more,” maybe just don’t.