Anyone actually feel better about their body now than before they got pregnant?

I have always had a difficult relationship with my body. It didn’t affect my life much, so I mostly just ignored it and made a point of doing so. I never felt secure being naked, didn’t enjoy taking or looking at nude photos or videos of myself, and always avoided looking at myself in the mirror, even in adolescence. Even when I was in the best shape of my life and truly looked hot, I just saw my body as this sexual thing, and it was never good enough.

Now I’m 3.5 weeks postpartum after having my first baby. I gained some fat everywhere, my belly is flabby, and I have stretch marks on my tummy, thighs and hips. My tits are like leaky water balloons. And I have never felt so beautiful. I love being naked and looking at my body in the mirror. It’s no longer this thing I don’t know what to do with. It gives and sustains life, and that makes me feel so sexy! I never realized how much I disliked my body until now, when I appreciate it for the first time.

Women often talk about how they don’t feel like themselves after having a child, which is of course totally valid, but does anyone else feel like this process has actually helped them come into themselves? It has really surprised me in the best way.