Body Positive Books That Don't Feed Into Diet Culture or Toxic Habits


If you purchase an independently reviewed product or service through a link on our website, SheKnows may receive an affiliate commission.

As body positivity, body neutrality and fat liberation make their way into the cultural lexicon of Instagram captions and mainstream magazines, it’s all too easy for bits of diet culture to sneak back in (disguised as “wellness”), keeping the focus on weight loss and shrinking a body rather than on nourishing the person who lives in that body. Not cool; we hate to see it.

That’s why it’s great to see more and more books that kick aside those preconceived notions about health and fitness and instead focus on the ways we can radically love our bodies and make the world more inclusive for every kind of body. If your bookshelf is offering slim pickings when it comes to books that make you feel good and empowered about the skin you’re in, look no further: we’ve put together a grown-up summer reading list to help you consider all the ways your body deserves a little more care and kindness.

Read on for books about nutrition, self-love, wellness and just rocking your best life in your body no matter what society’s obsession with thinness and weight loss tries to tell you.

A version of this story was published in August 2021.