I’ve always thought of myself as someone who is in touch with their body. The cancer diagnosis made me realize just how untrue that is.
And I’m kind of pissed off about it. I think the society I live in (the US) doesn’t support ownership over and connection to our bodies. Or at least not my connection to mine. Everywhere, the messaging is that how we see our bodies is wrong, that we constantly have to change them, that they are for other people and for products, not for ourselves.
Cancer has felt like my body’s ultimate betrayal of itself. It is my own cells, after all, growing amok. Maybe it was just attention-seeking, my body crying out to my brain to pay attention to it, to stop letting everyone else (advertisers, bosses, patriarchy, gazers, etc.) control how I use and view my body.
So is it possible that cancer is trying to teach me to reclaim my own body? Maybe.