I never wear heels. I mean, never. They are banned from my wardrobe. No big deal, right? Think again. Turns out there is an underlying cultural norm that we American women are taught from a very young age: heels are empowering. Want to look good? Wear your heels. Want to look professional? Wear your heels.