When I was a child, I was naturally very tan. So tan, in fact, that people didn’t believe I was of completely Eastern European descent. But after learning of some cancerous spots on my paternal grandmother’s skin and, ironically, after moving to Florida as a teen, I decided to take my skin in the other direction. I didn’t spend as much time in the sun, and I used sunscreen religiously. Today, I’m arguably the palest person in South Florida.
Personally, I think fair skin is beautiful. Just look at Liv Tyler and Nicole Kidman to see what I mean. But just about every beauty brand seems to be beckoning me to the dark side with self-tanners and self-tanning moisturizers. Although I think it’s great that people have a safe option for getting golden, I don’t quite understand why they feel the need to do so.
Some of the reasons I’ve heard are pretty interesting. “You just look healthier” is the one I hear most. But I associate tanned skin with skin cancer, so it strikes me as the opposite of healthy. Oddly enough, it’s fair skin that seems to prompt concern from others. “Ever heard of the sun?” is one of many comments I’ve gotten from complete strangers.
So, you tell us: are you a self-tanner or spray-tan fan? Perhaps you still lie out? Why do you feel the need to look tanner than you naturally (and safely) would? Let us know in the comments section below.