What the hell is this about? I'm so sick of people doing it. When my friend comes back from the salon and says "You're so white!", I'm like, "Well, it's kind of been winter for six months!" Besides, if it just stopped snowing and you're tan, people are gonna know. Personally, I don't think having darker skin is worth the SKIN CANCER. It's ridiculous that looking "good" is supposedly worth that risk; it's just pointless. Who the hell cares? I never purposely tan. If I'm outside and get one, great; if not, pff, it doesn't matter.