Does anyone know much about tanning, or just being out in the sun regularly? I'm 27 years old and have mostly avoided direct sun exposure for the last 7 years, mainly because "they" say it's so bad for you. I'm really pale these days, and I think I'd not only look better but feel better too if I started getting some sun. Is the skin cancer thing just a medical scare like everything else, or does it really give most people skin cancer?