I've always gone barefoot, but I sometimes felt ashamed and insecure about it. Now I've finally realized that it really is healthier, and I know how to deal with people who "look down on me" for going without shoes. If asked, I'd just tell them that going barefoot fits with my desire to live a healthier lifestyle, the same way my interest in herbology and natural foods does. I'm getting more confident about it and caring much less what others think. I can't wait for summer.