Let's face it: shoes are a necessity in the United States. Can you imagine walking through a store or another public place where everyone is barefoot? You won't see that very often in this country. Even so, many people advocate barefoot living, claiming that it's healthier than covering the feet, even when walking out on the