My brother, who lives in Germany, was thoughtful enough to send me a link to a CNN article about the difference between how Americans and Europeans view nudity. I had a very similar experience when I was living in Europe: being there gave me a sense of ease with the naked body and taught me that it's just not a big deal. On European television, violence is what they try to shield their kids from, not nudity. What a concept!
Here's the link: article