Just so everyone knows, this is NOT meant as a knock against Christianity. My father and I got into this discussion recently, for the dozenth time, and because of recent events in my life, it became a bit more heated than usual. As many of you know, my family is very Catholic, and I moved away from it and converted to Wicca, mainly because the religion was too conservative for me, knowing what the Catholic church teaches about issues like homosexuality, priests being celibate, the role of women in society, etc.

When Jesus was around, he spent his entire life preaching to love everyone equally, regardless of their social standing, despite what the Old Testament said about what prophets claimed God said to them. Even right after his death, apostles who wrote letters about Jesus reverted to some of the Old Testament thinking about these taboo issues, but Jesus never once knocked any group of people, except for the hierarchy who carried a 'holier than thou' attitude. Now, the Church seems to continue the teachings of the Old Testament, and the teachings from after Jesus' life, about segregating or persecuting certain 'groups' because of their lifestyles or because of who they are, even though Jesus never seemed to share this opinion.

So, the question is: are Church (aka organized Christian religion) and Faith two different things, or the same?