People usually follow a religion because their parents indoctrinated them into it, or because that religion is dominant in their country. Doesn't this speak loudly against religion? Doesn't it show that "faith" is not really faith at all, but indoctrination, popularity, or a desperate human need to feel that something lies beyond death, or that something controls everything in the universe? If the answer to these questions is yes, then why practice your religion publicly? Shouldn't it suffice for religion to be a warm blanket, a private tool for coping with everyday life and setting your moral guidelines?