During this election period we've noted a frightening level of ignorance in America. Many people interviewed at political rallies show little knowledge of history, geography, or religions and cultures different from their own. I saw one video clip in which a man said, "I don't know about electing a Nigra [sic]; after all, this is a Christian country." Now, where would you begin with this guy?

He's far from alone. In The Age of American Unreason, Susan Jacoby relates that a quarter of Texas' high school biology teachers believe that humans and dinosaurs were contemporaries. Evidently we're burdened with an old and endemic anti-intellectualism. My cynical side predicts that during my lifetime I'll be called an "elitist" because I can read.

Popular ignorance threatens democracy. Unless we halt this creeping disdain for knowledge, we'll lose our nation as we know it. So here's my question: do you know of any ongoing or planned initiatives designed to educate Americans about the rest of the world and America's place in it?