The question so many of us fail to ask is: why do we believe that access to health insurance is an entitlement of working people only? (And, while we're at it, only of those who work at a company that offers it.) In other words, how did health care and employment become so entangled? When you think about it, one really doesn't have anything to do with the other except by fairly recent custom. Offering health insurance started out as a perk to attract workers; it was never intended to be the only affordable option. But over time, that is what it became.

The costs shouldered by industry have become a huge burden (GM, which had to be bailed out, was talking about this burden a year ago), and the costs to the employed and unemployed alike are now unsustainable. And instead of government bureaucrats getting between us and our doctors, we have company bureaucrats, driven by a profit motive and more accountable to shareholders than to their customers, getting between our doctors and our health.

It is time for a real change. The vast majority of Americans no longer buy the "Harry and Louise" spin that was rolled out when the Clinton Administration tried to make this change. Too many have lost their jobs, and affordable coverage along with them, or are paying outrageous costs even with contributions from their employer.