Question by: Maltese Mom
Companies that provide health insurance deduct what they pay as a business expense, and their employees don’t pay taxes on a huge benefit. That makes health care nearly free for those with employer-paid health insurance. Why shouldn’t your employer pay for your auto insurance, too? Why is this fair to those who have to pay their own medical bills? What type of socialism is this?
Cogent arguments only, please.
Best Answer:
Answer by: Independent Voter
Good question. Probably because the government is behind on health care. That’s why we need reform.