Comments for: Should health care in America be a right?
America is the only wealthy country in the world that does not guarantee its people access to health care as a fundamental right. More than 15 percent of Americans are uninsured, and many more are seriously underinsured. That was the bottom-line message of T.R. Reid, author of…