I believe Americans should have the right to free basic health care. When the world operated in small communities, there was a local doctor or someone who practiced medicine, and that person treated everyone in the community. Whether they could pay or not did not matter; everyone was considered valuable. The same ideal should translate to larger society. Doctors should understand that they have an important role in society, and that role should benefit the whole, not just those who can flash money in their faces. We are all valuable.
I am a firm believer that every man, woman, and child in this country should have the right to receive free basic health care. I also believe that they should not have to go through mountains of red tape or page after page of questions about their personal life history to get it.
No, I do not believe that Americans should have the right to free basic health care, because you do not have a right to another person's services. Other people do not owe you something, just because you happened to be born in the same country. If you want health care, get a job and pay for it.
To treat health care as a thing rather than a service is to overlook that health care is a service industry. Calling it a "right" to have others labor for you implies that, if they refuse to treat you without being compensated, you have a right to their work. That is the definition of slavery (having a right to the labor of others), and that is not the way health care should be addressed. Health care is a service that society values, but we should always recognize that rights relate to autonomy, not to the labor of others.