Do you believe that health care is a privilege or a right?
I absolutely believe that health care is and should be a right, 100%.
It's outrageous to me that there are people out there who openly admit they would deny other human beings medical treatment just because they don't have insurance. There's this toxic attitude here in America that those without insurance are a nuisance, that they're unimportant or "lazy, not working hard enough," and nothing but a waste of tax dollars, and therefore should be denied proper care. And it goes even further than that: even war veterans aren't being treated after returning home, which is ironic since this country praises the troops only to ignore them when they return, often with physical and mental ailments. There's also the fact that in some places in this country it's becoming illegal to help the homeless. It's disgusting, it's inhumane, and it needs to change. The entire system needs to change.