No. There is no universal health care in the United States. President Obama signed the Patient Protection and Affordable Care Act (ACA) into law in 2010, but it does not establish universal health care. Primarily, it requires health insurers to cover all applicants regardless of pre-existing conditions, while requiring everyone to carry health insurance to help offset insurers' added costs.

You can shop for private plans on state exchanges. If you cannot afford a private plan, you may qualify for government subsidies or for Medicaid under the ACA's expanded eligibility rules. True universal health care, though, doesn't appear to be coming anytime soon.