For health care in America to be sustainable, experts say, patients will need to accept less care and providers will need to earn less money.