People are asking whether football has become more violent than ever, but perhaps the better question is whether our culture encourages violence and injury across American sports in general.