America needs to have a serious, objective conversation about the damage Ronald Reagan did to this country. Sure, Nixon was a crook and W was an ass, but Reagan and his policies are more to blame for the current state of things than any other president of the postwar era. Prior to the Great Depression and the policies of FDR, America had a dog-eat-dog, social-Darwinist, every-man-is-an-island mentality. Compassion was not the federal government's job. If you were poor or sick, that was YOUR problem, not anyone else's. The Great Depression changed all of that.