Many of us grew up with supposedly silly patriotic notions, including "the thesis that a nation has a primary obligation to tend to its own people." Is national sovereignty dead for America? If borders are meaningless and "egalitarianism for all" is the new highest accepted principle, is there even anything left that we can define as the US? Would it be able to feed, clothe, and provide medical care for all of the world's poorest? How does the progressive left think hard-working middle-class Americans would respond to the loss of Social Security, Medicare, and other (legacy, nationalistic) programs?