Most Americans think their country is in decline. In 2018, when the Pew Research Center asked Americans how they expected their country to fare in 2050, 54 percent of respondents said the U.S. economy would be weaker. An even larger share, 60 percent, said the United States would be less important in the world. This should not be surprising; the political atmosphere has for some time been pervaded by a sense that the country is headed in the wrong direction.