The Left Behind
In weighing the drearier and more inspirational aspects of their tale, the co-authors of Radicals in America: The U.S. Left Since the Second World War tend, on balance, to emphasize the positive. As they argue in their introduction, although the “radical left has always been a minority current” in the United States, it has “propelled major changes and frequently given shape to what Americans broadly take as the nation’s core traditions.”