Since the 1960s, the U.S. has become a more inclusive country.
This necessarily meant that white men lost some part of their privileged positions in education, employment, and entertainment. By the 2020s, in the wake of the "Black Lives Matter" movement, anti-racism books were on best-seller lists, major...