‘Jim Snow’ Has Begun to Melt

The United States has had a majority white population and a minority black population since the nation was born a quarter of a millennium ago. The history of social relations between the two races can be divided into three periods, each lasting many decades.

The first period ran from the founding to the end of the Civil War and was characterized by citizens—mainly white but including some blacks—owning black slaves. This was not lawful everywhere, and slavery was a hotly debated issue in the decades leading up to the war. By 1861 there were 19 free states and only 15 slave states. Even in the latter, of course, not all white families, and very few black ones, owned slaves; the share of slaveholding families seems to have averaged around one in three or four. Nor were all blacks enslaved: nationwide, about 8 percent of all blacks were free.

The second period was the Jim Crow era, from the end of Reconstruction in 1877 to the Civil Rights Act of 1964. Black Americans were second-class citizens under Jim Crow: by law in some states, by widespread social attitudes elsewhere (although not everywhere).

The last period, from the Civil Rights Act to the present day, began with a determination that black Americans be treated fairly, on a level with all other citizens. The act allowed courts to order “such affirmative action as may be appropriate” to remedy intentional injustices in all kinds of personnel selection. A presidential executive order the following year required companies with more than 50 employees to have affirmative action hiring programs when seeking government contracts. 

Affirmative action quickly won out over the color-blindness that many civil rights campaigners had hoped would prevail. The easiest way for employers, college admissions officers, and civil service recruiters to stay out of trouble under the new dispensation was to give preference to blacks over nonblacks in hiring, admitting, and recruiting. Further late-1960s legislation, crowned by the Supreme Court’s 1971 decision in Griggs v. Duke Power Co., gave us the odious doctrine of “disparate impact,” according to which personnel selection procedures—aptitude tests, for example—that select disproportionately few blacks may be unlawful, regardless of any provable intention to discriminate.

Ensuing resentments among white citizens were not softened by the promotion, on the part of our cultural, social, and political leaders during this third period, of white guilt about what happened during the first two. We are told that the disadvantages—real and imagined—endured by blacks of those eras sufficiently justify the disadvantaging of whites today. White Americans who believe that they themselves are today’s second-class citizens sometimes refer sarcastically to this third period as the “Jim Snow” era.

There have been signs these past few months that Jim Snow may have run its course. One very suggestive sign was the case of white Minnesota mom Shiloh Hendrix, who on May 1 was in a children’s playground in Rochester, 80 miles south of Minneapolis, bearing her infant child on her hip. A black child in the charge of a Somali man tried to plunder the diaper bag that Hendrix was carrying for her own child. Angry and indignant, Hendrix insulted the Somali gent with the taboo n-word, more than once. He filmed her doing so and posted the clip to social media. Hendrix got hate mail and decided to change her address. To help finance the move, she opened an online crowdfunding account. By the end of the month, she had raised more than $780,000.

There have been other signs of whites shedding their guilt. Until the day before yesterday, the phrase “black fatigue” referred to blacks weary of suffering white racism. It now more often shows up in social-media posts made by whites who are tired of misbehaving blacks. I find myself wondering if past violations of race taboos—the mid-1990s fuss over The Bell Curve, for example, or the 2013 defenestration of celebrity chef Paula Deen for admitting to using the n-word in the past—would generate as much noise today. Already, at a distance of only five years, the nationwide hysteria that followed the death of George Floyd looks more and more bizarre, like one of those ergot-induced dancing frenzies that sometimes seized villages in Late Medieval Europe.

Black Americans may be getting on board with the decline of Jim Snow, too. In last November’s election, Donald Trump won 20 percent of the black vote. In 2020, it was 13 percent; in 2016, 8 percent. Is it too much to hope that we may be coming together at last, black and white, equal citizens under the law, judged as individuals, not as members of racial groups? 

Then, just as I was pondering this hope, there landed on my desk Jason L. Riley’s new book, The Affirmative Action Myth, with the subtitle: “Why Blacks Don’t Need Racial Preferences to Succeed.”

Riley is black, a seasoned journalist, and the author of several books. The main thrust of this one is right there in the subtitle. Affirmative action, Riley argues, has done nothing for black Americans as a group. Nor, he further argues, have blacks been helped by other, later features of the Jim Snow era: DEI programs, the teaching of Critical Race Theory, or campaigns for reparations.

A running theme in the book is that the social progress of black Americans was faster under the Jim Crow dispensation than it has been under Jim Snow. Riley writes:

During the first two-thirds of the twentieth century, well before affirmative action and an expanded welfare state supposedly came to the rescue, black people experienced significant progress. Education gaps narrowed, incomes rose, and poverty declined. This history hasn’t received the attention it deserves because black politicians and activists have a vested interest in a narrative that accentuates black suffering. The upshot is that a history of social and economic advancement that should be a source of pride for blacks—and a source of inspiration for other minority groups—has received relatively little consideration.

Riley describes—with, indeed, pride—the rise of a robust black middle class through the later Jim Crow years, beginning with the great movement of black Americans from rural South to urban North during and after World War I. A key driver of that rise was what he calls “responsibility politics”: conducting one’s social interactions according to bourgeois norms and values. He laments that in the post-civil rights era, “responsibility politics, as a tactic, has fallen out of favor, notwithstanding its demonstrable effectiveness and popularity.”

That brought to my mind the recent experience of UPenn law professor Amy Wax. In 2017, Wax cowrote an opinion piece with another academic for The Philadelphia Inquirer, calling for social improvement through “the re-embrace of bourgeois norms” such as marriage, hard work, patriotism, and respect for authority. One of the paragraphs in that essay opened with the sentence: “All cultures are not equal.” There followed a list of three cultures “not suited to a First World, 21st-century environment.” Third in that list was “the anti-‘acting white’ rap culture of inner-city blacks.”

That was the only reference to blacks in the entire 840-word article, but in 2017—peak Jim Snow—it was enough to get the writers in trouble with the guardians of propriety. Wax’s troubles continue to this day.

Taking the subtitle of his book and running with it, Jason Riley shows that affirmative action has not been merely a zero for blacks; it has been a net negative. Most obviously, it has put every black professional under suspicion of being an affirmative-action hire of mediocre competence. Riley writes about his own experience under that suspicion. “No one with any self-respect wants to be perceived as a token,” he writes.

The most serious negative consequence of affirmative action has been in college admissions, where it has acquired a name: “mismatch.” That comes from the title of a 2012 book by Richard Sander and Stuart Taylor: Mismatch: How Affirmative Action Hurts Students It’s Intended to Help, and Why Universities Won’t Admit It. The mismatch is between a young person’s ability to digest four years of college-level material and graduate, and the willingness of elite universities to admit him. His SAT and ACT scores indicate the first; the second, under the affirmative action regime, depends on his race.

A black student with merely above-average test scores will likely be admitted by high-ranking universities keen to “diversify” the faces in their promotional brochures; a nonblack student with the same scores will likely not. Riley gives us the numbers. Quoting a 2018 New York Times report, he tells us:

One analysis showed, for example, that an applicant to Harvard with typical credentials had a 25 percent chance of admission if he was Asian. But if you left the credentials the same and simply changed his race to black, the chances of admission climbed to 95 percent. At other selective schools, such as the University of North Carolina, the racial differences in chances of admission were even starker.

Too many of the black students thus admitted will drop out without degrees after finding they can’t handle the coursework, when they could have been accepted into a lower-ranked college, graduated successfully, and embarked on a satisfying, socially useful middle-class career. 

It is therefore not surprising that the academy has been the arena for key battles over affirmative action, with student admissions the usual point of contention. Jason Riley tracks for us the three best-known cases to have reached the U.S. Supreme Court: Bakke in 1978, Grutter in 2003, and Students for Fair Admissions v. Harvard in 2023. Contrary to my own vague impression that Jim Snow peaked between five and ten years ago, the judgments handed down in those cases seem to show a slow leak of gas from the affirmative-action blimp over the entire 45-year period. I shall stick with my impression, though: the Supremes operate under principles all their own.

In the Bakke case—properly Regents of the University of California v. Bakke—the Court ruled that race could be used in college admissions so long as it was not the determining factor in selection. Twenty-five years later, in Grutter v. Bollinger, it upheld that principle, although with the qualification, “The Court expects that 25 years from now, the use of racial preferences will no longer be necessary to further the interest approved today.” It will be interesting to read commentary on Grutter in 2028.

Thence to 2023, and the ruling in Students for Fair Admissions v. Harvard, which at last struck down race preferences in college admissions. Later statistics, however, suggest that some elite institutions—Riley names Duke, Yale, and Princeton—are defying the Court.

Altogether absent from The Affirmative Action Myth is any reference to the human sciences. The author makes no mention of the great advances in our understanding delivered by those sciences in the last few decades, especially in genetics and neuroscience. 

Those advances have solidly affirmed the biological reality of race. Large, old, and mostly inbred populations separated from each other for thousands of generations will diverge in the statistics of heritable traits. These will include the BIP traits—behavior, intelligence, and personality—all of which are to some degree heritable.

In regard to West Indian blacks, for example, Riley tells us:

Comparisons between native and immigrant populations are complicated by the fact that immigrants are self-selected, meaning that they tend to be more able and ambitious than those who choose to remain in their homeland. One way to control for selectivity bias is to focus on second-generation immigrants who did not make the decision to migrate.

He then quotes statistics on educational attainment, arrest rates, and household incomes, showing that native black Americans compare unfavorably with second-generation West Indian blacks. The cause of those differences, he suggests, is the misguided policies of the Jim Snow era here in the U.S., which have produced decades of retrogression among American blacks.

To a person acquainted with the human sciences, the expression “genetic confounding” will have come to mind at some point while reading that quoted passage. I agree it is probable that black West Indian immigrants to the U.S. are indeed more “able and ambitious” than the average in their home population. (In terms of BIP traits, they are (B) more law-abiding, (I) more intelligent, and (P) more open-minded.) Biology dictates, however, that they will tend to have offspring similarly inclined. Yes, those offspring will regress somewhat toward the black West Indian mean—and their offspring will regress even further—but the comparison with our native blacks is still unfair. Selectivity bias has not been controlled for.
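
To put a rough number on how much of that parental self-selection should persist into the second generation, here is a minimal back-of-the-envelope sketch using the standard breeder’s equation from quantitative genetics; the heritability value and the one-standard-deviation parental advantage are assumed purely for illustration and come neither from Riley’s book nor from the studies he cites.

% Breeder’s equation: expected regression toward the home-population mean.
% R   = expected offspring deviation from that mean
% S   = parental deviation from that mean (the self-selection advantage), assumed 1 SD
% h^2 = narrow-sense heritability of the trait, assumed 0.5 for illustration
\[
  R \;=\; h^{2} S \;=\; 0.5 \times 1\,\mathrm{SD} \;=\; 0.5\,\mathrm{SD}
\]

Under those assumed figures, the immigrants’ children would be expected to retain about half of their parents’ advantage over the home-population mean rather than fall all the way back to it, which is why comparing them with native-born black Americans does not, by itself, remove the selectivity bias.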

The author’s bio-denialism sometimes descends into blank-slatist woo:

Racial differences in test scores are less a reflection of innate intelligence or class status and more a reflection of a young person’s developed academic capabilities and cultural upbringing.

Uh, wouldn’t “developed academic capabilities” depend rather strongly on innate intelligence?

If the Jim Snow era really is in its closing days, what will come next? Jason Riley hopes for the colorblind meritocracy that advocates of equal citizenship under the law have long been calling for, and for the emergence of a sturdy black middle class fully engaged in American life. I share that hope.

However, cold biological reality will insist that even such an ideal settlement must exhibit racial disproportions. In a fully meritocratic society, black Americans will still be overrepresented in prison populations and underrepresented in the graduating classes of our best universities. The statistical trends are too plain to ignore in both cases, as Charles Murray demonstrated in his 2021 book, Facing Reality: Two Truths About Race in America. Will we ever be fully at ease with that?

https://chroniclesmagazine.org/reviews/jim-snow-has-begun-to-melt