


The US South: A deadly front during World War II


The US Army recently gave a full military funeral to Albert King, a Black private stationed at Georgia’s Fort Benning who was killed by a white military policeman in 1941. With this act, the Army completed its acknowledgement of a racial murder it tried to cover up 83 years ago. What the Army or the nation has never fully recognized, however, is that during World War II, in and around Army facilities in the US South (where 80% of Black soldiers trained), an internal war zone raged, one with its own share of casualties, primarily African American GIs.

Ironically, the Army’s effort to enforce the laudable goal of nondiscrimination during wartime helped to stoke racial conflict and violence on this home front battlefield. During World War II, the Army built a biracial army, but one where all units remained strictly segregated by race. At the same time, the military tried to enforce the mandate enshrined in the 1940 Selective Service and Training Act, which officially disavowed racial discrimination. Segregation and nondiscrimination, however, were incompatible goals, especially in the South, where racial segregation was inherently discriminatory.

The Army’s effort to promote nondiscrimination within its own sphere did challenge existing southern racial practices and drew strident criticism from southern white leaders. In 1940, the Army desegregated its Officer Training School at Fort Benning; Black and white trainees lived and learned together before being assigned to their segregated units. Many Army camps did not segregate sick or wounded soldiers by race in what was often a single base hospital. In 1942, the Army ordered a ban on the use of offensive language when referring to Black soldiers, a directive unevenly implemented and often dependent on the attitudes of individual commanders. In the summer of 1944, largely in response to the racial upheaval in and around military facilities in the South and elsewhere, the Army made its boldest move to embrace nondiscrimination: it declared all recreation facilities and transportation under its control desegregated, although this order, too, was not always fully carried out by the officers charged with implementing it. The Army took these and other steps largely for reasons of efficiency and military necessity. Still, they gave Black soldiers some sense that they were part of a unified effort to defeat America’s enemies abroad and emboldened them to assert their rights as American citizens at home.

During the war, that home was the local communities that surrounded Army camps. While the Army could try to ensure nondiscrimination on base, it had no authority to enforce that principle off base. But the Army could not keep its Black soldiers locked on base; every soldier needed time away from training and from their officers. And outside the camp perimeter, the harsh realities of southern racial segregation remained untouched by the upheaval of war.

As a result, many of the Black casualties of World War II’s “southern battlefield” occurred in the communities located near Army training grounds. In addition to Albert King, African American soldiers killed in the frequent wartime skirmishes in these locales included Henry Williams, a private from Birmingham, Alabama, stationed at Brookley Army Air Field and shot by a white bus driver in Mobile; Raymond Carr, an MP from Louisiana’s Camp Beauregard (and a Louisiana native), shot in the back by a Louisiana state trooper after the lawman told Carr to abandon his post in Alexandria, Louisiana; and William Walker, a private from Chicago, killed by local lawmen while fighting with a white MP just outside the fence of Camp Van Dorn, near the village of Centreville in southwest Mississippi. There were other Black casualties—including some deaths for which we will probably never know all the details, a common occurrence during wartime—as well as hundreds wounded in various beatings and assaults that occurred in the US South’s “war zone.”

Both Albert King and the MP who murdered him in 1941, Robert Lummus, were Georgia natives. Lummus had been at Fort Benning since the spring of 1940, when it was still a white outpost. After the draft began in the fall of 1940, the facility was soon transformed, as thousands of Black soldiers from all over the country arrived at what became one of the country’s largest training facilities. As the US Army began its experiment in promoting nondiscrimination, white soldiers like Lummus remained unmoved. He and others must have believed that Black soldiers at the facility would continue to abide by the South’s existing racial hierarchy. If not, the traditional use of violence to keep Black men in their “place” was a tried-and-true option, even if it meant opening another front at home in the global war of the 1940s.

During World War II, the US Army, through its nondiscrimination efforts, gave African American soldiers a glimpse of America’s racial future. And indeed, the US military would later be the first national institution to abandon racial segregation. The Army’s actions, however, had limits, both within the areas it controlled and certainly beyond. It simply could not change the hearts and minds of most whites, soldier or civilian, overnight.

Feature image: Black soldiers pinning their brass bars on each other’s shoulders, Ft. Benning, GA, 1942. Courtesy National Archives (531137).


 

The Alexander Mosaic: Greek history and Roman memories


Perhaps the finest representation of battle to survive from antiquity, the Alexander Mosaic conveys all the confusion and violence of ancient warfare. It also exemplifies how elite patrons across diverse artistic cultures commission artworks that draw inspiration from and celebrate past and present events important to the community. Specificity of visual imagery (e.g., identifiable protagonists, carefully rendered details, and inscriptions) combined with commemorative intent differentiates historical subjects from scenes conceived generically or drawn from daily life. In celebrating events meaningful to those holding power, historical subjects are propagandistic in that they foster a supremely favorable conception of those responsible for their creation. Yet no matter how carefully makers try to control the message, artworks can acquire an autonomy that permits audiences to construct “memories” of those events never intended.

Strictly speaking, the Alexander Mosaic is a work of Roman manufacture, but most scholars believe it reflects a lost painting described by Pliny the Elder: “Philoxenos of Eretria painted a picture for King Cassander which must be considered second to none, which represented the battle of Alexander against Darius” (NH 35.110). This would date the painting to ca. 330-310 BC, when memories of the battle were still fresh and its propaganda value most effective. That painting may have been brought to Italy as plunder after the Roman conquest of Macedonia in 146 BC. The fact that the mosaic reproduces an earlier work for a later audience forces us to consider the discrepancies between historical narrative and artistic tradition.

All of the surviving accounts of Alexander’s conquests were written against the background of Roman imperialism, and ancient readers necessarily interpreted what they read in the light of the social and political structures that characterized their age. Alexander “the Great” was a Roman creation: the title first appears in a Roman comedy by Plautus in the early second century BC. Since its discovery in the House of the Faun at Pompeii in 1831, scholars have had to reckon with how the mosaic’s imagery, distinctive and clearly recognizable to contemporary viewers, functioned in two very different contexts: first as a fourth-century Greek painting and then as a first-century Roman mosaic. A painting celebrating a Macedonian victory meant something quite different when originally displayed in a Hellenistic palace from what it meant when possibly displayed as war booty in a Roman temple; and the mosaic copy in a Roman private house would carry still different significance. For a Roman audience, the commemorative specificity of the battle scene was probably less important than celebrating the qualities of Alexander’s personality that spoke to them: his ferocity in battle, his charisma, and his military genius. Alexander was as much a part of the cultural memory of Rome as Homeric epic was for Greece, providing a paradigm for Rome’s own military triumphs.

Heinrich Fuhrmann first suggested that the Roman patron of the artwork had participated in the Macedonian Wars, and that this mosaic copy of a spoil of war both signaled his admiration for the “greatest” general and perpetuated the memory of his own role in overthrowing the dynasty that Alexander founded. A Roman viewer might have imagined a broader reenactment of the paradigmatic conflict between East and West, a conflict he may have participated in or merely appreciated through the lens of Roman ideology. Given the Roman taste for the allusive, a history that had become anachronistic could also have been appropriated and meaningfully reused through a cognitive metaphor whereby Roman viewers could have understood their own empire in place of Alexander’s (since Rome had conquered the territories formerly occupied by Macedonia). Roman sources repeatedly compare Roman campaigns on the eastern frontier with earlier Greek struggles. Given that Parthia, which had fought on the Persian side against Alexander, was now Rome’s enemy in the east and that Alexander’s legacy was now Roman, a Roman viewer could easily have identified with the Greeks. Furthermore, the patron who commissioned the mosaic copy belonged to the new Roman ruling class, which appropriated older Greek artworks—the fruits of their conquest—to express social status. The mosaic was prominently featured in a luxury dwelling, itself of a type of Greek origin, whose colonnaded courtyards and reception rooms were sumptuously decorated with other paintings and sculptures meant to impress visitors. Its Roman owner may even have appreciated the Alexander Mosaic as a “work of art”: an image divorced from its original context by its new role in a Roman social performance.

When artworks reconstruct a past in order to explain the present, their makers determine which events are remembered and rearrange them to conform to the required social narrative. Their display provides visible manifestations of collective memories. More than merely passive reflections, monuments with historical subjects reinforce those memories and confer prestige upon them. Divergent motivations were again in evidence after the Alexander Mosaic’s discovery, when various European leaders such as the Prussian King Frederick William IV ordered copies of the copy: was the motivation for such modern commissions the desire for prestige achieved through association with a masterpiece from antiquity, or with the political symbolism of its historical subject?

Featured image: Alexander Mosaic (ca. 100 BCE), Naples, Museo archeologico nazionale. Berthold Werner via Wikimedia Commons.


 

Forgotten books and postwar Jewish identity


In recent years, Americans have reckoned with a rise in antisemitism. Since the 2016 presidential election, antisemitism has exploded online and entered the mainstream of American politics, with the 2018 shooting at Pittsburgh’s Tree of Life synagogue marking the deadliest attack on American Jews in the nation’s history. But this is hardly the first season for grappling with domestic bigotry and racism. Eighty years ago, in the wake of World War II, Americans began addressing some of their own antisemitism and racism problems. They wondered how Americans could fight a war abroad against fascist enemies when they had so many of their own sins of bigotry to reckon with at home. Several popular books—fiction and non-fiction—addressed these issues during the 1940s but are mostly forgotten today. I discuss some of them in my new book, Postwar Stories: How Books Made Judaism American.

Laura Z. Hobson’s bestselling novel, Gentleman’s Agreement (1947), is the most famous of this group of popular 1940s anti-antisemitism novels; less than a year after publication, Agreement was made into an Academy Award-winning film starring Gregory Peck. But Hobson was not alone in thinking and writing fiction about American antisemitism. She was inspired by other successful women anti-antisemitism novelists. As Hobson wrote to her editor, Richard Simon, of the publishing house Simon and Schuster, “Maybe six other authors are right this minute finishing novels on the same subject—maybe not one will do much by itself, but perhaps all together those authors could become a kind of force for ending the complacency of uncomfortable or scared silence which defaults to the rantings of the bigots, who don’t practice that conspiracy of silence at all.”

Several writers were, in fact, working on anti-antisemitism novels. Hobson’s writer-friend Margaret Halsey had published Some of My Best Friends Are Soldiers, a novel attacking racism and antisemitism. As Hobson wrote to Simon, she was also encouraged by the news of the Canadian novelist Gwethalyn Graham’s Earth and High Heaven (1944), a popular anti-antisemitism novel, being serialized in Collier’s magazine. And although Cleveland-based novelist Jo Sinclair (the pen name of Ruth Seid) was farther afield from Hobson’s New York literary circles, by 1946 it would be difficult for Hobson to miss the many New York Times references to Sinclair and her award-winning anti-antisemitism novel, Wasteland, published that year. Through different narrative strategies, these women writers made anti-antisemitism into a subject fitting for popular fiction.

These novels also succeeded in making what had been considered a Jewish problem—something for Jewish communal leaders and defense organizations to worry over—into an American problem that required an American solution.

But it was precisely this approach that made some reviewers critical of what Hobson and other anti-antisemitism novelists accomplished. They asked: where was the Jewishness in these novels? Why had novelists not provided readers with more of an understanding of the religious traditions, rituals, and joyous festivals at the heart of Jewish life? To rabbis and Jewish writers who realized how little Americans understood about the distinctiveness of Judaism, these novels seemed like a wasted opportunity.

Rabbis and other writers invested in Jewish religious life stepped in to fill the void. They seized the opportunity to present Judaism to a readership of Jews and non-Jews. In books with titles such as What Is a Jew? (1953); What the Jews Believe (1950); Basic Judaism (1947); Faith through Reason: A Modern Interpretation of Judaism (1946); and This is Judaism (1944), writers explained the basics of Judaism. In some ways, it is possible to see the anti-antisemitism genre as having paved the way to the “Introduction to Judaism” genre. These primers on Judaism were books and magazine articles that helped explain Jews and their religion to other Americans. In unexpected ways, increased concern over antisemitism led to greater understanding of what it meant to live a Jewish life.

In the past 60 years, the anti-antisemitism novels of the 1940s and the Introduction to Judaism books of the 1940s and 1950s have faded in popularity. These books and articles were very much of their moment. But they forged genres that proved lasting in American culture: anti-antisemitism remained a popular theme in late twentieth-century film, with examples such as School Ties (1992) and Driving Miss Daisy (1989), and the Introduction to Judaism genre continued to flourish in that period, with popular examples written by Anita Diamant, Rabbis Irving Greenberg, Hayim Donin, and David Wolpe, as well as, more recently, Sarah Hurwitz, Noah Feldman, and Rabbi Sharon Brous.

The ideas disseminated by these mid-twentieth-century genres have also had a lasting impact on American culture. Americans continue to be outraged by antisemitic incidents in this country. There remains a huge discrepancy between the era from the 1920s through the early 1940s, described in Postwar Stories, when antisemitism was much more accepted as part of the American Way, and the post-1940s reality, when antisemitism continued but lessened and was increasingly called out and interpreted as an affront to American values. As a result of the mid-twentieth-century “religion moment” described in Postwar Stories, Americans continue to classify Jews as members of an American religion, despite the problems inherent in that categorization: we all know Jews who consider themselves proudly Jewish, but not religious.

Today, we live in a culture that is very much a result of the ideas and attitudes these genres helped to inculcate. With increased antisemitism and questions about the meaning of Judaism during an era when Jewishness has become a more challenging identity, we may find Americans making their way back to these mid-twentieth century genres.

Featured image credit: Dorothy McGuire, Gregory Peck & Sam Jaffe in a scene from the 1947 film Gentleman’s Agreement. Public domain via Wikimedia Commons.


 

The rising power paradigm and India’s 2024 general elections


India, the world’s largest democracy, is holding its national elections over a six-week period starting 19 April. The elections to the 543-member lower house of parliament (Lok Sabha), with an electorate of some 968 million eligible voters, assume critical importance as India goes through both internal and external changes that are heavily linked to its rising power aspirations and achievements. The ruling Bharatiya Janata Party (BJP), led by Prime Minister Narendra Modi, has been campaigning on the claim that under his leadership India’s global status has improved substantially and that he is determined to make India a great power and a developed country by 2047, the centenary year of independence. The growing Hindu middle class seems to agree. According to a February 2023 Pew survey, Modi had a 79% favorable rating. More strikingly, some 85% of Indians surveyed by Pew think a strong authoritarian leader or military rule is preferable to multi-party electoral democracy, the highest share for any country surveyed.

Since its economic liberalization in 1991, India has made substantial progress in terms of comprehensive national power, including both hard and soft power markers, in some areas more than in others, even though it still lags behind China in many indicators of material power and social welfare. The critical factor is a steady economic growth rate of 6 to 8% over the past three decades. The $4 trillion economy, which recently overtook that of its former colonial ruler, Britain, to reach the fifth position in the world, is poised to become number three by 2030. The tactical and strategic advantages India has gained under somewhat favorable geopolitical circumstances are many, but they could easily erode if its soft power foundations, especially democracy, secularism, and federalism, decline even further.


The implications of the elections for India’s rise as an inclusive democratic state are potentially far-reaching. If the BJP wins a two-thirds majority, concerns are heightened that it would amend the Indian constitution, altering its core principles of liberal democracy and secularism, and declare India a majoritarian Hindu state. India’s status advancement in recent years has benefitted the ruling establishment, but Modi’s achievements are built on foundations laid by the previous Congress Party-led governments of Prime Ministers P.V. Narasimha Rao and Manmohan Singh. India’s 2005 rapprochement with the US and its opening to the world, especially to East Asia, Southeast Asia, and the Middle East, occurred during that period. It was Rao and his finance minister, Singh, who opened the Indian economy to the world through their wide-ranging economic reforms in 1991. Economic growth was also very robust during much of Singh’s tenure. Many of the social programs were started during that period, but Modi has improved their delivery by introducing direct transfers and adding new welfare programs guaranteeing the poor subsidized rations and cooking gas. Some 300 million Indians were lifted out of extreme poverty during Singh’s term in office alone, and a similar number may have escaped poverty under BJP rule. Yet some 12% of India’s 1.4 billion people still live below the poverty line (about $2 a day), while 84% have an income of less than $7 a day.


The previous Congress regime’s inability to cash in on its achievements for electoral gain stands in direct contrast to Modi’s success in presenting a different image to the public of India’s economic and military achievements and general international status advancement. Skillful propaganda, especially on social media, has enabled this. India’s swing power role in the Indo-Pacific, in terms of balancing China’s rise and aggressive behavior, has raised India’s geopolitical prominence, and Modi has astutely used it for his own electoral success. He has used contentious religious nationalism, including the building of a temple in Ayodhya on the site of a destroyed Muslim mosque, the repeal of Article 370 of the Constitution (which gave Jammu and Kashmir special autonomous status), and programs offering citizenship to displaced minorities (excluding Muslims) from neighboring Pakistan and Bangladesh, to solidify his support among ardent Hindu-nationalist groups. The 18 million-strong Indian diaspora contains many pro-Hindu groups that have helped Modi’s efforts by offering financial and moral support.

Although the rising power claim may have helped Modi toward a possible third term, there is another side to this story. Some of the BJP government’s internal policies may, in the long run, undercut this status achievement by putting its legitimacy and sustainability in question. The number one challenge is the democratic backsliding that has been happening under BJP rule. Today India is scored at 66 and rated a ‘partly free’ country by Freedom House, and the V-Dem Institute recently demoted India to an ‘electoral autocracy.’ A number of measures curtailing freedom of expression and other essential democratic rights have been taken under Modi, denting India’s democratic credentials, one of its key soft power assets. Similarly, secularism, another of India’s soft power markers since independence, has been eroded by a direct effort to assert the Hindu majoritarianism envisioned by the BJP and its militant ideological parent, the Rashtriya Swayamsevak Sangh (RSS).

The democratic backsliding presages considerable difficulties in legitimizing India’s status as a liberal democratic rising power. The major challenges to freedom of expression, the party’s increasing ideological control of India’s judiciary, the attacks on minority rights, and the harassment and arrest of opposition leaders using governmental agencies such as the Enforcement Directorate all portend the emergence of an illiberal state, even when elections are held periodically. While Hindutva (Hindu-ness), aimed at the hegemony of Hinduism over all other religious groups, has growing sympathy among the Hindu electorate and sections of the diaspora, it has yet to obtain any international traction as an attractive ideology or model for political order, and it has yet to offer a coherent and convincing agenda for the emerging world order.

The father of the nation, Mahatma Gandhi, used Hindu and Buddhist religious ideas such as Ahimsa (non-violence), among others, to develop his model of non-violent struggle. Can Modi, in a third term, make a conscious effort to develop India as an inclusive, democratic state and bring the peaceful and tolerant aspects of Hinduism to the fore? Or will Indian democratic exceptionalism evolve into an entrenched populist majoritarian system, with all its attendant challenges for democratic freedoms, even while India makes substantial material progress? The simultaneous democratic backsliding in many countries, including the US and parts of Europe, does not help India’s prospects in this regard. India may still attain a higher geopolitical position (in the context of China’s rise) and the steady economic growth that would allow it to emerge as a key destination for trade and foreign investment, and a source of a technically qualified workforce and of migrants, for the next two decades or more. India’s greater inclusion in global governance is needed for reasons of equity, for efforts at solving many collective action problems, and for the greater effectiveness of international institutions. The peaceful accommodation of India would alter the historical pattern in which great powers rise and fall through war. Whether that rise will be a peaceful process internally is yet to be determined. The forthcoming elections will do much to set India’s trajectory, both in its domestic politics and in its foreign relations.

Feature image by Graphic Gears on Unsplash, public domain


 

Remembering John Hope Franklin, OAH’s first Black president


The 2024 OAH Conference on American History begins in New Orleans on 11 April, almost exactly fifteen years after the death of the organization’s first Black president, John Hope Franklin. Franklin’s life embodied the conference theme of “being in service to communities and the nation,” and the annual meeting offers an opportunity to reflect on his extraordinary body of work and how it speaks to the present moment.

Franklin’s seven-decade career defies overstatement. He earned his PhD from Harvard in 1941, marched from Selma to Montgomery in 1965, became the president of the OAH in 1975, was named by President Bill Clinton to lead the Advisory Board to the President’s Initiative on Race in 1997, and, in 2006, won the John Kluge Prize from the Library of Congress for lifetime achievement in the humanities. He published over two dozen books and 100 articles. His influence as a teacher and mentor is incalculable.

Franklin came to prominence in the middle years of the twentieth century, and his work during this period, both inside and outside of the academy, continues to resonate as the history profession confronts right-wing attacks on the teaching and study of Black history. In 1947, he published his third book, From Slavery to Freedom, which placed African Americans at the center of a story so long dominated by white figures. Like W. E. B. Du Bois, Rayford Logan, and other pioneering Black scholars before him, Franklin emphasized what serious historians came to accept as an essential fact: Black history is American history. From Slavery to Freedom revolutionized the field; as historian Paul Finkelman writes in his essay on Franklin for the American National Biography, the book “[made] it possible for African American history to be taught outside of historically black colleges and universities.” It would go on to sell over three million copies across nine editions and remains in print today.

A year later, Thurgood Marshall of the NAACP Legal Defense Fund asked Franklin to serve as an expert witness in the case of Johnson v. Board of Trustees of Kentucky, in which African American student Lyman T. Johnson sought to enter the graduate program in history at the University of Kentucky, which admitted only white students. Franklin, with the help of sympathetic white professors at UK, mined official records and showed that the designated Black school, Kentucky State College, did not offer a comparable education. In 1949, the US District Court ruled that Johnson must be allowed to enter the University of Kentucky. In 1998, the university gave Franklin an honorary doctorate.

Marshall called Franklin again in 1953. By this point, the NAACP had won significant victories in Supreme Court cases that eliminated “separate but equal” in graduate and professional programs. Marshall and key NAACP lawyers, including Constance Baker Motley, were readying cases that would force the Court to rule on segregation in primary and secondary schools. He asked Franklin to conduct research on the Fourteenth Amendment to bolster the argument that school segregation was unconstitutional under the equal protection clause. “As only Thurgood Marshall could put it,” Franklin recalled in a 2007 interview with historian Ray Arsenault, “he threatened me in a way that I knew that I was going to be in danger if I didn’t accept his invitation or his command.” Franklin arranged his fall schedule so that he could teach at Howard University Monday through Wednesday morning and then travel to New York to work with the NAACP through the weekend.

The NAACP won the case that became Brown v. Board of Education, but Franklin was “bitterly mistaken, tragically mistaken” in thinking that the Supreme Court’s ruling would force southern officials to integrate schools right away, or, as the Court so vaguely put it the following year, “with all deliberate speed.” Franklin also knew that segregation was not an issue peculiar to the South. He had studied the problem as a historian, but he also had first-hand experience. He moved to New York City in 1956 to chair the history department at Brooklyn College, but he struggled to find a home within walking distance of the school; realtors refused to show him houses because of his race, insurance companies balked at working with him, and banks refused to approve his loan. When he eventually bought a house—thanks to his lawyer’s father being on the board of a bank—he and his family faced constant harassment from white neighbors. The continuous struggle against racial discrimination, past and present, would motivate Franklin’s work for the rest of his life.

Franklin died on 25 March 2009, less than three months after the inauguration of Barack Obama as the country’s first Black president. Bill Clinton spoke at his memorial service. In 2015, at Duke University’s celebration of Franklin’s centennial, Harvard president Drew Gilpin Faust mused, “For John Hope Franklin, history was a calling and a weapon, a passion and a project.” He understood “history itself as a causal agent” and “the writing of history as mission as well as profession.” The OAH conference reminds us to consider Franklin’s legacy. It also allows us to celebrate how historians today have followed in his footsteps, untangling America’s past through honest research and skilled interpretation even as politicians and opinionmakers undermine the teaching of race, slavery, and the diversity of American experiences.

Feature image by World Maps via StockSnap, CC1.0.


 


 
