Even if you didn’t ‘read English’ at university yourself, you almost certainly know plenty of people who did, and more or less everyone has had to study English literature at school at some point or other. As a subject, ‘English’ (an adjective masquerading as a noun) has been central to educational arrangements in Britain for well over a century, seeming for much of that time to occupy a privileged place in the wider culture as well.
Yet literature may seem the most unlikely candidate for becoming a recognized academic discipline. For the most part, science and scholarship have operated with implicit canons of enquiry that have emphasized objectivity, verified knowledge, causal analysis, and impersonal, replicable forms of argument and presentation. But the reader’s encounter with works of imaginative literature does not easily lend itself to such treatment, involving instead subjectivity, degrees of responsiveness, evaluative judgement, and highly individual forms of imaginative re-creation.
As a result, there was initially scepticism about, even considerable resistance to, the idea that the study of vernacular literature might merit a place alongside the new disciplines being established in the expanding universities of the nineteenth century, and even when it had secured a foothold in the curriculum it continued to be derided in some quarters as ‘a soft option’. Surely the reading of enjoyable works of literature in one’s native language, so the objection went, was an activity to be pursued in one’s leisure hours? A university concerned itself with matters of exact scholarship and rigorous reasoning, as in the established disciplines of Classics and Mathematics: appreciation of the beauties of poetry had no claim to rank alongside these strenuous exercises, and, besides, it was clearly impossible to devise an objective way to examine achievement in such a personal, even emotional, activity.
So how did the improbable marriage of beauty and the footnote come to pass; or, in other words, how did English, despite these and other objections, establish itself within British universities so successfully that by the beginning of the 1960s it could sometimes be spoken of as the ‘central’ subject in those institutions—even, in some hard-to-define way, as central to the culture at large? The answer to this question cannot take the form of a seamless narrative. We need, for example, to think about some of the larger enabling contextual conditions—the prior reverence for an established canon of English literature, the authority of Classics as a model and a rival, the formative role of history and philology as exemplars of serious scholarship. We also need to examine the relevant institutional developments between the late eighteenth and mid-twentieth centuries: how far was the Scottish tradition of teaching ‘rhetoric and belles-lettres’ a genuine precursor of ‘Eng Lit’; what were the early civic universities actually like; why were Oxford and, especially, Cambridge comparatively late in establishing courses in English; why was English disproportionately prominent in the institutions founded for the higher education of women; and how did these developments relate to what was going on in schools?
Shifting the focus, we need to think about the roles played by some of those who are regarded as among the ‘founding figures’ of the discipline—some who are well known, such as Matthew Arnold and A.C. Bradley, but also some who are not, such as John Churton Collins, George Saintsbury, Walter Raleigh, and Arthur Quiller-Couch—as well as thinking about the status of the ‘professorial estate’ more generally, looking at its economic circumstances, its recruitment patterns, and so on. And what about the everyday institutional forms—the departments, journals, professional associations, and the rest? They can’t be left out of the story, can they?
Once we’d done all this, we’d be in a position to challenge the conventional accounts of ‘the rise of English’, showing, for example, that I.A. Richards’s supposedly transformative effect on the discipline was in reality more limited, and that the vogue for ‘criticism’ spread more slowly and more unevenly than has been assumed. In fact, we would eventually discover that most English departments at the beginning of the 1960s still had very traditional-looking syllabuses.
At present, ‘Eng Lit’ is widely seen as a discipline in crisis, with reductions in courses and even closures of whole departments being reported across the country. These problems are systemic and there is no one answer to them, but whatever view we take of the current position and future prospects of the study and teaching of English literature, the essential starting point has to be a more adequate account of the history of the enterprise, one that does not reductively depict it in either sinister or salvationist terms.
UK health and social care systems are world leaders in so many ways. From pioneering medicine and treatments to providing social care grounded in social justice, the system does a great job of supporting the health and additional needs of some of the most vulnerable individuals in society. However, there is no doubt that UK health and social care systems are experiencing significant stress. Virtually every week we hear of new initiatives from political parties about how they will save the system, or how record amounts of money are being put into the NHS.
The health and social care workforce face difficulties at almost every turn. They are often blamed when serious and distressing events occur, despite doing everything in their power to support those experiencing distress. They struggle with workload, with job satisfaction, and with coping in the wake of extreme events, all against the backdrop of the UK Covid lockdowns, when we were implored to stand on our doorsteps and ‘Clap for Carers’ even as carers themselves were being disproportionately affected by Covid.
The Political Blame Game
In late 2023, the former UK prime minister stated that “we were making progress on bringing the overall numbers [of those on NHS waiting lists] down—what happened? We had industrial action and we got strikes”. Yet NHS waiting lists have risen steadily since 2012, with marked increases during and after the Covid lockdowns, and December 2023 saw some of the longest waiting lists ever (albeit with a small decline that month). Still, the blame fell on the workforce for a problem that had been growing year on year for over a decade.
Far too often health and social care workers are blamed. The decision of the Conservative government to prevent social care workers from bringing their families to this country from abroad, for example, treats the very immigration that is needed to keep the care system afloat as a problem. Indeed, nearly one in five of the social care workforce are international workers, and The King’s Fund suggests that without them the sector would struggle to function. As such, governmental actions have inevitably had knock-on effects on the availability of care provision in this country.
We need a political system that supports and guides health and social care workers—not one which demonises and detracts from them.
I would venture to suggest that, even if our health and social care workers could have regular, decent wage increases, what would make more of a difference is decent support, at a level which provides the resources they need to make a difference. After all, study after study has shown that this is why they join the sector—to make a difference in the lives of the ill and vulnerable people who live in their own communities.
So what do we need to do to support our health and social care workforce? Well, firstly, claps don’t work. While they started as a nice gesture, they do not make up for the political, societal, and organisational issues highlighted above. We need better investment in, and support for, the workforce that is so vital to the UK and beyond. We need to give health and social care workers the resources they need to make a real difference. This will reduce turnover, improve satisfaction, and reduce sickness absence.
This year, as usual, on either Remembrance Sunday or Armistice Day, many people in the UK will gather at a local war memorial to remember the country’s war dead: those of the two World Wars and of other conflicts since 1945. Lines from Laurence Binyon’s famous 1914 poem “For the Fallen”, beginning ‘They shall grow not old, as we that are left grow old’, will be read, and its promise, ‘We will remember them’, will be intoned by the assembled as a civic duty. The whole commemoration has such an air of eternity about it that it is easy to forget that remembrance has a history, and that it was not ever thus.
Many of Remembrance’s rituals, including poppies and the Two Minutes’ Silence, go back to the Great War. The time and date chosen are a deliberate marker of the end of that war, the guns falling silent at 11am on 11 November 1918. And yet in 1945 there were national debates about whether to inaugurate a separate commemoration for the fallen of the Second World War, with a host of competing proposals, including Victory in Europe Day (8 May) and Battle of Britain Day (15 September). Ultimately, Armistice Day or its nearest Sunday triumphed, out of a desire to link together the sacrifice of the dead in both wars as undertaken for the same principles against the same enemy.
The erection of local war memorials, now a seemingly fixed feature of almost every community in the UK, had a more contentious history, for they were substitute grave sites given the government’s policy of refusing to repatriate the war dead—in previous wars the wealthy had been able to return the bodies of their loved ones for burial in Britain. A long and sometimes acrimonious campaign was waged by those who wished to bring back the nation’s sons. The Countess of Selborne branded the ‘conscription of bodies’ a ‘tyrannical decree’ showing ‘contempt of liberty’, but the government was unmoved. Noting that only the wealthy few could pay to bring home their dead, it clung to a principle of equal treatment to represent a common sacrifice. The official ban on repatriation remained until the Falklands War in 1982.
By contrast, the most striking feature of modern war memorials, the naming of the dead, met with popular support. It was a vast exercise in bureaucracy. Overseas, principally in Flanders, the names of 1,075,293 British and Imperial soldiers were carved in stone in the cemeteries and memorials of the War Graves Commission (another invention of the war, founded in 1917). This exercise in hyper-necronominalism (to adapt the phrase of the historian Thomas Laqueur), the naming of the dead, was paralleled at home by local communities erecting their own war memorials. Committees were established, names collected, and decisions made not just about the form of the memorial but also about whom to include (a particular issue was those who died of their wounds after November 1918) and in what order, alphabetical or by rank.
The scale of the memorialization effort is notable—Rudyard Kipling compared it to the erection of the pyramids by the Egyptian pharaohs—but that has often obscured its roots. Naming all the dead, rank-and-file alongside officers, was not new in 1914. Significant efforts had been made in the Boer (or South African) War of 1899–1902 to erect graves and memorials naming all the dead, and that was merely a development of earlier practices, including those of the Crimean War, 1853–6: by the end of that conflict, British forces had created 120 war cemeteries of varying sizes along the western shores of the Black Sea, most of them identifying the buried by name or initials. Naming the dead was a developing tradition across the nineteenth century, not, as is often believed, a new form of memory for new forms of industrial slaughter in the twentieth century. A contrast is sometimes drawn between the anonymity of the rank-and-file dead of the Battle of Waterloo in 1815 and the preservation of the names of the dead from the Western Front a century later. But perhaps the first British war memorial to name all of the dead, officers and ordinary soldiers alike, comes from Waterloo: that of the fallen of the 12th Light Dragoons, naming 2 sergeant-majors, 4 sergeants, 3 corporals, and 38 privates, which by 1823 had joined a host of memorials in Waterloo church. The roots of Remembrance Day stretch back through the trenches of the First World War to another conflict in the soil of Flanders a century earlier.
An appreciation of the slowly evolving history of war commemoration and remembrance may better equip societies to face the challenges of future conflicts, notably the extensive use of drones and the vastly increased scale of civilian casualties since 1918. For questions of how best to remember are a key part of how to comprehend and perhaps even how to prevent war. Remembering, as they say, is always about the future.
Winston Churchill was born at Blenheim Palace, Oxfordshire, on 30 November 1874. His exploits as Prime Minister during the Second World War left an indelible mark on history. To celebrate 150 years since his birth, we have collated the latest research on Oxford Academic so you can read more about Churchill’s life. Whether you’re a history enthusiast or a curious reader, this collection offers a deep dive into the life and times of a figure who shaped the modern world.
This radical re-interpretation of British history and British Conservatism between 1939 and 1945 reveals the bold, at times utopian, plans British Conservatives drew up for Britain and the post-war world. From proposals for world government to a more united Empire via dreams of a new Christian elite and a move back to the land, this book reveals how Conservatives were every bit as imaginative and courageous as Labour and their left-wing opponents. A study of political thinking as well as political manoeuvre, it goes beyond an examination of the usual suspects—Winston Churchill, Neville Chamberlain, etc.—to reveal a hitherto lost world of British Conservatism and a set of forgotten futures that continue to shape our world.
After a long and prominent career in the British parliament and membership in several British cabinets, Winston Churchill became prime minister in 1940 as World War II was going badly for Britain. He rallied the country with eloquence, expressing a determination not to give in to Nazi Germany but rather to fight to the end. He also set about cultivating a relationship with the American president, Franklin D. Roosevelt, with an eye to securing American assistance and ultimately American participation in the war against Germany.
In Yugoslavia, where Churchill had apparently imposed a clear-cut choice in December 1943, British policy was the subject of lengthy discussions. The problem here was what to do with Mihailović. The Soviets had scored an important point when Churchill shifted British support from Mihailović to Tito, but there too the game was far from over. Churchill asserted that Mihailović should be dismissed immediately and all British missions to the Chetniks withdrawn. Eden, on the other hand, thought it would have been sensible to achieve an agreement with Tito before throwing Mihailović overboard. Churchill had, in essence, failed to understand who Tito really was and what he wanted.
“Naval tradition? Naval tradition? Monstrous. Nothing but rum, sodomy, prayers and the lash.” When Winston Churchill was in charge of the Royal Navy from October 1911 to May 1915 he sought to make drastic reforms, coming into conflict with the naval officers over the traditions of the Royal Navy. Churchill was not just a major architect of welfare reform as President of the Board of Trade and as Home Secretary, but he also continued to push a radical social agenda while running the Navy.
This is not yet another biography of Winston Churchill. It is instead an innovative study of how and why we think what we do about the figure we call ‘Winston Churchill’—and how generations of politicians, historians, and dramatists have manipulated this figure for their own ends. It is a book for those interested in ‘Churchill’ and how this figure has been put to use—as well as Britain’s past, present, and future.
The idea of a “special relationship” between Britain and the United States was articulated by Churchill after World War Two had ended, but for most of its history the relationship between the two nations was often as distrustful as it was friendly. This book tells the story of how a British and American scientific and technological partnership, one that started not long after Britain had lost its ally France and stood alone against Nazi Germany, developed, on an industrial scale, innovations that could not have been imagined before the conflict began.
As First Lord of the Admiralty during the First World War, Churchill oversaw the Gallipoli campaign. As the Western Front developed into a stalemate, Prime Minister Asquith announced a full review of strategic policy to be held during the first week of January 1915. There were major disagreements over strategy (within both army and navy high commands) and much lobbying ensued, with Churchill front and centre of the debates.
Churchill pursued two traditional lines of British foreign policy. He sought to maintain British control over the Mediterranean as the vital connection with its imperial holdings in North Africa, the Middle and Far East. Equally, he opposed Hitler’s expansion as a threat to the balance of power on the continent. He negotiated with Stalin to secure British preponderance in Greece and supported Tito’s Partisans as the most effective resistance in Yugoslavia against the Axis.
Eleftherios Venizelos pursued the question of naval cooperation with Churchill in further talks, during which the British view of Greece’s naval role became clearer—that they should leave the heavy lifting to the British and view themselves as a light-armed gendarme of the Aegean. While the British fleet, with its great capital ships operating out of Argostoli and Malta, would contain the Austrians and Italians in the Adriatic, the Greeks, with small, rapid craft, would police the eastern Mediterranean and the islands.
Before Winston Churchill made history, he made news. To a great extent, the news made him too. If it was his own efforts that made him a hero, it was the media that made him a celebrity—and it bears considerable responsibility for perpetuating his memory and shaping his reputation in the years since his death.
The last verse of William Blake’s epic poem, written in 1804, reads:
I will not cease from Mental Fight,
Nor shall my Sword sleep in my hand:
Till we have built Jerusalem,
In Englands green & pleasant Land.
Based on the theme of the Book of Revelation and its description of the Second Coming, it asks whether Jesus ever visited England and thus, for a brief moment, created Heaven on earth, while also imploring its readers to build an ideal society in the present day.
Set to music by Sir Hubert Parry in 1916 as the hymn “Jerusalem”, today it is associated with a conventional, even establishment, idea of Englishness—hence being belted out at weddings, England cricket matches, and the Last Night of the Proms. Yet, for much of the twentieth century, the hymn was the great anthem of British socialists in general and the Labour Party in particular. Blake’s revolutionary call to build a new City of God (Jerusalem) was an inspiration and rallying cry for generations of activists who dreamed of a more humane, equal, and cooperative commonwealth rising out of the wreckage of capitalism and the industrial revolution (those ‘dark satanic mills’).
Arguably, no one is more closely linked to this vision than the wartime Labour leader and Prime Minister from 1945 to 1951, Clement Attlee—who insisted that “Jerusalem” be sung at his funeral. Having served as Winston Churchill’s deputy during the Second World War, Attlee led the Labour Party to a romping victory at the 1945 General Election, promising to turn Blake’s vision into a reality and build a New Jerusalem out of the rubble of war. Some of Labour’s iconography at the election literally depicted a new ‘city on a hill’ which Labour would build.
This vision of a New Britain—where the chaos of capitalism would be replaced with socialist economic planning, the fear of ill-health and unemployment with a universal welfare state, slums with new towns, Empire with Commonwealth, competition with amity—was what Labour sought to create during its six years in power after the war. While, of course, the Attlee governments never lived up to their utopian promise, many of the institutions that they put in place—such as the National Health Service—and the changes they made to Britain’s position in the world—such as Indian independence and the formation of NATO—arguably set the scene for much of the rest of Britain’s postwar history. Not for nothing did Attlee introduce his Party’s manifesto at the 1951 General Election by telling his activists that they were ‘a great crusading body armed with a fervent spirit for the reign of righteousness on earth’ and that they should continue to ‘go forward in this fight in the spirit of William Blake’.
Yet Labour were not the only radical thinkers and planners during the Second World War. Nor was the future that Attlee built for Britain the only one available. Rather, during the war, Conservatives developed their own set of radical, even utopian, ideas for the future of Britain and the postwar world. From dreams of world government to visions of workers going ‘back-to-the-land’ via a preference for a ‘warrior welfare state’ designed to properly reward those in uniform, Conservatives had their own dreams of a ‘Blue Jerusalem’—blue here a reference to the colour most usually associated with the Conservative Party. Equally, Labour activists and politicians were not alone in basing their plans on the creation of a more Christ-like polity and society—something sometimes forgotten in today’s largely secularised political parties. Conservatives too sought to use the Second World War to build their own vision of a new Christian Civilization. The most significant element of the Education Act (1944), steered through by the Conservative President of the Board of Education R. A. Butler, was not the raising of the school-leaving age or even the formalisation of Britain’s tripartite education system, but the fact that, for the first time in British history, the state compelled Christian religious education.
Conservatives were not at all happy about the kind of Britain that Labour was pledged to build in 1945—something which, contrary to much of the literature, did not emerge out of the way Britain was governed during the war but was radically different to it. Indeed, a wave of depression swept over much of the Conservative Party in 1945—the other type of ‘blue’ in Blue Jerusalem. In the thousands of letters sent to Winston Churchill after his defeat in 1945, their writers described in often acute detail how the removal of Churchill and the election of a Labour government left them ‘depressed’, ‘despairing’, and ‘grieving’ for a Britain and a British Empire that they believed the Conservative Party had built up during the war and which Labour was intent on destroying; one vision of a new society giving way to an altogether different one.