The first world war saw chemistry play a vital role – and in more than just poison gas. Mike Sutton looks back

The Great War took most Europeans by surprise, in more ways than one. When the crisis began, they believed diplomacy would prevent – or at least, contain – the conflict. As armies mobilised, all but a few pessimists predicted an early finish. And when the guns started firing nobody – including the generals – foresaw the colossal expansion of the chemical industry needed to supply them with ammunition. When peace returned, there was another surprise. The new chemical giants did not disappear. Instead, they diversified and began transforming the world.

Europeans believed themselves too civilised and sophisticated to let disputes between neighbours provoke a global catastrophe. Even if negotiations failed, a localised contest between professional armies would surely decide the issue in weeks, as had happened in the Franco–Prussian war of 1870 – or so they assumed. Governments and generals based their strategies on that assumption, and quickly found themselves desperately short of explosives. Only chemists could help.

Their responsibilities soon extended beyond feeding the hungry guns. Shortages of many other strategic materials set them challenging tasks, while military stalemate prompted the development of horrifying chemical weapons – an innovation whose consequences still trouble the world (see Chemistry World, June 2014, p46). Aware that delay in the laboratory might mean defeat on the battlefield, researchers and industrialists scrambled to find new processes and improve existing ones. To coordinate this effort, new links were forged between government, the military, industry and academia. They have been strengthening ever since.

Explosive requirements

It seems a little paradoxical that the key element in most chemical explosives is a relatively inert one – nitrogen. In gunpowder, potassium nitrate generated the oxygen needed for rapid combustion of its other ingredients (sulfur and carbon). From the medieval era until the early 19th century, the necessary nitrates were extracted from decayed organic matter found in latrines and dunghills. By the mid-19th century, however, far more powerful explosives like nitroglycerine and TNT were being prepared by using a mixture of concentrated nitric and sulfuric acids to attach NO2 groups to suitable organic molecules. The critical raw materials were now caliche – a nitrate-rich mineral from Chile – and organic substances from various sources, particularly coal tar.
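The nitration step described above can be sketched in outline (simplified overall equations; in practice the sulfuric acid served as a dehydrating agent and careful temperature control was essential):

```latex
% Simplified overall nitration equations (H2SO4 acts as the dehydrating agent)
\begin{align*}
\mathrm{C_3H_5(OH)_3 + 3\,HNO_3} &\xrightarrow{\mathrm{H_2SO_4}} \mathrm{C_3H_5(ONO_2)_3 + 3\,H_2O} && \text{(nitroglycerine)}\\
\mathrm{C_7H_8 + 3\,HNO_3} &\xrightarrow{\mathrm{H_2SO_4}} \mathrm{C_7H_5(NO_2)_3 + 3\,H_2O} && \text{(TNT)}
\end{align*}
```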

In 1914, existing stockpiles of explosives and their chemical precursors proved woefully insufficient, even for the expected short war. In mid-August, when Germany’s western advance seemed unstoppable, the Berlin War Ministry warned its Ordnance Department that ‘The consumption of munitions in the initial battles has been so great that we must employ all means possible to draw upon and make use of every potential source for munitions production.’

By Christmas 1914, lines of trenches bristling with barbed wire and machine-guns ran from the English Channel to the Alps. Infantry attacks on them were suicidal. Breaking through would require a prolonged artillery bombardment, for which neither side had sufficient ammunition. On the eastern front there was more mobility, but similar shortages prevailed – Russia’s artillery was firing more shells per day than its munitions factories produced in a month.

As hopes of a speedy victory vanished, recriminations began. In May 1915, The Times accused Britain’s Liberal government of mishandling the ‘shell crisis’. In June, the beleaguered Prime Minister, Herbert Asquith, brought Conservative critics into a new coalition administration. David Lloyd George (another Liberal, soon to replace the faltering Asquith as PM) was tasked with revolutionising Britain’s arms industry.

Explosives were far from the only problem area for the British. Mass production of uniforms consumed huge quantities of synthetic dyes, while military and naval expansion required increased supplies of high-quality optical glass for binoculars, telescopes and periscopes. Pre-war demand for both commodities had been met almost entirely by German imports. Domestic production facilities were inadequate, and the necessary skills in short supply.

Nevertheless, explosives were the critical factor in 1915. German U-boats were disrupting Britain’s supplies of strategic materials, while the British naval blockade throttled Germany’s maritime trade – and even without these interruptions, nitrate imports on the peacetime scale would have been inadequate. Chemists and engineers on both sides had to extract nitrogen from the air, forcibly unite it with oxygen, then react it with scarce organic chemicals – all on an unprecedented scale, and at breakneck speed.

Scaling up nitrogen

In Germany, a method of ‘fixing’ atmospheric nitrogen had been proposed by Fritz Haber in 1903, improved in 1909 (on advice from Walther Nernst) and made industrially feasible by Carl Bosch in 1913 (see Chemistry World, May 2013, p48). Nitrogen (distilled from liquid air) and hydrogen (made by passing steam over hot coke) were subjected to high pressure and temperature over a metal catalyst (mainly iron, with some osmium). The ammonia could be converted to nitric oxide, and then to nitric acid, via the Ostwald process, which involved oxidation over a platinum–rhodium catalyst, followed by further oxidation and hydration.
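The two processes can be summarised as follows (simplified overall equations; the conditions quoted are typical rather than exact):

```latex
% Haber–Bosch synthesis (iron catalyst, high pressure and temperature)
\begin{align*}
\mathrm{N_2 + 3\,H_2} &\rightleftharpoons \mathrm{2\,NH_3}\\
% Ostwald process: ammonia to nitric acid
\mathrm{4\,NH_3 + 5\,O_2} &\xrightarrow{\mathrm{Pt/Rh}} \mathrm{4\,NO + 6\,H_2O}\\
\mathrm{2\,NO + O_2} &\longrightarrow \mathrm{2\,NO_2}\\
\mathrm{3\,NO_2 + H_2O} &\longrightarrow \mathrm{2\,HNO_3 + NO}
\end{align*}
```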


BASF had to develop a new steel alloy to build a high-pressure reactor in its ammonia plant

But Germany’s pre-war strategic planners totally ignored the Haber–Bosch process. In August 1914, BASF’s Ludwigshafen factory was using it to make modest amounts of ammonium sulfate (an agricultural fertiliser), and had no facilities for applying the Ostwald process to produce nitrates. In September, BASF agreed to produce 5000 tons of sodium nitrate per month by April 1915.

Meeting this target required a dash into full-scale production without the interim stage of a pilot plant. Teething problems were fixed as they arose, and the lessons learned applied as larger plants sprang up to meet ever-increasing demands from the military. BASF received some government funding for this project, but the management’s patriotism was bolstered by hopes of adapting their increased capacity to profitable uses when peace returned.

Although Britain’s explosive manufacturers faced similar demands, they rejected the Haber–Bosch method, believing it would be too costly. Instead they adopted the older Frank–Caro process, heating calcium carbide under nitrogen to make calcium cyanamide, which generated ammonia when exposed to steam. This ammonia (plus more recovered from coal-gas production) was converted to nitric acid by the Ostwald method.
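The Frank–Caro route runs in two stages (simplified overall equations):

```latex
% Frank–Caro (cyanamide) process
\begin{align*}
\mathrm{CaC_2 + N_2} &\longrightarrow \mathrm{CaCN_2 + C} && \text{(high temperature)}\\
\mathrm{CaCN_2 + 3\,H_2O} &\longrightarrow \mathrm{CaCO_3 + 2\,NH_3} && \text{(steam hydrolysis)}
\end{align*}
```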

Because of the extreme conditions involved, all these processes needed careful management. Haber–Bosch reaction vessels, for example, required a special steel – developed by Krupp – and sophisticated valve systems. Up-scaling was difficult, but ways and means were rapidly found.


While the rush to increase nitrate production continued, both sides struggled with shortages of benzene derivatives, particularly the phenol and toluene needed for making picric acid and TNT. Production of high-strength sulfuric acid also had to be increased substantially.

Distillation of coal tar derived from gas and coke production was the chief source for these aromatic compounds. At the height of the war, Britain’s weekly output of TNT consumed 720 tons of toluene, extracted from 600,000 tons of coal. This was still insufficient for military demands, so TNT was mixed with the cheaper ammonium nitrate to make the explosive amatol. British chemists developed an effective mixture using only 20% TNT, but their German counterparts – though equally short of toluene – would not settle for less than 40%.
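A rough stoichiometric check (our own arithmetic, not a figure from the period) shows why toluene was the bottleneck: even with complete conversion, 720 tons of toluene caps weekly TNT output well below 2000 tons.

```python
# Theoretical maximum TNT obtainable from 720 tons of toluene per week,
# assuming complete conversion of toluene (C7H8) to TNT (C7H5N3O6).
# This is an illustrative upper bound; real yields were lower.

M_TOLUENE = 7 * 12.011 + 8 * 1.008                          # ~92.14 g/mol
M_TNT = 7 * 12.011 + 5 * 1.008 + 3 * 14.007 + 6 * 15.999   # ~227.13 g/mol

tons_toluene_per_week = 720
max_tnt = tons_toluene_per_week * M_TNT / M_TOLUENE

print(f"Theoretical maximum: {max_tnt:.0f} tons of TNT per week")
```

Diluting TNT with ammonium nitrate to make amatol stretched this limited output much further.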

Science Museum / Science & Society Picture Library

With so many men fighting at the front, women did essential jobs, such as making cordite

TNT, amatol and lyddite (picric acid) gave high-explosive shells their striking power, but were unsuitable as propellants. For that purpose, the British military favoured cordite, developed in 1889 by the government’s Explosives Committee. Cordite was produced by nitrating cellulose to make guncotton, then adding nitroglycerine and a small amount of petroleum jelly. To make the resulting mixture suitable for extrusion through dies, it had to be mixed with acetone.

In peacetime, destructive distillation of wood had produced sufficient acetone, but now far more was necessary. A fermentation process – developed by Chaim Weizmann at the University of Manchester, and applied on an industrial scale with assistance from London gin distillers – resolved this bottleneck in cordite production. Weizmann’s discovery had far-reaching consequences: he was an ardent Zionist, and British gratitude towards him encouraged official support for his dream of a Jewish homeland in Palestine.

Germany’s explosive-makers encountered other obstacles. With the blockade restricting cotton supplies, they had to develop methods for making nitrocellulose from rags and wood-pulp. Meanwhile, their nitroglycerine production was limited by lack of glycerine (a by-product of soap manufacture), until a process for making it from sugar by fermentation was up-scaled from laboratory to factory quantities. But despite all these successes, generals on both sides continued to demand yet more explosives.

Health and safety

Chemical manufacturers were also heavily involved in providing materials needed for the medical care and hygiene requirements of millions of troops in the field. Unprecedented quantities of disinfectants, antiseptics, analgesics, anaesthetics and medicines helped make this the first war in which more soldiers died from enemy action than from disease. Nevertheless, all these resources did not stop the influenza pandemic of 1918–19 from killing almost three times the number of the war dead.

Germany had dominated the pre-war pharmaceutical trade, thanks to research-based corporations like Bayer, creators of the ubiquitous aspirin tablet. In contrast, British and French firms had limited capacity, and many of the drugs they made were covered by German patents. The need for standard medications increased dramatically – France’s annual production of quinine increased from 800kg in 1913 to 90,000kg in 1917. Even so, supplies were inadequate during the influenza pandemic.

Imperial College London

Martha Whiteley’s research group made vital drugs

In Britain, the limitations of the existing pharmaceutical industry drove the government to involve university and college chemistry departments in manufacturing established medicines and developing new ones. In 1915, a group working at the University of Edinburgh introduced one of the most widely used wartime antiseptics – a mixture of calcium hypochlorite and boric acid which, when dissolved in water, released hypochlorous acid. 

Since many qualified men were now in uniform, women did much of the necessary benchwork. Martha Whiteley, for example, headed an all-woman team at Imperial College London which produced important drugs like the local anaesthetic beta-eucaine. The number of female chemistry graduates had increased significantly since 1900, and during the war British universities ran accelerated courses to train more. Thousands of women also performed manual tasks in the chemical industry, and many lost their lives in disasters like the 1917 Silvertown explosion, or became ill through handling toxic substances. Their heroism – and the efforts of women in other essential wartime jobs – made it impossible for any post-war government to go on denying them the vote.

Peace dividend

The war’s sudden end was almost as great a surprise as its beginning. Military leaders had expected it to continue for many more months, or even years. In a way, it did: the prolonged and acrimonious negotiations which followed the armistice sowed the seeds of future conflicts. At this point, one of the many problems confronting the victorious Allies was what to do with Germany’s massive (and heavily militarised) chemical industry.

Science Museum / Science & Society Picture Library

Women became a more common sight in labs during the war

A disarmed and neutral nation did not need chemical plants capable of producing vast amounts of explosives. Yet if Germany’s most advanced and successful export industry was entirely dismantled, how could the war-debt imposed by the Versailles Treaty be paid off? While government officials pondered the issue, British, French and American chemical companies quickly claimed patents held by their German rivals as legitimate war booty.

Allied inspectors examined German chemical plants – ostensibly to assess how far their productive capacity should be reduced, but often with the covert aim of acquiring useful technical data. Companies like BASF responded by halting their more sensitive processes and sequestering key technicians to prevent them being interrogated. A recent study concludes that the Germans eventually won this prolonged chess-game, because they ‘co-ordinated their responses to the Allies, exploiting weaknesses in the inspection system, playing off controllers against each other, and taking advantage of the conflicting interests of the Allies’.

And so, Germany’s chemical industry – or at least, the ‘peaceful’ part of it – survived. Meanwhile, British competitors had learned enough to reverse their earlier opinion of the Haber–Bosch process. The Brunner-Mond Corporation established a large plant on Teesside, but was unable to work it effectively without the specialised knowledge that BASF had carefully concealed from post-war investigators. The problem was only solved with the help of former BASF employees from Alsace – the province annexed by Germany in 1870, and reclaimed by France in 1918.

Chemical conglomerates

In all the former belligerent nations, adjustment to a peacetime economy was difficult for millions of workers. But despite the inevitable run-down of the explosives sector, many chemical plants in Britain, France, Germany and the US were not totally decommissioned. Instead, they were adapted to produce ever-increasing amounts of fertilisers, pharmaceuticals, paints, plastics, textiles and other materials for civilian consumption. Strategic mergers facilitated this expansion and diversification.

In 1925, six large German chemical firms – including BASF, Hoechst and Bayer – united to form IG Farben, for years the world’s largest chemical company. Building on pre-war successes with dyestuffs and pharmaceuticals, it soon dominated many other fields. One of its strategically significant achievements was developing effective methods for synthesising rubber and liquid fuels from coal-derived raw materials. This enabled Germany to survive another economic blockade after 1939.

The strategic importance of the chemical industry was never forgotten

A similar consolidation created Britain’s colossus – Imperial Chemical Industries – in 1926. ICI produced fertilisers, insecticides, dyestuffs and paints on a massive scale. In the 1930s, it developed important new products, like perspex (vital for aircraft cockpit canopies) and polythene (an essential insulator for radar equipment).

At the start of the war, the American giant DuPont had already moved into areas beyond explosives manufacture, and was a major stakeholder in General Motors. Thereafter it expanded rapidly, supplying Britain, France and the US with essential war materials (at a considerable profit). After 1918, DuPont acquired footholds in other high-tech industries. In the 1930s it introduced nylon, and in the 1940s it played a key role in making the first nuclear bomb.

These large organisations invested heavily in research, often with encouragement and support from governments. The strategic importance of the chemical industry – and of the academic communities that might create new possibilities for it – was never forgotten. Consequently, in 1939 the major combatants were far better prepared on the chemical front than they had been in 1914.

Mike Sutton is a visiting fellow in the department of humanities at Northumbria University, UK