How do water companies ensure that the water that we drink is wholesome and our waste water is clean enough to be released into rivers and seas? Martin Kimber explains all.

The development of water and waste water treatment in the UK was kick-started in the summer of 1858. It was the hottest year on record and the heat combined with the filthy condition of the River Thames led to much of the river in London becoming anaerobic. This gave rise to the notorious ’Great Stink’, which was reported to have sent Disraeli fleeing from Parliament, handkerchief to nose. The foul odour arose from the lack of oxygen leading to strongly reducing (anaerobic) conditions in the river and the production of hydrogen sulphide and other objectionable gases. Since then things have only got better.

Not so many centuries ago water was seen as one of the basic elements (along with air, fire and earth). More recently, if water appeared clean then it was considered suitable for drinking and there were very few environmental concerns relating to the use of water for waste disposal. By the early 19th century urbanisation and industrialisation led to the gross pollution of rivers serving the large cities. Waterborne disease became commonplace, and there were increasing problems with providing clean drinking water. Apart from the Great Stink, the other event credited with changing attitudes towards water was the Broad Street cholera outbreak of 1854. Here the physician John Snow demonstrated that the source of an outbreak of cholera in London was water from a well contaminated by nearby sewage pits.

By the mid-19th century, the law required that Thames water supplied to London had to be extracted from the non-tidal river, upstream of London, where the water was cleaner, and that it had to be filtered. Around the same time, Joseph Bazalgette became chief engineer of the Metropolitan Board of Works and began constructing trunk sewers to convey London’s sewage to the lower Thames estuary, where it was stored and discharged on the outgoing tide.

Until surprisingly recently, there were essentially no set water quality standards for drinking water; the requirement was simply that the water be ’wholesome’. By the mid-20th century there was guidance on the microbiological quality required, using coliform bacteria and E. coli as indicator organisms; the presence of these bacteria was taken to indicate inadequate treatment or possible faecal contamination. In the 1950s the World Health Organisation started to set standards for chemicals found in water. These led to the first EC Drinking Water Directive in 1980, the requirements of which were incorporated into British law in 1988. This was the first time in the UK that ’wholesomeness’ was defined by numerical standards.

In 1998 the EC Directive was revised, leading to some upcoming changes in water quality standards.

Treatment process 
The development of water treatment has reflected changes both in our understanding of the chemical and physical aspects of water and in the definition of wholesome. The earliest form of water treatment was slow sand filtration. This used sand filter beds through which water was passed at a slow rate; treatment was effected partly by physical straining but more importantly by biological action in the layer of biological matter that formed on top of the sand. Such filters have to be cleaned every few weeks by scraping off the top layer. Slow sand filtration is effective but does not disinfect the water. Also, it cannot treat water containing significant concentrations of solids, because the filter rapidly becomes clogged with the material removed.

By the early to mid-20th century water treatment had developed such that many relatively contaminated lowland rivers could safely be used as a source for drinking water. The treatment process developed is still the basis of most new water treatment plants (Fig 1). Its main processes are:

  • coagulation - to destabilise colloidal matter in the water;
  • flocculation - to encourage the formation of large ’easily-settleable’ particles;
  • settlement or, as a recent alternative, flotation;
  • rapid gravity filtration - using relatively highly loaded sand filters that physically remove solids and are cleaned daily by backwashing with air and water;
  • disinfection - normally using chlorine gas or sodium hypochlorite solution. Until the 1990s pre-chlorination of raw water was very common but this leads to the formation of trihalomethanes in some waters.               

Fig 1. The main processes of water treatment

For many years this process produced water that was considered wholesome. However, it essentially removes only particulate matter, and it could not meet some of the numerical standards introduced by the first EC Drinking Water Directive, most notably for pesticides and related products, but also for other parameters such as trihalomethanes. This led in the 1990s to the widespread introduction of so-called ’advanced’ processes: adsorption using granular activated carbon (GAC); and ozonation. The key standard that drove this was the ’pesticide’ standard of 0.1 µg l-1 for any ’pesticide’ and 0.5 µg l-1 as the total for all ’pesticides’.

During the 1990s the herbicides simazine and atrazine were widely used by local councils and by the railways for weed control. These were persistent and were found above the standard in virtually all lowland surface water sources and many groundwater sources. Other ’pesticides’ were also found but not to the same extent. The combination of adsorption and ozonation meant that virtually all waters could be treated to comply with the ’pesticide’ standard. By the mid-1990s the use of ozonation and carbon adsorption on lowland river sources was normal practice. 
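The two-tier standard lends itself to a simple compliance check. A minimal sketch in Python, with sample concentrations invented purely for illustration:

```python
# Simple compliance check against the Directive's two-tier 'pesticide'
# standard: 0.1 ug/l for any individual pesticide and 0.5 ug/l for the
# total. The sample concentrations below are invented for illustration.

def complies(conc_ug_per_l, individual_limit=0.1, total_limit=0.5):
    """True if every individual pesticide and their sum meet the standard."""
    return (max(conc_ug_per_l.values()) <= individual_limit
            and sum(conc_ug_per_l.values()) <= total_limit)

sample = {"atrazine": 0.08, "simazine": 0.05, "mecoprop": 0.02}
print(complies(sample))             # True: each below 0.1, total 0.15 below 0.5
print(complies({"atrazine": 0.2}))  # False: individual limit exceeded
```

Note that the total limit bites even when every individual pesticide passes, which is why the sum is checked separately.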

In 1998 the revised Drinking Water Directive entered into force. This was based more on scientific considerations of acceptable average daily intakes of chemicals of concern, and the proportion of these that could be permitted to be supplied by drinking water (Table 1). It introduced some interesting new standards, notably a much tighter standard for lead, reducing in two stages from the current UK level of 50 µg l-1 to 10 µg l-1 in 2013, a new standard for bromate (BrO3-) of 10 µg l-1, and a standard for arsenic of 10 µg l-1.
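The intake-based approach can be made concrete: a guideline value is back-calculated from the tolerable daily intake (TDI), the fraction of that intake allocated to drinking water, and assumptions about body weight and daily consumption. A minimal sketch of the WHO-style arithmetic, with all figures chosen for illustration rather than taken from Table 1:

```python
# WHO-style derivation behind such standards: a guideline value is
# back-calculated from the tolerable daily intake (TDI), the fraction of
# the TDI allocated to drinking water, and assumed body weight and daily
# water consumption. All numbers below are illustrative.

def guideline_ug_per_l(tdi_ug_per_kg_day, allocation,
                       body_weight_kg=60.0, consumption_l_per_day=2.0):
    """Drinking water guideline value in ug/l."""
    return tdi_ug_per_kg_day * body_weight_kg * allocation / consumption_l_per_day

# A TDI of 3.5 ug per kg body weight per day, with 10% allocated to water:
print(guideline_ug_per_l(3.5, 0.10))  # 10.5 ug/l
```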

Table 1. Tolerable daily intakes of selected chemicals

Sewage treatment

Modern sewage treatment, combining physical and biological processes, can produce a high quality effluent that in the past would have been considered suitable for discharge to most rivers. However, times move on and the process now often requires tweaking to meet modern concerns over nutrient inputs to inland waters. It appears that future treatment improvements will be required to remove the compounds that are causing feminisation of certain male fish through endocrine disruption and which are increasingly associated with discharges of treated waste water.

Endocrine disrupters include oestrogens and a wide range of other chemicals (Chem. Br., January 2003, p30). Recent research suggests that, for some chemicals, concentrations of the order of 1 ng l-1 can affect fish. Whilst there are environmental standards for some of these chemicals, such as nonylphenol, with a maximum allowable concentration in freshwater of 3.5 µg l-1, there are currently no standards for most of them. Also, existing standards might not provide adequate protection of the aquatic environment, because endocrine disruption may be occurring at concentrations below these standards. However, the potential expense of treating such chemicals to very low concentrations may mean that, rather than providing treatment, society will place restrictions on their use to prevent them entering the aquatic environment at harmful concentrations.

Sewage contains ammoniacal nitrogen - a per capita contribution of around 6 g per day as nitrogen. Biological treatment can easily oxidise this to nitrate, removing the oxygen demand that the ammonia would otherwise exert in the receiving water as it is oxidised. However, nitrate is one of two key nutrients that encourage algal growth in water; the other is phosphorus. Nitrate is removed by adding an anoxic biological reactor, in which the biomass uses nitrate as an oxygen source. Denitrification has been an increasingly common requirement over the past two or three decades (Chem. Br., March 2002, p46). Phosphorus removal is only now becoming widespread.
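The scale of this load is easy to sketch from the per capita figure and the stoichiometry of nitrification (2 mol O2 per mol N oxidised to nitrate). The population size below is an arbitrary example:

```python
# Rough scale of the nitrification load, from the article's figure of
# ~6 g of ammoniacal nitrogen per person per day. Oxidising 1 g of N to
# nitrate consumes 2 mol O2 per mol N, i.e. ~4.57 g O2 per g N.
# The population is an arbitrary example.

N_PER_CAPITA_G_DAY = 6.0
O2_PER_G_N = 2 * 32.0 / 14.0   # ~4.57 g O2 per g N

population = 100_000
n_load_kg_day = population * N_PER_CAPITA_G_DAY / 1000.0
o2_demand_kg_day = n_load_kg_day * O2_PER_G_N

print(round(n_load_kg_day))     # 600 kg N per day
print(round(o2_demand_kg_day))  # 2743 kg O2 per day
```

This oxygen has to be transferred into the aeration tanks mechanically, which is why nitrification dominates the energy bill of conventional works.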

Phosphate is easy to remove chemically; dosing of an aluminium or ferric salt will precipitate most of the phosphorus as an insoluble metal phosphate. The problem with this from the waste water treatment plant operator’s point of view is that it is relatively expensive and it produces a chemical sludge. Thus biological removal is currently being developed and introduced. The trick is to enhance phosphorus uptake in the biological treatment and to remove it in the biomass, taking care not to lose it during sludge treatment. 

Cost is becoming ever more important in sewage treatment. Conventional treatment is energy intensive and produces large quantities of sludge that is expensive to treat. Both these problems relate to the use of aerobic treatment. This requires a high energy input to introduce oxygen into the reactor, and aerobic organisms produce a much greater weight of biomass per unit of pollution removed than anaerobic organisms. Anaerobic organisms not only do not need oxygen, they positively abhor it, and they produce methane as a useful byproduct. There would therefore appear to be potential cost advantages in moving to anaerobic treatment. The drawbacks include slow biomass growth rates (making loss of biomass a problem), and the process proceeds very slowly at low temperatures. Anaerobic treatment has been very successful in treating high strength industrial wastes and is becoming more widespread in hot climates, but it has not yet been used as a mainstream process in the UK on domestic sewage. Watch this space. 

Some key dates in water and sewage treatment

Approx. date   Sewage treatment                                        Potable water treatment
1500 BC        -                                                       Use of alum for water clarification in ancient Egypt
1850           Chemical precipitation and land treatment               Slow sand filters coming into widespread use
1880           -                                                       Louis Pasteur develops germ theory; use of coagulation and rapid sand filters begins
1900           Biological filters becoming common (attached growth)    Chlorination of water begins
1915           Activated sludge process developed (suspended growth)   Use of ozone for water treatment disinfection in Europe
1950           Rotating biological contactor developed                 -
1990           Nutrient removal becoming common                        Use of ozonation and GAC adsorption common in UK

Source: Chemistry in Britain


Martin Kimber recently co-wrote Basic water treatment with Chris Binnie and George Smethurst. This third edition of Basic water treatment (ISBN 0 854 04989 4) is published by the RSC and Thomas Telford and can be obtained from the RSC Online Shop, price £29.95 (£19.25 for RSC members).

1. Coagulants and flocculants

Small particles, including soil, algae, bacteria, viruses and colour-causing humic and fulvic acids, can be present in water. These particles, typically between 10 nm and 10 µm in size, are characterised as colloids. Typically, colloidal systems in natural water are stable two-phase systems, consisting of fine solid particles dispersed in the liquid phase. The majority of colloidal particles in water carry a slight negative surface charge - which makes the particles repel each other - and they are stabilised by a cloud of positive counter-ions.

Water treatment companies clarify water by adding a coagulating agent, which ensures removal of ca 90 per cent of the suspended solids. Typically, coagulating agents such as aluminium sulphate [Al2(SO4)3] and ferric sulphate [Fe2(SO4)3] are added. The coagulating agents contain polyvalent cations (eg Al3+ or Fe3+) which, when added to the natural waters, form Al or Fe hydroxide precipitates.

Coagulation occurs because the colloidal system is destabilised by the presence of the polyvalent ions. These aggregate with the destabilised matter and form a sludge containing the precipitated hydroxides and natural organic matter. The floc can be either lifted by very small air bubbles (air flotation) to form a sludge at the top of flotation tanks, or can be left to settle as sediment sludge in clarification tanks. 

Other coagulants can also be used, including ferrous sulphate, sodium aluminate, activated silica, or cationic polymers, but their suitability depends on the pH of the natural water, which affects the stability of the colloidal system and the formation of hydroxides, and the amount and composition of the suspended solid that the water contains. 
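The hydroxide formation can be put on a stoichiometric footing. A sketch assuming anhydrous aluminium sulphate and complete hydrolysis; the dose is an arbitrary example:

```python
# Stoichiometric sketch of floc formation from an alum dose:
# Al2(SO4)3 + 6H2O -> 2Al(OH)3 + 3H2SO4 (simplified; in practice the acid
# is neutralised by the water's alkalinity). Anhydrous salt assumed;
# hydrated commercial alum would yield less hydroxide per mg of product.

M_ALUM = 342.15   # g/mol, anhydrous Al2(SO4)3
M_ALOH3 = 78.00   # g/mol, Al(OH)3

def hydroxide_mg_per_l(alum_dose_mg_per_l):
    """Mass of Al(OH)3 precipitate formed per litre, assuming full hydrolysis."""
    return alum_dose_mg_per_l / M_ALUM * 2 * M_ALOH3

print(round(hydroxide_mg_per_l(50.0), 1))  # ~22.8 mg/l of hydroxide floc
```

The precipitate mass is in addition to the colloidal matter swept out with it, which is why coagulant dosing adds appreciably to sludge production.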

Victoria Ashton  

2. Permeable barriers  

Until recently, cleaning up groundwater has usually involved pumping contaminated groundwater up from the aquifer, treating it above-ground, and then either re-injecting it back into the aquifer or discharging it elsewhere. These methods require continuous energy for pumping water and operating treatments, making them expensive to run, and they often leave residual contaminants at undesirably high levels in the subsurface. 

Now a relatively new technology may offer a cheaper, more effective and more sustainable option. Permeable reactive barriers (PRBs) emerged during the 1990s as a way to treat contaminated groundwater below ground. Canada, led by the University of Waterloo, has been at the forefront of developing this technology which is taking off around the world. There are now about 80 pilot and field-scale PRBs in the US, 20 in Europe and a couple in Australia and Japan, according to Jonathan Smith of the UK Environment Agency’s National Groundwater & Contaminated Land Centre. 

A PRB prevents or reduces contamination whilst allowing groundwater to flow through. It does this either by immobilising the contaminants or by transforming them into less harmful substances, either chemically or biologically. PRBs can be used to treat any contaminated fluid but are most commonly used to remediate contaminated groundwater in aquifers. Smith stresses that PRBs are best applied to point sources of pollution that are creating a plume of polluted groundwater, rather than to widespread pollution. 

PRBs can treat a range of pollutants: chlorinated solvents such as tetrachloroethene, trichloroethene and dichloroethene; metals such as chromium and arsenic; nitrates; and radionuclides such as uranium. ’The type of PRB depends on the type of pollutant’, explains Smith: ’for example, chlorinated ethenes are degraded in the presence of zero valent iron by an abiotic reduction process; or granular activated carbon to adsorb organic pollutants onto its surface’. Other reactive materials in use include organic materials such as wood chip and compost that enhance biological degradation processes, modified clays and zeolites, and other chemical oxidants and reducing agents.

The most common designs are ’funnel and gate’, where impermeable walls, typically slurry walls, direct contaminated groundwater to gates containing the reactive material, and ’continuous’, where the barrier presents an unbroken permeable wall containing the reactive materials. Most PRBs are placed around 10-20 m deep, although a small number operate at depths of over 40 m.

PRBs have got lots of pluses on their side. They rely on ’passive’ processes so they’re considered an environmentally sustainable technique. They should also have relatively low maintenance and operating costs, although long-term monitoring is necessary. On the down-side, it may take decades to achieve remediation and it may prove difficult to build below-ground structures especially if groundwater is running deep or geological conditions are disadvantageous. 

Nevertheless, the Environment Agency is taking PRBs very seriously, publishing guidelines on PRBs at the beginning of this year. Smith explains that the document provides guidance to industry on how to design, construct, operate, monitor and decommission PRBs; and to EA staff on how to assess a PRB for a regulatory permit. 

The status of PRBs within the UK has been significantly raised by the endeavours of a network of academics and industrialists funded by the Engineering and Physical Sciences Research Council. Its main objective was to educate all interested parties in a technology that the UK was failing to exploit, according to network coordinator Robert Kalin of Queen’s University, Belfast. The network, which has 22 partners, has worked closely with the EA to spread the word. 

Maria Burke 

3. Ultrapure water

UK industry and laboratories use around 1500 million t of mains water per annum. Much of this water has to be purified to a greater or lesser extent in order to minimise problems of scale and corrosion. The highest level of purity obtainable is called ’ultrapure water’, which is many times purer than even the most refined analytical chemicals. 

Ultrapure water is used in a range of applications including:

  • analytical laboratories - HPLC, ICP-MS, trace inorganic and organic analysis; 
  • research laboratories - molecular biology, PCR, electrophoresis, DNA sequencing, IVF; 
  • pharmaceuticals - manufacturing injectable products; 
  • semiconductor device manufacture; and 
  • supercritical boiler feedwater.               

However, most of the water used by industry does not need to meet ultrapure water standards. Instead ’primary grade water’ is used. This is produced by a variety of processes including ion exchange (softening and deionisation), electrodialysis and reverse osmosis. Reverse osmosis uses a semi-permeable membrane - a hydrophilic material with a pore size of typically about 0.5nm, which allows water molecules to pass through but rejects about 98 per cent of the dissolved ionic and organic impurities. The membrane also acts as a filter to remove particles and bacteria. The process has gained almost universal acceptance as a result of recent developments in low pressure membranes that have dramatically reduced the operating costs of the process. 
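The quoted 98 per cent rejection translates directly into permeate quality, and shows why a second pass helps: each pass multiplies the concentration by (1 - rejection). The feed concentration below is an invented example:

```python
# What 98% rejection means for reverse osmosis permeate quality:
# each pass multiplies the dissolved-solids concentration by
# (1 - rejection). The feed concentration is an invented example.

def permeate_tds(feed_mg_per_l, rejection=0.98, passes=1):
    """Dissolved solids remaining after the given number of RO passes."""
    return feed_mg_per_l * (1.0 - rejection) ** passes

print(round(permeate_tds(500.0), 2))            # 10.0 mg/l after one pass
print(round(permeate_tds(500.0, passes=2), 2))  # 0.2 mg/l after two passes
```

This is why reverse osmosis alone yields primary grade water but still needs downstream polishing to approach ultrapure specifications.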

Ultrapure water is generally produced by ’polishing’ primary grade water. Once again, ion exchange and membrane processes are used and are combined with advanced techniques like adsorption and photo-oxidation. In the laboratory, these processes are built into bench-top purification units that produce a few hundred litres per hour, whilst full scale production plants are constructed to generate several hundred tonnes per hour of similar quality water. 

Martyn Fisher, ELGA LabWater, Vivendi Water Systems  

Typical water quality requirements

                                   Potable   Primary   Ultrapure
Conductivity (µS cm-1)               500        50       0.055
Total organic carbon (mg l-1 C)       -          -          -
TVC (cfu ml-1)                       100       100          1
Endotoxin (EU ml-1)                   -          -          -

TVC = total viable (bacteria) count; cfu = colony forming units; EU = endotoxin units
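The 0.055 µS cm-1 conductivity figure for ultrapure water is the theoretical limit set by water's own self-ionisation, and can be reproduced from the limiting molar conductivities of H+ and OH- at 25 °C:

```python
# Where ultrapure water's 0.055 uS/cm comes from: pure water self-ionises
# to 1e-7 mol/l each of H+ and OH-, and the conductivity follows from the
# ions' limiting molar conductivities at 25 C (349.8 and 198.6 S cm2/mol).

LAMBDA_H_PLUS = 349.8          # S cm2 mol-1
LAMBDA_OH_MINUS = 198.6        # S cm2 mol-1
C_MOL_PER_CM3 = 1e-7 / 1000.0  # 1e-7 mol/l expressed per cm3

kappa_S_per_cm = (LAMBDA_H_PLUS + LAMBDA_OH_MINUS) * C_MOL_PER_CM3
kappa_uS_cm = kappa_S_per_cm * 1e6

print(round(kappa_uS_cm, 3))  # 0.055 uS/cm
```

No real plant can do better than this figure, so measured conductivity approaching 0.055 µS cm-1 is itself the standard test that polishing is working.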

Contact and Further Information

Martin Kimber
Senior process engineer
Atkins Water, Woodcote Grove, Ashley Road, Epsom, Surrey KT18 5BW