Received: 17 Feb 2012 – Discussion started: 02 Apr 2012
Abstract. Corrosion in water supply networks is an unwanted process that causes pipe material loss and subsequent pipe failures. Nowadays, pipe replacement strategies are most often based on pipe age, which is not always the most important factor determining pipe burst rates. In this study, a methodology for developing a mathematical model to predict the decrease of pipe wall thickness in a large cast iron network is presented. Water quality, temperature and the water flow regime were the main factors taken into account in the corrosion model. The effects of water quality and flow rate were determined by measuring the corrosion rate of metal coupons over a period of one year under different flow regimes. The obtained constants were then introduced into a calibrated hydraulic model (Epanet), and the corrosion model was validated by measuring the decrease of wall thickness in pipe samples removed during a regular pipe replacement event. The validated model was run for 30 yr to simulate the water distribution system of Riga (Latvia). The corrosion rate in the first year was 8.0–9.5 times greater than in all forthcoming years, with an average long-term decrease of pipe wall thickness of 0.013/0.016 mm per year. The optimal iron pipe exploitation period was concluded to be 30–35 yr (for a pipe wall thickness of 5.50 mm and a metal density of 7.5 m3 t−1). The initial combined corrosion model and measurement error was 33%; after validation of the model, the error was reduced to below 15%.
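The arithmetic behind the reported rates can be illustrated with a minimal sketch, assuming a simple linear loss model: an accelerated first year (8.0–9.5 times the long-term rate) followed by a constant long-term rate of 0.013–0.016 mm per year. The function name, the lower-bound parameter values, and the uniform-loss assumption are illustrative choices, not the paper's actual model; note that uniform loss alone accounts for only a small fraction of a 5.50 mm wall over 30 yr, so the 30–35 yr exploitation period evidently rests on additional criteria developed in the paper body.

```python
# Hedged sketch (not the paper's model): cumulative uniform wall loss under
# the rates reported in the abstract, taking lower-bound values.

FIRST_YEAR_FACTOR = 8.0   # first-year rate multiplier (abstract: 8.0-9.5x)
LONG_TERM_RATE = 0.013    # mm per year, long-term (abstract: 0.013/0.016)
WALL_THICKNESS = 5.50     # mm, pipe wall thickness from the abstract

def cumulative_loss(years: int,
                    long_term_rate: float = LONG_TERM_RATE,
                    first_year_factor: float = FIRST_YEAR_FACTOR) -> float:
    """Total wall loss in mm after `years`, with an accelerated first year."""
    if years <= 0:
        return 0.0
    first_year_loss = first_year_factor * long_term_rate
    return first_year_loss + (years - 1) * long_term_rate

loss_30 = cumulative_loss(30)
print(f"Uniform wall loss after 30 yr: {loss_30:.3f} mm "
      f"({loss_30 / WALL_THICKNESS:.1%} of a 5.50 mm wall)")
```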
How to cite. Bernats, M., Osterhus, S. W., Dzelzitis, K., and Juhna, T.: Development of a iron pipe corrosion simulation model for a water supply network, Drink. Water Eng. Sci. Discuss., 5, 85–120, https://doi.org/10.5194/dwesd-5-85-2012, 2012.