In recent years, there has been growing interest in applying the Leak-Before-Break (LBB) concept in the refining industry. LBB methodology has been applied in the nuclear industry for decades, but generally to eliminate hardware for dynamic rupture control at the design stage. In the refining industry, however, the purpose of an LBB application can be quite different from that in the nuclear industry: the goal is to show that in-service leakage can be detected well ahead of the potential for rupture, allowing ample time for safe replacement of the piping.
This paper explores the conditions under which LBB can be applied to refinery piping. Initially, the analysis was conducted using a finite element model of a typical pipe system with its design boundary conditions under operating loads, i.e., gravity, pressure, thermal, and hanger loadings. The results for various circumferential crack sizes show that the pipe system behaves in a displacement-controlled manner (LBB is easier to satisfy), mainly because of the higher secondary stresses, i.e., thermal loadings. However, the pipe system behaved in a load-controlled manner (LBB is harder to satisfy) when some of the boundary conditions were changed to simulate a possible support and/or hanger failure. This paper investigates how boundary conditions can change displacement-controlled LBB behavior to load-controlled behavior for a representative pipe system, and the implications for leak-rate detection capability. The effect of reduced material toughness due to high-temperature hydrogen attack (HTHA) damage was also included in these analyses. The procedure outlined here can be applied to a piping system to identify the supports that are critical for inspection in order to demonstrate LBB, and to estimate the anticipated leak rate before the flaw reaches critical size.