The structural integrity and safety of nuclear materials rely on accurate measurement and calculation of fracture toughness throughout their operational lifespan. Sub-sized compact tension (CT) specimens are necessary for measuring the fracture toughness of fusion power plant alloys because of the limited space available in irradiation facilities used in nuclear research. However, investigations have revealed that sub-sized specimens often underestimate crack growth resistance [1]. This discrepancy is thought to be primarily due to the relatively large plastic zone and the pronounced stress triaxiality effect near the crack tip in such specimen geometries. To address this issue, size correction methods should be applied to account for the size effect, so that consistent J-R curves are obtained from sub-sized and standard-sized specimens. The present study investigates the size effect in fracture toughness calculations for Eurofer97 through numerical analysis of both irradiated and unirradiated cases. Experiments are performed with standard and sub-sized CT specimens at room temperature and at elevated temperatures. Numerical analyses with manually imposed crack lengths are conducted to obtain J-integral values from finite element models. Size correction methodologies according to the ASTM standards are applied to the fracture toughness calculations, allowing comparison between specimens of different sizes. The results are compared with the experimental findings, and the performance of the size correction methodologies is assessed.
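For context, a typical evaluation along these lines (assuming the ASTM E1820 procedure, which the abstract does not name explicitly) splits J for a CT specimen into elastic and plastic parts:

J = J_{el} + J_{pl} = \frac{K^2 (1 - \nu^2)}{E} + \frac{\eta_{pl} A_{pl}}{B_N \, b_0}, \qquad \eta_{pl} \approx 2 + 0.522 \, b_0 / W,

where K is the elastic stress intensity factor, E and \nu are the elastic constants, A_{pl} is the plastic area under the load versus load-line displacement record, B_N is the net specimen thickness, b_0 is the initial uncracked ligament, and W is the specimen width.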