Florian Kirchschlager¹, M. J. Barlow¹, Franziska D. Schmidt¹
The Astrophysical Journal 893, 70. DOI: https://doi.org/10.3847/1538-4357/ab7db8
¹Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT, UK; f.kirchschlager@ucl.ac.uk
Core-collapse supernovae can condense large masses of dust in their ejecta post-explosion. However, sputtering and grain–grain collisions during the subsequent passage of the dust through the reverse shock can destroy a significant fraction of the newly formed dust before it reaches the interstellar medium. Here we show that in oxygen-rich supernova remnants like Cassiopeia A, the penetration and trapping within silicate grains of the same impinging oxygen, silicon, and magnesium ions that are responsible for grain surface sputtering can significantly reduce the net loss of grain material. We model conditions representative of dusty clumps (density contrast χ = 100) passing through the reverse shock in the oxygen-rich Cassiopeia A remnant and find that, as well as facilitating the formation of grains larger than those that had originally condensed, ion trapping increases the surviving silicate dust mass by factors of up to two to four compared to cases where the effect is neglected, depending on the initial grain radii. For higher density contrasts (χ ≳ 180), the effect of gas accretion onto the grain surfaces surpasses that of ion trapping, and the survival rate increases to ~55% of the initial dust mass for χ = 256.
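The competition described above between sputtering losses and trapped-ion mass gain can be illustrated with a simple mass-balance sketch. The toy model below is not the authors' code: all parameter values (grain radius, gas density, relative velocity, sputtering yield, trapping fraction) are placeholder assumptions chosen only to show qualitatively how a sufficiently large trapping fraction turns net erosion into net growth; the published model additionally treats clump hydrodynamics, grain size distributions, grain charge, and grain–grain collisions.

```python
"""Toy mass balance for a silicate grain overrun by the reverse shock.

Illustrative only: every numerical value below is an assumed placeholder,
not taken from Kirchschlager, Barlow & Schmidt (2020).
"""

import numpy as np

# --- assumed, order-of-magnitude parameters (hypothetical) ----------------
A_GRAIN_NM  = 10.0              # initial grain radius [nm]
RHO_GRAIN   = 3300.0            # silicate bulk density [kg m^-3]
N_GAS       = 1.0e8             # post-shock gas number density [m^-3]
V_REL       = 1.0e6             # grain-gas relative velocity [m s^-1]
M_ION       = 16 * 1.66e-27     # impinging ion mass (oxygen) [kg]
Y_SPUTTER   = 0.1               # sputtered atoms per incident ion (assumed)
M_SPUTTERED = 20 * 1.66e-27     # mean mass of a sputtered atom [kg]


def net_mass_rate(radius_m, f_trap):
    """Net dm/dt for a grain of given radius and ion-trapping fraction.

    Ions hitting the geometric cross section pi*a^2 either erode the grain
    (yield Y_SPUTTER) or are implanted and retained with probability f_trap,
    in which case the ion's mass is added to the grain.
    """
    flux = N_GAS * V_REL * np.pi * radius_m**2      # ions per second
    gain = flux * f_trap * M_ION                    # trapped-ion mass gain
    loss = flux * Y_SPUTTER * M_SPUTTERED           # sputtering mass loss
    return gain - loss


def evolve(f_trap, t_end=3.15e7, n_steps=10_000):
    """Integrate the grain radius with a simple explicit Euler scheme."""
    a = A_GRAIN_NM * 1.0e-9
    dt = t_end / n_steps
    for _ in range(n_steps):
        m = (4.0 / 3.0) * np.pi * a**3 * RHO_GRAIN
        m = max(m + net_mass_rate(a, f_trap) * dt, 0.0)
        a = (3.0 * m / (4.0 * np.pi * RHO_GRAIN)) ** (1.0 / 3.0)
        if a <= 0.0:
            break
    return a * 1.0e9  # final radius in nm


if __name__ == "__main__":
    for f_trap in (0.0, 0.3, 0.6):
        print(f"trapping fraction {f_trap:.1f}: "
              f"final radius ~{evolve(f_trap):.2f} nm "
              f"(initial {A_GRAIN_NM} nm)")
```

With these assumed numbers, a trapping fraction of zero yields steady erosion, while a moderate trapping fraction reverses the sign of dm/dt and the grain grows beyond its initial radius, mirroring the qualitative result that ion trapping both reduces net dust destruction and allows grains to end up larger than those that originally condensed.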