We model the effects of repeated supernova explosions from starbursts in the centers of dwarf galaxies on the interstellar medium of these galaxies, taking into account the gravitational potential of a dominant dark matter halo. We explore supernova rates from one every 30,000 yr to one every 3 million yr, equivalent to steady mechanical luminosities of L = 0.1-10 x 10^38 erg s^-1, occurring in dwarf galaxies with gas masses M_g = 10^6-10^9 M⊙. We address in detail, both analytically and numerically, the following three questions: (1) When do the supernova ejecta blow out of the galaxy, and when is the entire interstellar medium blown away? (2) What fraction of the gas escapes the galaxy if blowout occurs? (3) What happens to the metals ejected by the massive stars of the starburst? We give quantitative results for when blowout will or will not occur in galaxies with 10^6 ≤ M_g ≤ 10^9 M⊙. Surprisingly, we find that the mass ejection efficiency of such outflows is very low for galaxies with M_g ≥ 10^7 M⊙. Only galaxies with M_g ≲ 10^6 M⊙ have their interstellar gas blown away, and then virtually independently of L. On the other hand, metals from the supernova ejecta are accelerated to velocities above the escape speed of the galaxy far more easily than the gas. We find that for L_38 = 1, about 97% of the metals are retained by a 10^9 M⊙ galaxy, but this fraction falls to 40% for M_g = 10^8 M⊙ and to 0.27% for M_g = 10^7 M⊙. We discuss the implications of our results for the evolution, metallicity, and observational properties of dwarf galaxies.
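The correspondence between the quoted supernova rates and the mechanical luminosity range follows if one assumes the canonical ~10^51 erg of energy released per supernova (an assumption supplied here for illustration, not stated in the abstract itself). A minimal sketch of the conversion:

```python
# Convert a supernova rate into a steady mechanical luminosity,
# assuming the canonical 1e51 erg released per supernova.

E_SN = 1e51          # erg per supernova (canonical assumed value)
YEAR = 3.156e7       # seconds per year

def mechanical_luminosity(interval_yr):
    """Steady luminosity (erg/s) for one supernova every `interval_yr` years."""
    return E_SN / (interval_yr * YEAR)

# One SN every 30,000 yr  -> ~1e39 erg/s = 10  x 10^38 erg/s
L_high = mechanical_luminosity(3.0e4)
# One SN every 3 million yr -> ~1e37 erg/s = 0.1 x 10^38 erg/s
L_low = mechanical_luminosity(3.0e6)
```

The two values bracket the L = 0.1-10 x 10^38 erg s^-1 range quoted above.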