Extracting astrophysical information from gravitational-wave detections is a well-posed and thoroughly studied problem when detailed waveform models are available. However, one motivation for the field of gravitational-wave astronomy is the potential for new discoveries. Recognizing and characterizing unanticipated signals requires data analysis techniques that do not depend on theoretical predictions for the gravitational waveform. Past searches for short-duration, unmodeled gravitational-wave signals have been hampered by transient noise artifacts, or “glitches,” in the detectors. We have put forth the BayesWave algorithm to differentiate between generic gravitational-wave transients and glitches, and to provide robust waveform reconstruction and characterization of astrophysical signals. Here we study BayesWave’s capabilities for rejecting glitches while assigning high confidence to detection candidates, using analytic approximations to the Bayesian evidence. The analytic results are tested against numerical experiments in which simulated gravitational-wave transient signals are added to LIGO data collected between 2009 and 2010, and the two are found to be in good agreement.