A Preferential Attachment Model for the Stellar Initial Mass Function
Accurate specification of a likelihood function is becoming increasingly
difficult in many inference problems in astronomy. As sample sizes resulting
from astronomical surveys continue to grow, deficiencies in the likelihood
function lead to larger biases in key parameter estimates. These deficiencies
result from the oversimplification of the physical processes that generated the
data, and from the failure to account for observational limitations.
Unfortunately, realistic models often do not yield an analytical form for the
likelihood. The estimation of a stellar initial mass function (IMF) is an
important example. The stellar IMF is the mass distribution of stars initially
formed in a given cluster of stars, a population that is not directly
observable due to stellar evolution, other disruptions, and the
observational limitations of the cluster. It is difficult to specify a
likelihood in this setting since the physical processes and observational
challenges result in measurable masses that cannot legitimately be considered
independent draws from an IMF. This work improves inference of the IMF by using
an approximate Bayesian computation approach that both accounts for
observational and astrophysical effects and incorporates a physically-motivated
model for star cluster formation. The methodology is illustrated via a
simulation study, demonstrating that the proposed approach can recover the true
posterior in realistic situations, and applied to observations from
astrophysical simulation data.
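As a concrete illustration of the likelihood-free idea underlying this approach, the following is a minimal ABC rejection sketch (not the paper's full method, which uses a physically-motivated cluster-formation model and accounts for observational effects): it infers the slope of a toy truncated power-law IMF by comparing a summary statistic of simulated and "observed" masses. All numerical values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_power_law(alpha, n, m_min=0.5, m_max=100.0):
    """Draw n masses from a truncated power law dN/dm ~ m^-alpha
    (alpha != 1) via inverse-transform sampling."""
    a = 1.0 - alpha
    u = rng.uniform(size=n)
    return (u * (m_max**a - m_min**a) + m_min**a) ** (1.0 / a)

# "Observed" cluster: masses drawn with a hypothetical true slope of 2.35.
observed = sample_power_law(2.35, 500)
obs_stat = np.mean(np.log(observed))  # summary statistic

# ABC rejection: draw a slope from the prior, simulate a cluster, and
# accept if the simulated summary is within the tolerance of the observed one.
tolerance = 0.05
accepted = [alpha for alpha in rng.uniform(1.5, 3.5, size=20000)
            if abs(np.mean(np.log(sample_power_law(alpha, 500))) - obs_stat)
            < tolerance]

posterior_mean = np.mean(accepted)
```

The accepted slopes approximate draws from the posterior; the quality of the approximation depends on the tolerance and on how informative the summary statistic is.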
Adaptive Approximate Bayesian Computation Tolerance Selection
Approximate Bayesian Computation (ABC) methods are increasingly used for inference in situations in which the likelihood function is either computationally costly or intractable to evaluate. Extensions of the basic ABC rejection algorithm have improved the computational efficiency of the procedure and broadened its applicability. The ABC-Population Monte Carlo (ABC-PMC) approach has become a popular choice for approximate sampling from the posterior. ABC-PMC is a sequential sampler with an iteratively decreasing value of the tolerance, which specifies how close the simulated data need to be to the real data for acceptance. We propose a method for adaptively selecting a sequence of tolerances that improves the computational efficiency of the algorithm over other common techniques. In addition, we define a stopping rule as a by-product of the adaptation procedure, which assists in automating termination of sampling. The proposed automatic ABC-PMC algorithm can be easily implemented, and we present several examples demonstrating its benefits in terms of computational efficiency.
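A minimal sketch of the ABC-PMC idea on a toy Gaussian-mean problem, using a simple quantile-based tolerance schedule (a generic choice; the paper proposes a more refined adaptation rule and stopping criterion). All names and settings below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: infer the mean theta of N(theta, 1) data, Uniform(-5, 5) prior.
observed = rng.normal(2.0, 1.0, size=100)
obs_stat = observed.mean()

def distance(theta):
    """Distance between a simulated data set and the observations."""
    return abs(rng.normal(theta, 1.0, size=100).mean() - obs_stat)

n_particles = 500
# Iteration 0: plain rejection sampling from the prior, keeping all draws.
particles = rng.uniform(-5, 5, size=n_particles)
dists = np.array([distance(t) for t in particles])
weights = np.full(n_particles, 1.0 / n_particles)

for _ in range(4):
    # Adaptive tolerance: a quantile of the previous iteration's distances.
    eps = np.quantile(dists, 0.5)
    # Gaussian perturbation kernel with twice the weighted particle variance.
    sigma = np.sqrt(2.0 * np.cov(particles, aweights=weights))
    new_p, new_d, new_w = [], [], []
    while len(new_p) < n_particles:
        theta = rng.normal(particles[rng.choice(n_particles, p=weights)], sigma)
        if not -5.0 <= theta <= 5.0:
            continue  # zero prior density
        d = distance(theta)
        if d < eps:
            # Importance weight: prior density over kernel mixture density.
            kernel = np.exp(-0.5 * ((theta - particles) / sigma) ** 2)
            new_p.append(theta)
            new_d.append(d)
            new_w.append(1.0 / np.sum(weights * kernel))
    particles, dists = np.array(new_p), np.array(new_d)
    weights = np.array(new_w) / np.sum(new_w)

posterior_mean = np.sum(weights * particles)
```

Each iteration reuses the accepted distances to set the next tolerance, so the schedule adapts to the problem rather than being fixed in advance.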
Incorporating Uncertainties in Atomic Data Into the Analysis of Solar and Stellar Observations: A Case Study in Fe XIII
Information about the physical properties of astrophysical objects cannot be
measured directly but is inferred by interpreting spectroscopic observations in
the context of atomic physics calculations. Ratios of emission lines, for
example, can be used to infer the electron density of the emitting plasma.
Similarly, the relative intensities of emission lines formed over a wide range
of temperatures yield information on the temperature structure. A critical
component of this analysis is understanding how uncertainties in the underlying
atomic physics propagate to the uncertainties in the inferred plasma
parameters. At present, however, atomic physics databases do not include
uncertainties on the atomic parameters and there is no established methodology
for using them even if they did. In this paper we develop simple models for the
uncertainties in the collision strengths and decay rates for Fe XIII and apply
them to the interpretation of density sensitive lines observed with the EUV
Imaging Spectrometer (EIS) on Hinode. We incorporate these uncertainties in a
Bayesian framework. We consider both a pragmatic Bayesian method where the
atomic physics information is unaffected by the observed data, and a fully
Bayesian method where the data can be used to probe the physics. The former
generally increases the uncertainty in the inferred density by about a factor
of 5 compared with models that incorporate only statistical uncertainties. The
latter reduces the uncertainties on the inferred densities, but identifies
areas of possible systematic problems with either the atomic physics or the
observed intensities.
Comment: in press at Ap
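The pragmatic Bayesian step — perturbing the atomic parameters and propagating the scatter to the inferred density — can be sketched with a toy two-level density diagnostic. The functional form and every number below are illustrative, not real Fe XIII atomic data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy density diagnostic: the emissivity ratio of two lines is modeled as
# R(n_e) = r0 * n_e / (n_c + n_e), where the high-density limit r0 and the
# critical density n_c depend on the collision strengths and decay rates.
r0_true, nc_true = 3.0, 1e9  # hypothetical atomic-data-derived values

def invert_ratio(R, r0, n_c):
    """Solve R = r0 * n_e / (n_c + n_e) for the electron density n_e."""
    return R * n_c / (r0 - R)

R_obs = 1.5  # hypothetical observed line ratio

# Perturb the atomic parameters with ~10% log-normal scatter and propagate
# each realization to an inferred density (Monte Carlo over atomic physics).
n_mc = 10000
r0 = r0_true * rng.lognormal(0.0, 0.10, n_mc)
n_c = nc_true * rng.lognormal(0.0, 0.10, n_mc)
densities = invert_ratio(R_obs, r0, n_c)

lo, med, hi = np.percentile(densities, [16, 50, 84])
```

The spread of the resulting density distribution is wider than the purely statistical uncertainty on R_obs alone, which is the qualitative effect the pragmatic Bayesian analysis quantifies.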