69,405 research outputs found

    Optimal Data Split Methodology for Model Validation

    The decision to incorporate cross-validation into validation processes of mathematical models raises an immediate question: how should one partition the data into calibration and validation sets? We answer this question systematically: we present an algorithm to find the optimal partition of the data subject to certain constraints. In doing so, we address two critical issues: 1) that the model be evaluated with respect to predictions of a given quantity of interest and its ability to reproduce the data, and 2) that the model be highly challenged by the validation set, assuming it is properly informed by the calibration set. This framework also relies on the interaction between the experimentalist and/or modeler, who understand the physical system and the limitations of the model; the decision-maker, who understands and can quantify the cost of model failure; and the computational scientists, who strive to determine whether the model satisfies both the modeler's and the decision-maker's requirements. We also note that our framework is quite general and may be applied to a wide range of problems. Here, we illustrate it through a specific example involving a data reduction model for an ICCD camera from a shock-tube experiment located at the NASA Ames Research Center (ARC).
    Comment: Submitted to the International Conference on Modeling, Simulation and Control 2011 (ICMSC'11), San Francisco, USA, 19-21 October 2011
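
    A minimal sketch of the question the abstract raises, assuming nothing about the authors' actual algorithm: enumerate admissible calibration/validation partitions, calibrate on one side, and keep the split whose validation set most challenges the model. The `fit` and `predict` callables and the worst-case residual score are illustrative placeholders, not the paper's constraints or quantities of interest.

    ```python
    # Illustrative only: brute-force search over calibration/validation splits.
    from itertools import combinations
    import numpy as np

    def best_split(x, y, n_cal, fit, predict):
        """Return the split whose validation set most challenges the model."""
        best, best_score = None, -np.inf
        for cal in combinations(range(len(x)), n_cal):
            val = [i for i in range(len(x)) if i not in cal]
            model = fit(x[list(cal)], y[list(cal)])      # calibrate
            resid = y[val] - predict(model, x[val])      # predict held-out data
            score = np.max(np.abs(resid))                # worst-case discrepancy
            if score > best_score:
                best, best_score = (tuple(cal), tuple(val)), score
        return best, best_score

    # Toy usage with a least-squares line fit:
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 8)
    y = 2.0 * x + 0.05 * rng.normal(size=8)
    fit = lambda xs, ys: np.polyfit(xs, ys, 1)
    predict = lambda m, xs: np.polyval(m, xs)
    (cal_idx, val_idx), worst = best_split(x, y, 5, fit, predict)
    ```

    Exhaustive enumeration is exponential in the data size; it is used here only to make the objective explicit.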

    The DEEP2 Galaxy Redshift Survey: The Voronoi-Delaunay Method Catalog of Galaxy Groups

    We present a public catalog of galaxy groups constructed from the spectroscopic sample of galaxies in the fourth data release from the Deep Extragalactic Evolutionary Probe 2 (DEEP2) Galaxy Redshift Survey, including the Extended Groth Strip (EGS). The catalog contains 1165 groups with two or more members in the EGS over the redshift range 0 < z < 1.5, along with additional groups at z > 0.6 in the rest of DEEP2. Twenty-five percent of EGS galaxies and fourteen percent of high-z DEEP2 galaxies are assigned to galaxy groups. The groups were detected using the Voronoi-Delaunay method (VDM) after it was optimized on mock DEEP2 catalogs, following methods similar to those employed in Gerke et al. In the optimization effort, we have taken particular care to ensure that the mock catalogs resemble the data as closely as possible, and we have fine-tuned our methods separately on mocks constructed for the EGS and the rest of DEEP2. We have also probed the effect of the assumed cosmology on our inferred group-finding efficiency by performing our optimization on three different mock catalogs with different background cosmologies, finding large differences in the group-finding success we can achieve for these different mocks. Using the mock catalog whose background cosmology is most consistent with current data, we estimate that the DEEP2 group catalog is 72% complete and 61% pure (74% and 67% for the EGS) and that the group finder correctly classifies 70% of galaxies that truly belong to groups, with an additional 46% of interloper galaxies contaminating the catalog (66% and 43% for the EGS). We also confirm that the VDM catalog reconstructs the abundance of galaxy groups with velocity dispersions above ~300 km s^(-1) to an accuracy better than the sample variance, and this successful reconstruction is not strongly dependent on cosmology. This makes the DEEP2 group catalog a promising probe of the growth of cosmic structure that can potentially be used for cosmological tests.
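
    The completeness and purity figures quoted above come from comparing the recovered catalog against mock catalogs. As a hedged sketch of how such scoring works, the snippet below matches found groups to true groups and reports the two fractions; the single "any shared member" matching rule is a simplification of the two-way matching criteria actually used in VDM optimization work.

    ```python
    # Sketch of group-catalog scoring against a mock; matching rule simplified.
    def score_catalog(true_groups, found_groups):
        """Each argument is a list of sets of galaxy IDs (two or more members)."""
        matched_true = sum(1 for t in true_groups
                           if any(t & f for f in found_groups))   # recovered groups
        matched_found = sum(1 for f in found_groups
                            if any(f & t for t in true_groups))   # found groups that are real
        completeness = matched_true / len(true_groups)
        purity = matched_found / len(found_groups)
        return completeness, purity

    # Toy usage:
    truth = [{1, 2, 3}, {4, 5}, {6, 7, 8}]
    found = [{1, 2}, {4, 5, 9}, {10, 11}]
    print(score_catalog(truth, found))   # -> (0.666..., 0.666...)
    ```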

    Development of a Design for Manufacturing Tool for Automated Fiber Placement Structures

    Existing design processes for laminates constructed with automated fiber placement lack significant integration between the various software tools that compose the process. Tools for finite element analysis, computer-aided drafting, stress analysis, tool-path simulation, and manufacturing defect prediction are all critical parts of the design process. With traditional hand-layup laminates, the analyses performed with these tools could be fairly well decoupled from one another. However, for laminates generated by automated fiber placement, the disciplines can become significantly coupled, especially on structures with curvature. This gives rise to a need for integrated design-for-manufacturing software tools that are able to balance the competing objectives of each discipline. This paper describes the preliminary development of such a tool.
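
    As a speculative sketch of the kind of integration layer such a tool needs (none of these names come from the paper), each discipline tool could expose a common scoring interface so the design-for-manufacturing tool can trade their objectives off against one another:

    ```python
    # Hypothetical integration layer; the paper's actual architecture may differ.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class DisciplineTool:
        name: str                            # e.g. "FEA", "tool-path simulation"
        evaluate: Callable[[dict], float]    # candidate laminate design -> cost
        weight: float = 1.0

    def total_cost(design: dict, tools: list[DisciplineTool]) -> float:
        """Weighted sum of discipline costs. Because every tool scores the same
        design, couplings (e.g. fiber steering on curved surfaces driving both
        stress margins and gap/overlap defects) surface in this one objective."""
        return sum(t.weight * t.evaluate(design) for t in tools)
    ```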

    Variational Inference of Joint Models using Multivariate Gaussian Convolution Processes

    We present a non-parametric prognostic framework for individualized event prediction based on joint modeling of both longitudinal and time-to-event data. Our approach exploits a multivariate Gaussian convolution process (MGCP) to model the evolution of longitudinal signals, and a Cox model to link the time-to-event data with the longitudinal signals modeled through the MGCP. Taking advantage of the unique structure imposed by convolved processes, we provide a variational inference framework to simultaneously estimate parameters in the joint MGCP-Cox model. This significantly reduces computational complexity and safeguards against model overfitting. Experiments on synthetic and real-world data show that the proposed framework outperforms state-of-the-art approaches built on two-stage inference and strong parametric assumptions.
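
    The core convolution-process idea is that each longitudinal signal is a smoothed view of a shared latent Gaussian process, which is what induces cross-signal covariance. The sketch below (our own discretized illustration, not the paper's inference code) builds those covariances by linearity; the paper instead works with closed-form convolved kernels and adds the Cox likelihood and variational bound on top.

    ```python
    # Discretized convolution-process sketch: two outputs share one latent GP.
    import numpy as np

    def se_kernel(x, y, ell):
        """Squared-exponential covariance between two 1-D input arrays."""
        d = x[:, None] - y[None, :]
        return np.exp(-0.5 * (d / ell) ** 2)

    z = np.linspace(0.0, 10.0, 400)          # grid for the latent process u(z)
    dz = z[1] - z[0]
    K_u = se_kernel(z, z, ell=0.5)           # covariance of the latent process

    def smoothing_matrix(x, width, scale):
        """Discretized kernel G_i, so that f_i(x) ~ sum_z G_i(x - z) u(z) dz."""
        d = x[:, None] - z[None, :]
        return scale * np.exp(-0.5 * (d / width) ** 2) * dz

    x = np.linspace(0.0, 10.0, 50)
    G1 = smoothing_matrix(x, width=0.3, scale=1.0)   # output 1: sharp kernel
    G2 = smoothing_matrix(x, width=1.0, scale=0.7)   # output 2: heavy smoothing

    # Since f_i = G_i u, all covariances follow by linearity:
    K11 = G1 @ K_u @ G1.T    # within output 1
    K22 = G2 @ K_u @ G2.T    # within output 2
    K12 = G1 @ K_u @ G2.T    # cross-covariance linking the two signals
    ```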