    The electrochemical interface and stochastic functions: A data-driven approach to modeling non-ideal behavior in concentrated systems

    Get PDF
    Researchers in the ionics field make frequent use of the mass-action principle – the assumption of ideal thermodynamic behavior – in their physical models. These models are relatively easy to work with, leading to many useful and convenient formulae. However, they are strictly correct only in the limit of infinite dilution, and over-reliance on mass-action models in concentrated systems can produce predictions that are grossly incorrect when compared with experimental reality. Recent microscopic experimental results gathered at surfaces and interfaces of ionic and mixed ionic-electronic conductors provide a striking example: classical models utilizing mass-action assumptions routinely underpredict the thickness of defect accumulation zones by an order of magnitude. Although atomistic models can be employed for concentrated systems, their utility is limited to very small simulation domains: continuum models must be used to predict the behavior of devices. A key issue in any continuum-level thermodynamic treatment is the intractability of the microscopic defect interaction problem: beyond the ideal case, very few closed-form solutions for the free energy in terms of concentrations are available. This presentation will introduce a data-driven methodology for determining these functions from either experimental or theoretical datasets. The method utilizes Gaussian process stochastic functions to represent the unknown functional relationships between defect concentrations and free energy, and conditions these functions on data using Bayesian methods for calibration and model selection. A continuum model for the structure of electrochemical interfaces in concentrated systems is the 'Poisson-Cahn' theory, which incorporates defect interactions and, crucially, gradient effects, and has proven successful in replicating both macroscopic and microscopic experimental results. The data-driven approach to model building will be demonstrated in the context of Poisson-Cahn variational approaches applied to microscopic experimental datasets for grain boundaries in calcium-doped ceria.
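
    As a minimal sketch of the central idea (assumptions: synthetic data, a squared-exponential kernel, and illustrative numbers throughout; this is not the presentation's implementation), the snippet below places a Gaussian process prior on an unknown excess free energy as a function of defect concentration and conditions it on a few noisy measurements:

```python
import numpy as np

def rbf(x1, x2, ell=0.1, var=1.0):
    # Squared-exponential kernel between vectors of concentrations.
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

# Illustrative "measurements" of excess free energy at a few defect
# concentrations (hypothetical numbers, for demonstration only).
c_obs = np.array([0.02, 0.05, 0.10, 0.15, 0.20])
g_obs = np.array([-0.8, -1.5, -2.1, -2.3, -2.2])   # made-up values
noise = 0.05

# Standard GP regression: posterior mean/covariance on a prediction grid.
c_grid = np.linspace(0.0, 0.25, 100)
K = rbf(c_obs, c_obs) + noise**2 * np.eye(len(c_obs))
Ks = rbf(c_grid, c_obs)
Kss = rbf(c_grid, c_grid)
mean = Ks @ np.linalg.solve(K, g_obs)
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
print(mean[:5], std[:5])
```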

    The "Unfriending" Problem: The Consequences of Homophily in Friendship Retention for Causal Estimates of Social Influence

    Full text link
    An increasing number of scholars are using longitudinal social network data to obtain estimates of peer or social influence effects. These data may provide additional statistical leverage, but they can introduce new inferential problems. In particular, while the confounding effects of homophily in friendship formation are widely appreciated, homophily in friendship retention may also confound causal estimates of social influence in longitudinal network data. We provide evidence for this claim in a Monte Carlo analysis of the statistical model used by Christakis, Fowler, and their colleagues in numerous articles estimating "contagion" effects in social networks. Our results indicate that homophily in friendship retention induces significant upward bias and decreased coverage levels in the Christakis and Fowler model when there is non-negligible friendship attrition over time.
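
    The mechanism is easy to demonstrate. Below is an illustrative Monte Carlo in the same spirit (not a reproduction of the authors' exact design): outcomes depend only on a stable trait, ties are retained preferentially between similar individuals, and a pooled lagged regression of the Christakis-Fowler type nonetheless recovers a positive "contagion" coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
n, t_steps = 500, 10
trait = rng.normal(size=n)                  # stable attribute driving homophily
y = trait + rng.normal(scale=0.5, size=n)   # outcome depends on trait, not peers
friend = rng.integers(0, n, size=n)         # random initial "named friend" ties

rows = []
for t in range(t_steps):
    # Retention with homophily: similar dyads are more likely to survive;
    # dissolved ties are replaced by random new friends (attrition).
    keep_p = 1 / (1 + np.exp(np.abs(trait - trait[friend])))
    keep = rng.random(n) < keep_p
    friend = np.where(keep, friend, rng.integers(0, n, size=n))
    y_new = trait + rng.normal(scale=0.5, size=n)   # still no true contagion
    rows.append((y_new, y, y[friend]))              # ego_t, ego_{t-1}, alter_{t-1}
    y = y_new

# Pooled lagged regression: y_t ~ 1 + y_{t-1} + friend's y_{t-1}
Y = np.concatenate([r[0] for r in rows])
X = np.column_stack([
    np.ones(Y.size),
    np.concatenate([r[1] for r in rows]),
    np.concatenate([r[2] for r in rows]),
])
beta = np.linalg.lstsq(X, Y, rcond=None)[0]
print("estimated 'contagion' effect:", beta[2])  # > 0 despite no true influence
```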

    Impedance Response of Alumina-Silicon Carbide Whisker Composites

    Get PDF
    The impedance response of silicon carbide whisker-alumina composites is investigated using novel stereological techniques along with a microstructural simulation. The stereological techniques developed allow measurement of the trivariate length, radius, and orientation distribution of whiskers in the composite from measurements made on two-dimensional sectioning planes. The measured distributions are then used in a Monte Carlo simulation that predicts connectivity in the composite for a given volume fraction. The simulation assumes that connectivity, rather than interfacial phenomena, dominates the electrical response. The results of the simulation are compared with impedance spectra taken from real samples, and conclusions are drawn regarding the nature of the impedance response. M.S. thesis. Committee Chair: Rosario A. Gerhardt; Committee Members: Arun M. Gokhale, Hamid Garmestani.
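
    A stripped-down sketch of such a connectivity Monte Carlo appears below, under stated simplifications: monodisperse whisker length and radius, isotropic orientations, and a point-sampled contact test instead of an exact spherocylinder distance (the thesis instead uses the measured trivariate distributions):

```python
import numpy as np

rng = np.random.default_rng(1)
n_whiskers, L, r = 300, 0.15, 0.005   # illustrative length/radius in box units

# Sample whisker centers and random orientations in a unit cube.
centers = rng.random((n_whiskers, 3))
u = rng.normal(size=(n_whiskers, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)

# Discretize each whisker into points along its axis (approximate contact test).
ts = np.linspace(-0.5, 0.5, 8)
pts = centers[:, None, :] + ts[None, :, None] * (L * u)[:, None, :]

# Union-find over whiskers whose sampled points come within 2r of each other.
parent = np.arange(n_whiskers)
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

for i in range(n_whiskers):
    for j in range(i + 1, n_whiskers):
        d = np.linalg.norm(pts[i][:, None, :] - pts[j][None, :, :], axis=-1)
        if d.min() < 2 * r:
            parent[find(i)] = find(j)

# A cluster "percolates" if it touches both the x=0 and x=1 faces of the box.
roots = np.array([find(i) for i in range(n_whiskers)])
xmin, xmax = pts[..., 0].min(axis=1), pts[..., 0].max(axis=1)
spans = any((xmin[roots == rt] < r).any() and (xmax[roots == rt] > 1 - r).any()
            for rt in np.unique(roots))
print("percolating cluster:", spans)
```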

    Fast Dynamic System Identification with Karhunen-Loève Decomposed Gaussian Processes

    Full text link
    A promising approach for scalable Gaussian processes (GPs) is the Karhunen-Loève (KL) decomposition, in which the GP kernel is represented by a set of basis functions that are the eigenfunctions of the kernel operator. Such decomposed kernels have the potential to be very fast, and they do not depend on the selection of a reduced set of inducing points. However, KL decompositions lead to high dimensionality, and variable selection becomes paramount. This paper reports a new method of forward variable selection, enabled by the ordered nature of the basis functions in the KL expansion of the Bayesian Smoothing Spline ANOVA (BSS-ANOVA) kernel, coupled with fast Gibbs sampling in a fully Bayesian approach. It quickly and effectively limits the number of terms, yielding a method with competitive accuracy and competitive training and inference times on tabular datasets of low feature-set dimensionality. The inference speed and accuracy make the method especially useful for dynamic system identification: the dynamics are modeled in the tangent space as a static problem, and the learned dynamics are then integrated using a high-order scheme. The methods are demonstrated on two dynamic datasets: a 'Susceptible, Infected, Recovered' (SIR) toy problem, with the transmissibility used as a forcing function, and the experimental 'Cascaded Tanks' benchmark dataset. Comparisons for static prediction of time derivatives are made with a random forest (RF), a residual neural network (ResNet), and the Orthogonal Additive Kernel (OAK) inducing-points scalable GP, while for time-series prediction comparisons are made with LSTM and GRU recurrent neural networks (RNNs) along with a number of basis set / optimizer combinations within the SINDy package.
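
    A minimal sketch of the KL idea is shown below. It truncates a Nyström eigendecomposition of a squared-exponential Gram matrix and fits the basis weights by ridge-style Bayesian linear regression; the paper itself uses the analytic BSS-ANOVA basis with Gibbs sampling, so this is an analogy, not the reported method:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.random(200))
y = np.sin(6 * x) + 0.1 * rng.normal(size=200)

# Gram matrix of a squared-exponential kernel on the training inputs.
d = x[:, None] - x[None, :]
K = np.exp(-0.5 * (d / 0.2) ** 2)

# KL-style truncation: keep the leading eigenpairs and use the scaled
# eigenvectors as an ordered basis (Nystrom approximation; the paper
# uses the analytic BSS-ANOVA basis instead).
w, V = np.linalg.eigh(K)
order = np.argsort(w)[::-1]
m = 15
Phi = V[:, order[:m]] * np.sqrt(np.maximum(w[order[:m]], 0))

# Bayesian linear regression on the basis weights (Gaussian prior ~ ridge).
lam = 1e-2
coef = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)
print("train RMSE:", np.sqrt(np.mean((Phi @ coef - y) ** 2)))
```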

    Integration of high-fidelity CO2 sorbent models at the process scale using dynamic discrepancy

    Get PDF
    A high-fidelity model of a mesoporous silica-supported, polyethylenimine (PEI)-impregnated solid sorbent for CO2 capture has been incorporated into a model of a bubbling fluidized bed adsorber using Dynamic Discrepancy Reduced Modeling (DDRM). The sorbent model includes a detailed treatment of transport and amine-CO2-H2O interactions based on quantum chemistry calculations. Using a Bayesian approach, we calibrate the sorbent model to thermogravimetric analysis (TGA) data. Discrepancy functions are included within the diffusion coefficients for diffusive species within the PEI bulk, enabling a 20-fold reduction in model order. Additional discrepancy functions account for non-ideal behavior in the adsorption of CO2 and H2O. The discrepancy functions are based on a Gaussian process in the Bayesian Smoothing Spline ANOVA (BSS-ANOVA) framework, which provides a convenient parametric form for calibration and upscaling. The dynamic discrepancy method for scale-bridging produces probabilistic predictions at larger scales, quantifying uncertainty due to model reduction and the extrapolation inherent in model upscaling. The method is demonstrated using TGA data for a PEI-based sorbent and a model of a bubbling fluidized bed adsorber. Acknowledgements: This work is supported by the Carbon Capture Simulation Initiative, funded through the Office of Fossil Energy, US Department of Energy.
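
    A toy illustration of the embedding idea (only the structure, not the paper's model): a discrepancy expansion is placed inside the rate coefficient of a simple uptake ODE and calibrated by crude random search, whereas the actual work embeds BSS-ANOVA terms in PDE diffusion coefficients and performs full Bayesian calibration:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy sorbent uptake model: dq/dt = k(q_eq - q), with a state-dependent
# discrepancy embedded in the rate coefficient (stand-in for the diffusion
# coefficients in the actual DDRM formulation). Basis: low-order polynomials
# rather than the BSS-ANOVA terms used in the paper.
def rate(t, q, theta, q_eq=1.0, k0=0.5):
    delta = theta[0] + theta[1] * q + theta[2] * q**2   # discrepancy expansion
    return k0 * np.exp(delta) * (q_eq - q)

t_obs = np.linspace(0, 10, 25)
q_true = 1 - np.exp(-0.35 * t_obs)                      # synthetic "TGA" curve
q_obs = q_true + 0.01 * np.random.default_rng(3).normal(size=t_obs.size)

def loss(theta):
    sol = solve_ivp(rate, (0, 10), [0.0], t_eval=t_obs, args=(theta,))
    return np.sum((sol.y[0] - q_obs) ** 2)

# Crude random-search "calibration" as a placeholder for Bayesian sampling.
rng = np.random.default_rng(4)
best, best_loss = np.zeros(3), loss(np.zeros(3))
for _ in range(300):
    cand = best + 0.05 * rng.normal(size=3)
    l = loss(cand)
    if l < best_loss:
        best, best_loss = cand, l
print("discrepancy coefficients:", best, "misfit:", best_loss)
```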

    Set‐Based Design and the Ship to Shore Connector

    Full text link
    The Ship to Shore Connector (SSC), a replacement for the Landing Craft, Air Cushion (LCAC), is the first government-led design of a ship in over 15 years. This paper discusses the changes that a government-led design brings to the design approach, including schedule, organization structure, and design methodology. While presenting challenges, a government-led design also afforded the opportunity to implement a new technique for assessing various systems and ship alternatives: set-based design (SBD). SBD was adopted because the SSC was to be designed from a blank sheet of paper while the aging LCACs had to be replaced on a short time frame; consequently, the preliminary design phase of the SSC program will last only 12 months. This paper describes SBD and how it was applied to the SSC, the challenges that the program faced, and an assessment of the new methodology, along with recommendations for future design programs considering this approach.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/90054/1/j.1559-3584.2011.00332.x.pd

    Measurement of the Active Width in Sr-Doped Lanthanum Manganate SOFC Cathodes Using Nano-CT, Impedance Spectroscopy, and Bayesian Calibration

    Get PDF
    Bayesian model-based analysis (BMA) is a method for producing quantitative models of complex physical systems through comparison of models with experimental data. A model of a porous LSM cathode was fit to impedance data from a symmetrical cell, with its parameters estimated via Bayesian calibration. X-ray computed tomography provided microstructural information for the model. The combination of model calibration and microstructural characterization enabled an estimate of the active thickness of a porous LSM electrode. The active width extended only a few nanometers from the surface, strongly suggesting that future models should explicitly resolve the space-charge region.
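
    Schematically, a calibration of this kind can be sketched as follows, with a one-arc equivalent circuit standing in for the physics-based porous-electrode model, synthetic data, and a basic Metropolis sampler (all values illustrative):

```python
import numpy as np

# Simple equivalent-circuit stand-in (series resistance plus one RC arc);
# the paper calibrates a physics-based porous-electrode model instead.
def z_model(omega, rs, rp, c):
    return rs + rp / (1 + 1j * omega * rp * c)

rng = np.random.default_rng(5)
omega = np.logspace(-1, 5, 40)
z_obs = z_model(omega, 2.0, 10.0, 1e-4)                 # synthetic "truth"
z_obs = z_obs + 0.05 * (rng.normal(size=40) + 1j * rng.normal(size=40))

def loglik(p):
    z = z_model(omega, *np.exp(p))   # sample in log space for positivity
    return -np.sum(np.abs(z - z_obs) ** 2) / (2 * 0.05 ** 2)

# Random-walk Metropolis over (log Rs, log Rp, log C).
p = np.log([1.0, 5.0, 1e-3])
lp = loglik(p)
chain = []
for _ in range(5000):
    q = p + 0.05 * rng.normal(size=3)
    lq = loglik(q)
    if np.log(rng.random()) < lq - lp:
        p, lp = q, lq
    chain.append(p)
print(np.exp(np.mean(chain[2000:], axis=0)))   # posterior-mean Rs, Rp, C
```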

    Exploring DFT+U parameter space with a Bayesian calibration assisted by Markov chain Monte Carlo sampling

    Get PDF
    Density-functional theory is widely used to predict the physical properties of materials, but it usually fails for strongly correlated materials. A popular remedy is the Hubbard correction, which treats strongly correlated electronic states. Unfortunately, the values of the Hubbard U and J parameters are initially unknown, and they can vary from one material to another. In this semi-empirical study, we explore the U and J parameter space of a group of iron-based compounds to simultaneously improve the prediction of physical properties (volume, magnetic moment, and bandgap). We used a Bayesian calibration assisted by Markov chain Monte Carlo sampling for three different exchange-correlation functionals (LDA, PBE, and PBEsol). We found that LDA requires the largest U correction, while PBE has the smallest standard deviation and its U and J parameters are the most transferable to other iron-based compounds. Lastly, PBE predicts lattice parameters reasonably well without the Hubbard correction.
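
    A sketch of the sampling loop is below, with a made-up quadratic property model standing in for expensive DFT+U evaluations (in practice this would wrap first-principles runs or a surrogate fitted to them; all numbers are illustrative):

```python
import numpy as np

# Stand-in for an expensive DFT+U evaluation: returns (volume, moment, gap)
# for given (U, J). The linear form here is purely illustrative.
def predict_properties(u, j):
    return np.array([10.0 + 0.05 * u - 0.02 * j,
                     3.0 + 0.10 * u,
                     1.0 + 0.15 * u - 0.05 * j])

target = np.array([10.3, 3.6, 1.9])     # "experimental" values (made up)
sigma = np.array([0.1, 0.2, 0.2])       # assumed observation uncertainties

def logpost(u, j):
    if not (0 <= u <= 10 and 0 <= j <= 2):   # uniform prior box on (U, J)
        return -np.inf
    r = (predict_properties(u, j) - target) / sigma
    return -0.5 * np.sum(r ** 2)

# Random-walk Metropolis over the (U, J) parameter space.
rng = np.random.default_rng(6)
u, j = 4.0, 0.5
lp = logpost(u, j)
samples = []
for _ in range(20000):
    u2, j2 = u + 0.3 * rng.normal(), j + 0.1 * rng.normal()
    lp2 = logpost(u2, j2)
    if np.log(rng.random()) < lp2 - lp:
        u, j, lp = u2, j2, lp2
    samples.append((u, j))
print(np.mean(samples[5000:], axis=0))   # posterior-mean U and J
```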

    Colonyzer: automated quantification of micro-organism growth characteristics on solid agar

    Get PDF
    Background: High-throughput screens comparing growth rates of arrays of distinct micro-organism cultures on solid agar are useful, rapid methods of quantifying genetic interactions. Growth rate is an informative phenotype which can be estimated by measuring cell densities at one or more times after inoculation. Precise estimates can be made by inoculating cultures onto agar and capturing cell density frequently by plate-scanning or photography, especially throughout the exponential growth phase, and summarising growth with a simple dynamic model (e.g. the logistic growth model). In order to parametrize such a model, a robust image analysis tool capable of capturing a wide range of cell densities from plate photographs is required.

    Results: Colonyzer is a collection of image analysis algorithms for automatic quantification of the size, granularity, colour and location of micro-organism cultures grown on solid agar. Colonyzer is uniquely sensitive to extremely low cell densities photographed after dilute liquid culture inoculation (spotting) because image segmentation uses a mixed Gaussian model for plate-wide thresholding based on pixel intensity. Colonyzer is robust to slight experimental imperfections and corrects for lighting gradients which would otherwise introduce spatial bias to cell density estimates, without the need for imaging dummy plates. Colonyzer is general enough to quantify cultures growing in any rectangular array format, whether growing after pinning with a dense inoculum or growing with the irregular morphology characteristic of spotted cultures. Colonyzer was developed using the open-source packages Python, RPy and the Python Imaging Library; its source code and documentation are available on SourceForge under the GNU General Public License. Colonyzer is adaptable to suit specific requirements: e.g. automatic detection of cultures at irregular locations on streaked plates for robotic picking, or decreasing analysis time by disabling components such as lighting correction or colour measures.

    Conclusion: Colonyzer can automatically quantify culture growth from large batches of captured images of microbial cultures grown during genome-wide scans, over the wide range of cell densities observable after highly dilute liquid spot inoculation as well as after more concentrated pinning inoculation. Colonyzer is open-source, allowing users to assess it, adapt it to particular research requirements and contribute to its development.
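
    The plate-wide thresholding idea can be sketched as follows: fit a two-component Gaussian mixture to pixel intensities by EM and threshold where the weighted component densities cross. This mirrors the description above, but Colonyzer's actual implementation may differ in detail, and the intensities here are synthetic:

```python
import numpy as np

# Synthetic pixel intensities: dim agar background plus brighter cultures.
rng = np.random.default_rng(7)
agar = rng.normal(60, 8, 20000)
cells = rng.normal(120, 15, 5000)
pixels = np.concatenate([agar, cells])

# EM for a two-component 1-D Gaussian mixture.
w = np.array([0.5, 0.5]); mu = np.array([50.0, 150.0]); sd = np.array([20.0, 20.0])
for _ in range(100):
    # E-step: responsibilities of each component for each pixel.
    pdf = np.exp(-0.5 * ((pixels[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    resp = w * pdf
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update weights, means, standard deviations.
    nk = resp.sum(axis=0)
    w = nk / pixels.size
    mu = (resp * pixels[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (pixels[:, None] - mu) ** 2).sum(axis=0) / nk)

# Plate-wide threshold: intensity between the component means where the
# weighted densities are equal.
grid = np.linspace(mu.min(), mu.max(), 1000)
dens = w * np.exp(-0.5 * ((grid[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
threshold = grid[np.argmin(np.abs(dens[:, 0] - dens[:, 1]))]
print("plate-wide threshold:", threshold)
```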