1,828 research outputs found

    Practical Application of NASA-Langley Advanced Satellite Products to In-Flight Icing Nowcasts

    Experimental satellite-based icing products developed by the NASA Langley Research Center provide new tools to identify the locations of icing and its intensity. Since 1997, research forecasters at the National Center for Atmospheric Research (NCAR) have been helping to guide the NASA Glenn Research Center's Twin Otter aircraft into and out of clouds and precipitation for the purpose of characterizing in-flight icing conditions, including supercooled large drops, the accretions that result from such encounters and their effect on aircraft performance. Since the winter of 2003-04, the NASA Langley satellite products have been evaluated as part of this process, and are being considered as an input to NCAR's automated Current Icing Potential (CIP) products. This has already been accomplished for a relatively straightforward icing event, but many icing events have much more complex characteristics, providing additional challenges to all icing diagnosis tools. In this paper, four icing events with a variety of characteristics will be examined, with a focus on the NASA Langley satellite retrievals that were available in real time and their implications for icing nowcasting and potential applications in CIP.

    Differential operators on supercircle: conformally equivariant quantization and symbol calculus

    We consider the supercircle S^{1|1} equipped with the standard contact structure. The conformal Lie superalgebra K(1) acts on S^{1|1} as the Lie superalgebra of contact vector fields; it contains the Möbius superalgebra osp(1|2). We study the space of linear differential operators on weighted densities as a module over osp(1|2). We introduce the canonical isomorphism between this space and the corresponding space of symbols and find interesting resonant cases where such an isomorphism does not exist.

    Photometric Supernova Cosmology with BEAMS and SDSS-II

    Supernova cosmology without spectroscopic confirmation is an exciting new frontier which we address here with the Bayesian Estimation Applied to Multiple Species (BEAMS) algorithm and the full three years of data from the Sloan Digital Sky Survey II Supernova Survey (SDSS-II SN). BEAMS is a Bayesian framework for using data from multiple species in statistical inference when one has the probability that each data point belongs to a given species, corresponding in this context to different types of supernovae with their probabilities derived from their multi-band lightcurves. We run the BEAMS algorithm on both Gaussian and more realistic SNANA simulations with of order 10^4 supernovae, testing the algorithm against various pitfalls one might expect in the new and somewhat uncharted territory of photometric supernova cosmology. We compare the performance of BEAMS to that of both mock spectroscopic surveys and photometric samples which have been cut using typical selection criteria. The latter typically are either biased due to contamination or have significantly larger contours in the cosmological parameters due to small data-sets. We then apply BEAMS to the 792 SDSS-II photometric supernovae with host spectroscopic redshifts. In this case, BEAMS reduces the area of the (\Omega_m,\Omega_\Lambda) contours by a factor of three relative to the case where only spectroscopically confirmed data are used (297 supernovae). In the case of flatness, the constraints obtained on the matter density applying BEAMS to the photometric SDSS-II data are \Omega_m(BEAMS)=0.194\pm0.07. This illustrates the potential power of BEAMS for future large photometric supernova surveys such as LSST.
    Comment: 25 pages, 15 figures, submitted to Ap
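    The core of the BEAMS approach described above is a per-object mixture likelihood weighted by each supernova's type probability, so no hard type cut is needed. A minimal sketch in Python, with illustrative Gaussian likelihoods and made-up parameter names (not the paper's actual cosmological model):

```python
import math

def beams_log_likelihood(data, p_ia, mu_ia, sigma_ia, mu_non, sigma_non):
    """Toy BEAMS-style log-likelihood.

    Each data point contributes a mixture of an 'Ia' and a 'non-Ia'
    Gaussian, weighted by its per-object type probability p_ia.
    All parameters here are illustrative stand-ins, not the paper's model.
    """
    def gauss(x, mu, sigma):
        # Gaussian probability density
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    log_l = 0.0
    for x, p in zip(data, p_ia):
        # Mixture: weight the two species likelihoods by the type probability
        log_l += math.log(p * gauss(x, mu_ia, sigma_ia)
                          + (1 - p) * gauss(x, mu_non, sigma_non))
    return log_l
```

    In a real analysis the cosmological parameters would enter through the species likelihoods and be constrained by maximizing or sampling this quantity; the sketch only shows how the per-object probabilities enter the inference.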

    The adaptive buffered force QM/MM method in the CP2K and AMBER software packages.

    The implementation and validation of the adaptive buffered force (AdBF) quantum-mechanics/molecular-mechanics (QM/MM) method in two popular packages, CP2K and AMBER, are presented. The implementations build on the existing QM/MM functionality in each code, extending it to allow for redefinition of the QM and MM regions during the simulation and reducing QM-MM interface errors by discarding forces near the boundary according to the buffered force-mixing approach. New adaptive thermostats, needed by force-mixing methods, are also implemented. Different variants of the method are benchmarked by simulating the structure of bulk water, water autoprotolysis in the presence of zinc and dimethyl-phosphate hydrolysis using various semiempirical Hamiltonians and density functional theory as the QM model. It is shown that with suitable parameters, based on force convergence tests, the AdBF QM/MM scheme can provide an accurate approximation of the structure in the dynamical QM region matching the corresponding fully QM simulations, as well as reproducing the correct energetics in all cases. Adaptive unbuffered force-mixing and adaptive conventional QM/MM methods also provide reasonable results for some systems, but are more likely to suffer from instabilities and inaccuracies.
    N.B. acknowledges funding for this work by the Office of Naval Research through the Naval Research Laboratory's basic research program, and computer time at the AFRL DoD Supercomputing Resource Center through the DoD High Performance Computing Modernization Program (subproject NRLDC04253428). B.L. was supported by EPSRC (grant no. EP/G036136/1) and the Scottish Funding Council. G.C. and B.L. acknowledge support from EPSRC under grant no. EP/J01298X/1. R.C.W. and A.W.G. acknowledge financial support by the National Institutes of Health (R01 GM100934); A.W.G. acknowledges financial support by the Department of Energy (DE-AC36-99GO-10337). This work was partially supported by the National Science Foundation (grant no. OCI-1148358) and used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant no. ACI-1053575. Computer time was provided by the San Diego Supercomputer Center through XSEDE award TG-CHE130010.
    This is the author accepted version of the article. The final published version is available from Wiley at http://onlinelibrary.wiley.com/doi/10.1002/jcc.23839/full
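    The buffered force-mixing rule at the heart of AdBF can be caricatured in a few lines: atoms in the core QM region take QM forces, atoms in the buffer have their QM forces discarded in favour of MM forces, and all remaining atoms are pure MM. This is an illustrative sketch only, with hypothetical region labels; the real CP2K and AMBER implementations operate on per-atom force arrays from a QM calculation spanning the core plus buffer region:

```python
def mix_forces(f_qm, f_mm, region):
    """Toy buffered force-mixing.

    f_qm, f_mm : per-atom forces from the QM and MM calculations
    region     : per-atom label, one of "core", "buffer", or "mm"
                 (labels are illustrative, not the packages' input format)

    Core atoms keep their QM forces; buffer atoms have their QM forces
    discarded (reducing boundary errors) and use MM forces instead, as
    do atoms outside the QM calculation entirely.
    """
    return [fq if r == "core" else fm
            for fq, fm, r in zip(f_qm, f_mm, region)]
```

    In an adaptive scheme the region labels would be recomputed every few steps as atoms diffuse across the boundary, which is why the abstract's redefinition of the QM and MM regions during the simulation is needed.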

    Supersymmetric Gauge Theories, Intersecting Branes and Free Fermions

    We show that various holomorphic quantities in supersymmetric gauge theories can be conveniently computed by configurations of D4-branes and D6-branes. These D-branes intersect along a Riemann surface that is described by a holomorphic curve in a complex surface. The resulting I-brane carries two-dimensional chiral fermions on its world-volume. This system can be mapped directly to the topological string on a large class of non-compact Calabi-Yau manifolds. Inclusion of the string coupling constant corresponds to turning on a constant B-field on the complex surface, which makes this space non-commutative. Including all string loop corrections the free fermion theory is elegantly formulated in terms of holonomic D-modules that replace the classical holomorphic curve in the quantum case.
    Comment: 67 pages, 6 figures

    Towards Machine Wald

    The past century has seen a steady increase in the need of estimating and predicting complex systems and making (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed \emph{by humans} because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when faced with uncertainty is challenging in several major ways: (1) the search for optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity.
    Comment: 37 pages

    Drug-mediated inhibition of Fli-1 for the treatment of leukemia

    The Ets transcription factor Fli-1 is activated in murine erythroleukemia and overexpressed in various human malignancies, including Ewing's sarcoma, induced by the oncogenic fusion protein EWS/Fli-1. Recent studies by our group and others have demonstrated that Fli-1 plays a key role in tumorigenesis, and disrupting its oncogenic function may serve as a potential treatment option for malignancies associated with its overexpression. Herein, we describe the discovery of 30 anti-Fli-1 compounds, characterized into six functional groups. Treatment of murine and human leukemic cell lines with select compounds inhibits Fli-1 protein or mRNA expression, resulting in proliferation arrest and apoptosis. This anti-cancer effect was mediated, at least in part, through direct inhibition of Fli-1 function, as anti-Fli-1 drug treatment inhibited Fli-1 DNA binding to target genes, such as SHIP-1 and gata-1, governing hematopoietic differentiation and proliferation. Furthermore, treatment with select Fli-1 inhibitors revealed a positive relationship between the loss of DNA-binding activity and Fli-1 phosphorylation. Accordingly, anti-Fli-1 drug treatment significantly inhibited leukemogenesis in a murine erythroleukemia model overexpressing Fli-1. This study demonstrates the ability of this drug-screening strategy to isolate effective anti-Fli-1 inhibitors and highlights their potential use for the treatment of malignancies overexpressing this oncogene.

    Chromatin States Accurately Classify Cell Differentiation Stages

    Gene expression is controlled by the concerted interactions between transcription factors and chromatin regulators. While recent studies have identified global chromatin state changes across cell types, it remains unclear to what extent these changes are co-regulated during cell differentiation. Here we present a comprehensive computational analysis by assembling a large dataset containing genome-wide occupancy information of 5 histone modifications in 27 human cell lines (including 24 normal and 3 cancer cell lines) obtained from the public domain, followed by independent analysis at three different representations. We classified the differentiation stage of a cell type based on its genome-wide pattern of chromatin states, and found that our method was able to identify normal cell lines with nearly 100% accuracy. We then applied our model to classify the cancer cell lines and found that each can be unequivocally classified as differentiated cells. The differences can be in part explained by the differential activities of three regulatory modules associated with embryonic stem cells. We also found that the "hotspot" genes, whose chromatin states change dynamically in accordance with the differentiation stage, are not randomly distributed across the genome but tend to be embedded in multi-gene chromatin domains, and that specialized gene clusters tend to be embedded in stably occupied domains.
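    The classification step described above, assigning a cell line to a differentiation stage from its genome-wide chromatin-state pattern, can be illustrated with a minimal nearest-centroid sketch. The stage names and feature vectors here are hypothetical, not the paper's 27-cell-line data or its actual classifier:

```python
def nearest_centroid(sample, centroids):
    """Toy nearest-centroid classifier.

    sample    : chromatin-state frequency vector for one cell line
    centroids : dict mapping a differentiation-stage label to the mean
                genome-wide pattern for that stage (illustrative data)

    Returns the stage whose centroid is closest in Euclidean distance.
    """
    def dist(a, b):
        # Euclidean distance between two feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    return min(centroids, key=lambda stage: dist(sample, centroids[stage]))
```

    Under this kind of scheme, the abstract's observation that cancer cell lines classify as differentiated corresponds to their chromatin-state vectors lying closer to the differentiated centroid than to the stem-cell one.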

    Algorithmic skin: health-tracking technologies, personal analytics and the biopedagogies of digitized health and physical education

    The emergence of digitized health and physical education, or 'eHPE', embeds software algorithms in the organization of health and physical education pedagogies. Particularly with the emergence of wearable and mobile activity trackers, biosensors and personal analytics apps, algorithmic processes have an increasingly powerful part to play in how people learn about their own bodies and health. This article specifically considers the ways in which algorithms are converging with eHPE through the emergence of new health-tracking and biophysical data technologies designed for use in educational settings. The first half of the article provides a conceptual account of how algorithms 'do things' in the social world, and considers how algorithms are interwoven with practices of health tracking. In the second half, three key issues are articulated for further exploration: (1) health tracking as a 'biopedagogy' of bodily optimization based on data-led and algorithmically mediated understandings of the body; (2) health tracking as a form of pleasurable self-surveillance utilizing data analytics technologies to predict future bodily probabilities; and (3) the ways that health tracking produces a body encased in an 'algorithmic skin', connected to a wider 'networked cognitive system'. These developments and issues suggest the need for greater attention to how algorithmic systems are embedded in emerging eHPE technologies and pedagogies.