
    Parallel algorithms for normalization

    Given a reduced affine algebra A over a perfect field K, we present parallel algorithms to compute the normalization \bar{A} of A. Our starting point is the algorithm of Greuel, Laplagne, and Seelisch, which is an improvement of de Jong's algorithm. First, we propose to stratify the singular locus Sing(A) in a way that is compatible with normalization, apply a local version of the normalization algorithm at each stratum, and obtain \bar{A} by putting the local results together. Second, in the case where K = Q is the field of rationals, we propose modular versions of the global and local-to-global algorithms. We have implemented our algorithms in the computer algebra system SINGULAR and compare their performance with that of the algorithm of Greuel, Laplagne, and Seelisch. In the case where K = Q, we also discuss the use of modular computations of Groebner bases, radicals, and primary decompositions. We point out that in most examples, the new algorithms outperform the algorithm of Greuel, Laplagne, and Seelisch by far, even if we do not run them in parallel.
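
    The modular strategy follows a pattern that is standard in computer algebra: compute the object modulo several primes (in parallel), combine the modular results by the Chinese remainder theorem, and lift back to Q by rational reconstruction, verifying the lifted result afterwards. As a minimal sketch of this generic lifting machinery only, not of the normalization algorithm itself (which the abstract does not spell out), the following Python fragment reconstructs a single rational coefficient from its modular images:

        # Generic CRT + rational reconstruction: the lifting machinery behind
        # modular algorithms over Q, illustrated on one coefficient only.
        from math import isqrt

        def crt(residues, moduli):
            # Combine residues r_i mod m_i into one residue mod prod(m_i).
            r, m = 0, 1
            for ri, mi in zip(residues, moduli):
                t = (ri - r) * pow(m, -1, mi) % mi
                r, m = r + m * t, m * mi
            return r % m, m

        def rational_reconstruct(r, m):
            # Wang's algorithm: find a/b with |a|, |b| <= sqrt(m/2)
            # and a = b*r (mod m), if such a fraction exists.
            bound = isqrt(m // 2)
            a0, a1, b0, b1 = m, r, 0, 1
            while a1 > bound:
                q = a0 // a1
                a0, a1 = a1, a0 - q * a1
                b0, b1 = b1, b0 - q * b1
            if abs(b1) <= bound and b1 != 0:
                return (a1, b1) if b1 > 0 else (-a1, -b1)
            return None

        # Reconstruct the rational 3/7 from its images modulo three primes.
        primes = [10007, 10009, 10037]
        residues = [3 * pow(7, -1, p) % p for p in primes]
        r, m = crt(residues, primes)
        print(rational_reconstruct(r, m))  # -> (3, 7)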

    Synthetic inversions for density using seismic and gravity data

    Density variations drive mass transport in the Earth, from plate tectonics to convection in the mantle and core. Nevertheless, density remains poorly known, because most geophysical measurements used to probe the Earth's interior either have little sensitivity to density or suffer from trade-offs and non-uniqueness. With the ongoing expansion of computational power, it has become possible to accurately model complete seismic wavefields in a 3-D heterogeneous Earth and to develop waveform inversion techniques that account for complicated wavefield effects. This may help to improve the resolution of density. Here, we present a pilot study in which we explore the extent to which waveform inversion may be used to better recover density as a separate, independent parameter. We perform numerical simulations in 2-D to investigate under which conditions, and to what extent, density anomalies may be recovered in the Earth's mantle. We conclude that density can indeed be constrained by seismic waveforms, mainly as a result of scattering effects at density contrasts. As a consequence, the low-frequency part of the wavefield is the most important for constraining the actual extent of anomalies. While the impact of density heterogeneities on the wavefield is small compared to the effects of velocity variations, it is likely to be detectable in modern regional- to global-scale measurements. We also conclude that the use of gravity data as additional information does not further improve the recovery of density anomalies unless strong a priori constraints on the geometry of density variations are applied. This is a result of the inherent physical non-uniqueness of potential-field inverse problems. Finally, in the limited numerical setup that we employ, we find that the anomalies initially supplied in the S- and P-velocity models are of minor importance.
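
    The non-uniqueness of potential-field data can be made concrete with the textbook shell-theorem example: outside the body, a uniform sphere and a point mass of equal total mass produce exactly the same gravitational attraction, so external gravity measurements alone cannot distinguish these two very different density distributions. A minimal numerical check, with toy values not taken from the study:

        # Shell-theorem illustration of potential-field non-uniqueness: a point
        # mass and a uniform sphere of the same total mass are indistinguishable
        # from outside by their gravity alone.
        G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
        M = 5.0e15               # total mass, kg (arbitrary toy value)
        R = 1.0e3                # sphere radius, m

        def g_point(r: float) -> float:
            return G * M / r**2

        def g_uniform_sphere(r: float) -> float:
            if r >= R:                       # outside: identical to a point mass
                return G * M / r**2
            return G * M * r / R**3          # inside: only the enclosed mass attracts

        for r in (2e3, 5e3, 1e4):            # observation points outside the body
            print(r, g_point(r), g_uniform_sphere(r))   # the pairs agree exactly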

    Grounded Theory - How Models and Theories Are Made from Texts

    Grounded Theory can be rendered as theory building grounded in the subject matter. This already indicates that work in Grounded Theory usually aims at domain-specific results; the goal is not to build universally valid theories and models. With Grounded Theory (GT), the American sociologists Anselm Strauss and Barney Glaser created a comprehensive conception of the process of inquiry and research in the social sciences. It ranges from first ideas for a research question, through the design of a study, the selection of data material, and the analysis and interpretation of data, to the writing of a manuscript. On the basis of research in a given subject area, GT makes it possible to formulate a theory that consists of interlinked concepts (a network) and is suited to describing and explaining the social phenomena under study.

    Extractor for ESI quadrupole TOF tandem MS data enabled for high throughput batch processing

    BACKGROUND: Mass spectrometry based proteomics produces huge amounts of data that have to be processed in real time in order to efficiently feed identification algorithms and to integrate easily into automated environments. We present wiff2dta, a tool created to convert MS/MS data obtained using Applied Biosystems' QStar and QTrap 2000 and 4000 series instruments. RESULTS: Comparing the performance of wiff2dta with the standard tools, we find that wiff2dta is the fastest solution for extracting spectrum data from ABI's raw file format, at least 10% faster than the standard tools. It is also capable of batch processing and can be easily integrated into high-throughput environments. The program is freely available, and can also be obtained from Applied Biosystems. CONCLUSIONS: wiff2dta can run as a stand-alone application or, as a command-line tool within a batch process, be integrated into automated and high-throughput environments. It is more efficient than the state-of-the-art tools currently provided.
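
    As a rough illustration of the kind of batch integration described above, the sketch below walks a directory of .wiff files and calls a converter on each one. The abstract does not give wiff2dta's actual command-line syntax, so the invocation used here is a placeholder assumption to be adapted to the tool's real interface:

        # Hypothetical batch driver: converts every .wiff file in a directory
        # to .dta by shelling out to a converter once per file.
        import subprocess
        from pathlib import Path

        def batch_convert(raw_dir: str, out_dir: str) -> None:
            out = Path(out_dir)
            out.mkdir(parents=True, exist_ok=True)
            for wiff in sorted(Path(raw_dir).glob("*.wiff")):
                target = out / (wiff.stem + ".dta")
                # Placeholder invocation -- wiff2dta's argument syntax is not
                # given in the abstract; adapt this call to the actual interface.
                subprocess.run(["wiff2dta", str(wiff), str(target)], check=True)

        if __name__ == "__main__":
            batch_convert("raw_data", "dta_out")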

    The Effect of Education on the Assessment of Optic Nerve Head Photographs for the Glaucoma Diagnosis

    Background: To evaluate the effect of one lesson of continuing medical education (CME) on the subjective assessment of optic nerve head appearance, measured as sensitivity and specificity for the diagnosis of glaucoma. Methods: Ophthalmologists and residents in ophthalmology attending an international glaucoma meeting at Malmo University Hospital, Malmo, Sweden, were asked to grade optic nerve head (ONH) photographs of healthy and glaucomatous subjects at two sessions separated by a lecture on glaucoma diagnosis by ONH assessment. Each grader had access to an individual portfolio of 50 ONH photographs randomly selected from a web-based data bank including ONH photographs of 73 glaucoma patients and 123 healthy subjects. The individual portfolio of photographs was graded before and after the lecture, but in a different randomized order. Results: Ninety-six doctors, 91% of all attending the meeting, completed both assessment sessions. The proportion of correct classifications increased from 69% to 72% on average. Diagnostic sensitivity increased significantly (p < 0.0001) from 70% to 80%, and the proportion of photographs classified as uncertain decreased significantly (p < 0.0001) from 22% to 13%. Specificity remained at 68%, and intra-grader agreement decreased. Conclusion: CME had only a small effect on ONH assessment for the diagnosis of glaucoma. Sensitivity increased and the share of uncertain classifications decreased, while specificity was unchanged.
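
    For reference, sensitivity and specificity are simple ratios over the grading outcomes: sensitivity is the fraction of glaucomatous discs graded as glaucoma, and specificity is the fraction of healthy discs graded as healthy. A small sketch with invented counts, not the study's data, that happen to reproduce the reported post-lecture percentages:

        def sensitivity(true_pos: int, false_neg: int) -> float:
            # Fraction of diseased (glaucomatous) cases correctly called positive.
            return true_pos / (true_pos + false_neg)

        def specificity(true_neg: int, false_pos: int) -> float:
            # Fraction of healthy cases correctly called negative.
            return true_neg / (true_neg + false_pos)

        # Invented counts for one grader's 50-photo portfolio: 25 glaucomatous
        # photos of which 20 are graded glaucoma, and 25 healthy photos of
        # which 17 are graded healthy.
        print(f"sensitivity = {sensitivity(20, 5):.0%}")   # 80%
        print(f"specificity = {specificity(17, 8):.0%}")   # 68%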

    Efficient generation of neural stem cell-like cells from adult human bone marrow stromal cells

    Clonogenic neural stem cells (NSCs) are self-renewing cells that maintain the capacity to differentiate into brain-specific cell types, and may also replace or repair diseased brain tissue. NSCs can be isolated directly from fetal or adult nervous tissue, or derived from embryonic stem cells. Here, we describe the efficient conversion of adult human bone marrow stromal cells (hMSC) into a neural stem cell-like population (hmNSC, for human marrow-derived NSC-like cells). These cells grow in neurosphere-like structures and express high levels of early neuroectodermal markers, such as the proneural genes NeuroD1, Neurog2 and Msi1 as well as Otx1 and nestin, but lose the characteristics of mesodermal stromal cells. In the presence of selected growth factors, hmNSCs can be differentiated into the three main neural phenotypes: astroglia, oligodendroglia and neurons. Clonal analysis demonstrates that individual hmNSCs are multipotent and retain the capacity to generate both glia and neurons. Our cell culture system provides a powerful tool for investigating the molecular mechanisms of neural differentiation in adult human NSCs. hmNSCs may therefore ultimately help to treat acute and chronic neurodegenerative diseases.

    Particle Physics Implications for CoGeNT, DAMA, and Fermi

    Recent results from the CoGeNT collaboration (as well as the annual modulation reported by DAMA/LIBRA) point toward dark matter with a light mass (5-10 GeV) and a relatively large elastic scattering cross section with nucleons (\sigma ~ 10^{-40} cm^2). In order to possess this cross section, the dark matter must communicate with the Standard Model through mediating particles with small masses and/or large couplings. In this Letter, we explore in a model-independent way the particle physics scenarios that could potentially accommodate these signals. We also discuss how such models could produce the gamma rays from the Galactic Center observed in the data of the Fermi Gamma-ray Space Telescope. We find multiple particle physics scenarios in which each of these signals can be accounted for, and in which the dark matter can be produced thermally in the early Universe with an abundance equal to the measured cosmological density.
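
    To see why such a cross section constrains the mediator sector, one can invert the standard contact-interaction estimate \sigma ~ (g_\chi g_N)^2 \mu^2 / (\pi m_med^4), where \mu is the dark matter-nucleon reduced mass. The back-of-envelope computation below is our illustration under that assumed parametrization, not the paper's analysis:

        # Rough estimate of the mediator mass implied by sigma ~ 1e-40 cm^2
        # for 5-10 GeV dark matter, using the contact-operator formula above.
        import math

        GEV2_TO_CM2 = 3.894e-28        # 1 GeV^-2 in cm^2, from (hbar*c)^2
        m_dm, m_nucleon = 7.0, 0.939   # GeV; m_dm chosen in the 5-10 GeV window
        sigma_cm2 = 1e-40              # target cross section
        g_product = 1.0                # assumed coupling product g_chi * g_N

        mu = m_dm * m_nucleon / (m_dm + m_nucleon)   # reduced mass, GeV
        sigma_gev = sigma_cm2 / GEV2_TO_CM2          # cross section in GeV^-2
        m_med = (g_product**2 * mu**2 / (math.pi * sigma_gev)) ** 0.25
        print(f"implied mediator mass ~ {m_med:.0f} GeV for unit couplings")
        # ~960 GeV: heavier mediators need larger couplings (g scales as m^2),
        # lighter ones get by with smaller couplings -- hence the abstract's
        # "small masses and/or large couplings".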

    Noise Regularization for Conditional Density Estimation

    Modelling statistical relationships beyond the conditional mean is crucial in many settings. Conditional density estimation (CDE) aims to learn the full conditional probability density from data. Though highly expressive, neural-network-based CDE models can suffer from severe over-fitting when trained with the maximum likelihood objective. Due to the inherent structure of such models, classical regularization approaches in the parameter space are rendered ineffective. To address this issue, we develop a model-agnostic noise regularization method for CDE that adds random perturbations to the data during training. We demonstrate that the proposed approach corresponds to a smoothness regularization and prove its asymptotic consistency. In our experiments, noise regularization significantly and consistently outperforms other regularization methods across seven data sets and three CDE models. The effectiveness of noise regularization makes neural-network-based CDE preferable to previous non- and semi-parametric approaches, even when training data is scarce.
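
    A minimal sketch of the idea (our simplification, not the authors' code): perturb both inputs and targets with Gaussian noise at every training step before evaluating the likelihood. The conditional-Gaussian model below stands in for any neural CDE model exposing a log_prob interface:

        # Noise regularization for a neural CDE model: train on noise-perturbed
        # copies of the data instead of the raw samples.
        import torch
        import torch.nn as nn

        class CondGaussian(nn.Module):
            # Toy CDE model: conditional Gaussian with learned mean and log-std.
            def __init__(self, dim_x: int = 1, hidden: int = 32):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Linear(dim_x, hidden), nn.Tanh(), nn.Linear(hidden, 2)
                )

            def log_prob(self, y: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
                mean, log_std = self.net(x).chunk(2, dim=-1)
                return torch.distributions.Normal(mean, log_std.exp()).log_prob(y)

        model = CondGaussian()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        noise_std = 0.1   # smoothing bandwidth; the method's main tuning knob

        x = torch.randn(512, 1)                        # toy training data
        y = torch.sin(3 * x) + 0.3 * torch.randn_like(x)

        for step in range(1000):
            # The core trick: add fresh random perturbations every step.
            x_tilde = x + noise_std * torch.randn_like(x)
            y_tilde = y + noise_std * torch.randn_like(y)
            loss = -model.log_prob(y_tilde, x_tilde).mean()
            opt.zero_grad()
            loss.backward()
            opt.step()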