
    Keeping watch over Colombia’s slumbering volcanoes

    The Volcanological and Seismological Observatories of Manizales, Pasto and Popayán (Colombian Geological Survey) monitor and study the active volcanoes of Colombia using seismological, geodetic, geochemical and other techniques. Since 2009, permanent GNSS stations have been installed to complement classical geodetic measurements (e.g., tilt, EDM). At present, a total of 20 GNSS stations are installed at Nevado del Ruiz, Cerro Machín, Puracé and Galeras volcanoes. Nevado del Ruiz has remained the most dynamic of the active Colombian volcanoes since its tragic eruption of 13 November 1985. The most significant deformation occurred between 2007 and 2012, when inflation, associated with magma migration and several small to moderate explosive eruptions in 2012 (VEI ≤ 3), was observed. Galeras has experienced more than 25 moderate Vulcanian eruptions (VEI ≤ 3) since 1989. In particular, the deformation network detected significant signals associated with magma migration and the extrusion of lava domes in 1991, 2005, 2008 and 2012. Puracé volcano has been the site of more than 10 minor eruptive episodes (VEI = 2) in the past century, most recently in 1977. Monitoring of this volcano started in 1994. Unrest at Puracé since that time has been characterized by significant increases in seismic activity but with little or no deformation. We employ GAMIT/GLOBK to process GPS data from the monitoring network with support from the Volcano Disaster Assistance Program (U.S. Geological Survey). Additionally, differential processing is carried out using the commercial package Trimble 4D Control. Preliminary results for 2012 show no significant deformation at Puracé and Galeras volcanoes. On the other hand, the time series from Nevado del Ruiz shows minor inflation (2-4 cm/yr) associated with the eruptive activity of 2012.
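    The inflation rate quoted above comes from fitting a trend to daily GNSS position time series. As an illustration only, the following minimal Python sketch fits a linear velocity (and its formal uncertainty) to a synthetic vertical-position series by ordinary least squares; it is not the GAMIT/GLOBK or Trimble 4D Control processing described in the abstract, and all data and values are hypothetical.

```python
import numpy as np

def linear_velocity(t_years, pos_mm):
    """Fit position = a + v * t by ordinary least squares; return v (mm/yr) and its 1-sigma error."""
    A = np.column_stack([np.ones_like(t_years), t_years])
    coef, *_ = np.linalg.lstsq(A, pos_mm, rcond=None)
    resid = pos_mm - A @ coef
    sigma2 = resid @ resid / (len(t_years) - 2)   # residual variance (white-noise assumption)
    cov = sigma2 * np.linalg.inv(A.T @ A)
    return coef[1], np.sqrt(cov[1, 1])

# Synthetic example: two years of daily vertical positions with ~30 mm/yr of uplift
t = np.linspace(2011.0, 2013.0, 730)
up = 5.0 + 30.0 * (t - t[0]) + np.random.normal(0.0, 3.0, t.size)
v, sv = linear_velocity(t, up)
print(f"vertical velocity: {v:.1f} +/- {sv:.1f} mm/yr")
```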

    Spatial and temporal analysis of the seasonal and interannual variability in the tropical Pacific simulated with a coupled GCM

    In the first part of this work, the dominant time scales that explain the tropical variability of the first SINTEX simulation (ECHAM4(T30)-ORCA) are identified through a spectral analysis. Higher-order spectral analysis is used to examine the interactions among these time scales. The time series analyzed are an average of sea surface temperature over the Niño3 region. The time scales obtained are compared with those identified in another coupled GCM simulation (ECHAM4(T42)-OPYC3). The greater importance of the biannual time scale in the latter simulation is explained partly by the strength of the coupling between the annual and the biannual time scales; there is no such strong coupling in the SINTEX simulation. Important differences in the generation of the simulated warm (or cold) events suggest the need for a systematic classification to isolate their relevant features. Therefore, in the second part of this work, we address this problem. A space-time cluster analysis is performed on a data set built by collecting the values of the heat content anomalies in the tropical Pacific region in the fifteen months preceding a peak in the Niño3 Index that has been identified as a ‘warm’ (or ‘cold’) event. In the case of the warm events, three types of generation schemes are found. In two of them, there are anomalies of heat content in the west, north and south of the equator more than nine months before the events start. In the third case, the anomalies appear and grow in the central equatorial Pacific. Only two types are needed to classify the generation of cold events. Negative sea level height anomalies appear six months before the Niño3 Index reaches the (local) minimum. They are located north of the equator in one of the groups, and south of it in the other. Some of these characteristic traits also appear in observations of warm and cold events.
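    As a purely illustrative companion to the spectral analysis described above, the following Python sketch estimates dominant time scales of a synthetic Niño3-like index with an ordinary periodogram (scipy.signal.periodogram). It does not reproduce the higher-order spectral or cluster analyses of the study; the series, amplitudes and periods are invented for the example.

```python
import numpy as np
from scipy.signal import periodogram

# Synthetic monthly Nino3-like index: annual, biennial and ~4-yr components plus noise
months = np.arange(12 * 100)                 # 100 years of monthly values
t_yr = months / 12.0
nino3 = (1.0 * np.sin(2 * np.pi * t_yr)        # annual cycle
         + 0.5 * np.sin(2 * np.pi * t_yr / 2)  # biennial component
         + 0.8 * np.sin(2 * np.pi * t_yr / 4)  # ENSO-like ~4-yr component
         + 0.3 * np.random.randn(months.size))

# Periodogram in cycles per year (fs = 12 samples per year)
freq, power = periodogram(nino3, fs=12.0, detrend='linear')

# Report the three strongest spectral peaks and the periods they correspond to
for i in sorted(power.argsort()[::-1][:3]):
    if freq[i] > 0:
        print(f"period ~ {1.0 / freq[i]:.2f} yr, power {power[i]:.2f}")
```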

    Quantitative analysis of numerical estimates for the permeability of porous media from lattice-Boltzmann simulations

    During the last decade, lattice-Boltzmann (LB) simulations have been improved to become an efficient tool for determining the permeability of porous media samples. However, well-known improvements of the original algorithm are often not implemented. These include, for example, multirelaxation-time schemes or improved boundary conditions, as well as different possibilities to impose a pressure gradient. This paper shows that significant differences in the calculated permeabilities can be found unless one uses a carefully selected setup. We present a detailed discussion of possible simulation setups and quantitative studies of the influence of simulation parameters. We illustrate our results by applying the algorithm to a Fontainebleau sandstone and by comparing our benchmark studies to other numerical permeability measurements in the literature. Comment: 14 pages, 11 figures.
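    The permeability in such studies is obtained by post-processing the simulated LB flow field with Darcy's law. The sketch below shows, under stated assumptions (BGK collision operator, lattice units with unit density, c_s^2 = 1/3), how a permeability estimate could be extracted from a velocity field and an imposed pressure gradient; the function and parameter names are hypothetical and this is not the specific setup benchmarked in the paper.

```python
import numpy as np

def permeability_lattice_units(ux, solid, dp_dx, tau):
    """Estimate permeability from an LB velocity field via Darcy's law (lattice units).

    ux     : 3-D array of the velocity component along the pressure gradient
    solid  : boolean array of the same shape, True where a node is solid
    dp_dx  : imposed pressure gradient
    tau    : BGK relaxation time; kinematic viscosity nu = (tau - 0.5) / 3
    """
    nu = (tau - 0.5) / 3.0              # lattice kinematic viscosity (c_s^2 = 1/3)
    ux = np.where(solid, 0.0, ux)       # no flow inside the solid matrix
    q = ux.mean()                       # superficial (Darcy) velocity over the full cross-section
    # Darcy's law: q = -(k / mu) * dp/dx, with mu = rho * nu and rho = 1 in lattice units
    return -q * nu / dp_dx

# Hypothetical usage on a 128^3 sample driven by a small pressure gradient:
# k = permeability_lattice_units(ux, solid, dp_dx=-1e-5, tau=0.9)
# k_physical = k * dx**2   # convert to m^2 using the lattice spacing dx
```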

    Galaxy properties from J-PAS narrow-band photometry

    We study the consistency of the physical properties of galaxies retrieved from SED-fitting as a function of spectral resolution and signal-to-noise ratio (SNR). Using a selection of physically motivated star formation histories, we set up a control sample of mock galaxy spectra representing observations of the local universe in high-resolution spectroscopy, and in 56 narrow-band and 5 broad-band photometric filters. We fit the SEDs at these spectral resolutions and compute the corresponding stellar mass, the mass- and luminosity-weighted age and metallicity, and the dust extinction. We study the biases, correlations, and degeneracies affecting the retrieved parameters and explore the role of the spectral resolution and the SNR in regulating these degeneracies. We find that narrow-band photometry and spectroscopy yield similar trends in the physical properties derived, the former being considerably more precise. Using a galaxy sample from the SDSS, we compare more realistically the results obtained from high-resolution and narrow-band SEDs (synthesized from the same SDSS spectra) following the same spectral fitting procedures. We use results from the literature as a benchmark for our spectroscopic estimates and show that the prior PDFs, commonly adopted in parametric methods, may introduce biases not accounted for in a Bayesian framework. We conclude that narrow-band photometry yields the same trend in the age-metallicity relation as in the literature, provided it is affected by the same biases as spectroscopy, although the precision achieved with the latter is generally twice as large as with the narrow-band data, at SNR values typical of the different kinds of data. Comment: 26 pages, 15 figures. Accepted for publication in MNRAS.
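    For readers unfamiliar with SED-fitting, the following minimal sketch illustrates the basic idea of matching observed narrow-band fluxes to a grid of model spectra by chi-square minimization with an analytic scale factor. It is a toy example with synthetic templates and fluxes, not the Bayesian, PDF-based machinery used in the study.

```python
import numpy as np

def fit_sed(obs_flux, obs_err, templates):
    """Pick the best-fitting model SED by chi-square minimization.

    obs_flux  : (n_bands,) observed narrow-band fluxes
    obs_err   : (n_bands,) flux uncertainties
    templates : (n_models, n_bands) model fluxes, one row per (age, metallicity, ...) combination
    Returns the index of the best model, its optimal scaling (~ stellar mass), and the chi-square.
    """
    w = 1.0 / obs_err**2
    # analytic best scale of each template: a = sum(w*T*f) / sum(w*T^2)
    scale = (templates * obs_flux * w).sum(axis=1) / (templates**2 * w).sum(axis=1)
    chi2 = (((scale[:, None] * templates - obs_flux) ** 2) * w).sum(axis=1)
    best = int(np.argmin(chi2))
    return best, scale[best], chi2[best]

# Hypothetical usage: a grid of 200 synthetic templates observed in 56 narrow bands
rng = np.random.default_rng(0)
templates = np.abs(rng.normal(1.0, 0.3, size=(200, 56)))
obs = 3.0 * templates[42] + rng.normal(0.0, 0.05, 56)
best, scale, chi2 = fit_sed(obs, np.full(56, 0.05), templates)
print(best, round(scale, 2), round(chi2, 1))
```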

    Quantification of the performance of chaotic micromixers on the basis of finite time Lyapunov exponents

    Chaotic micromixers such as the staggered herringbone mixer developed by Stroock et al. allow efficient mixing of fluids even at low Reynolds number by repeated stretching and folding of the fluid interfaces. The ability of the fluid to mix well depends on the rate at which "chaotic advection" occurs in the mixer. Optimizing mixer geometries is a non-trivial task which is often performed by time-consuming and expensive trial-and-error experiments. In this paper an algorithm is presented that applies the concept of finite-time Lyapunov exponents to obtain a quantitative measure of the chaotic advection of the flow and hence the performance of micromixers. By performing lattice-Boltzmann simulations of the flow inside a mixer geometry, introducing massless and non-interacting tracer particles, and following their trajectories, the finite-time Lyapunov exponents can be calculated. The applicability of the method is demonstrated by a comparison of the improved geometrical structure of the staggered herringbone mixer with available literature data. Comment: 9 pages, 8 figures.
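    A finite-time Lyapunov exponent can be estimated from the divergence of initially neighboring tracer pairs, lambda = (1/T) ln(|delta(T)| / |delta(0)|). The sketch below illustrates this on a toy analytic 2-D velocity field standing in for the lattice-Boltzmann flow solution; the field, step sizes and seed point are assumptions made for the example and are not taken from the paper.

```python
import numpy as np

def advect(pos, velocity, dt, steps):
    """Advect a massless tracer with simple forward-Euler integration."""
    for _ in range(steps):
        pos = pos + dt * velocity(pos)
    return pos

def ftle(seed, velocity, dt, steps, eps=1e-6):
    """Finite-time Lyapunov exponent from the separation of an initially close tracer pair."""
    T = dt * steps
    a = advect(seed, velocity, dt, steps)
    b = advect(seed + np.array([eps, 0.0]), velocity, dt, steps)
    return np.log(np.linalg.norm(b - a) / eps) / T

def velocity(p):
    """Toy 2-D cellular flow standing in for the simulated mixer flow field."""
    x, y = p
    return np.array([-np.pi * np.sin(np.pi * x) * np.cos(np.pi * y),
                      np.pi * np.cos(np.pi * x) * np.sin(np.pi * y)])

print(ftle(np.array([0.3, 0.2]), velocity, dt=1e-3, steps=2000))
```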

    Lack of replication of interactions between polymorphisms in rheumatoid arthritis susceptibility: case-control study

    Introduction: Approximately 100 loci have been definitively associated with rheumatoid arthritis (RA) susceptibility. However, they explain only a fraction of RA heritability. Interactions between polymorphisms could explain part of the remaining heritability. Multiple interactions have been reported, but only the shared epitope (SE) × protein tyrosine phosphatase nonreceptor type 22 (PTPN22) interaction has been replicated convincingly. Two recent studies deserve attention because of their quality, including their replication in a second sample collection. In one of them, researchers identified interactions between PTPN22 and seven single-nucleotide polymorphisms (SNPs). The other showed interactions between the SE and the null genotype of glutathione S-transferase Mu 1 (GSTM1) in the anti-cyclic citrullinated peptide-positive (anti-CCP+) patients. In the present study, we aimed to replicate the association with RA susceptibility of the interactions described in these two high-quality studies. Methods: A total of 1,744 patients with RA and 1,650 healthy controls of Spanish ancestry were studied. Polymorphisms were genotyped by single-base extension. SE genotypes of 736 patients were available from previous studies. Interaction analysis was done using multiple methods, including those originally reported and the most powerful methods described. Results: Genotypes of one of the SNPs (rs4695888) failed quality control tests. The call rate for the other eight polymorphisms was 99.9%. The frequencies of the polymorphisms were similar in RA patients and controls, except for the PTPN22 SNP. None of the interactions between the PTPN22 SNP and the six SNPs that met quality control tests was replicated, either as a significant interaction term (the originally reported finding) or with any of the other methods. Nor was the interaction between GSTM1 and the SE replicated, either as a departure from additivity in anti-CCP+ patients or with any of the other methods. Conclusions: None of the interactions tested were replicated in spite of sufficient power and assessment with different assays. These negative results indicate that whether interactions are significant contributors to RA susceptibility remains unknown and that strict standards need to be applied to claim that an interaction exists.
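    One common way to test a multiplicative interaction in a case-control design is to include a product term in a logistic regression and examine its coefficient. The sketch below does this with statsmodels on simulated genotype data; the variable names, allele frequencies and effect sizes are hypothetical, and the code does not reproduce the specific methods applied in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical case-control data: risk-allele carriage at PTPN22 and at a second SNP
rng = np.random.default_rng(1)
n = 3000
df = pd.DataFrame({
    "ptpn22": rng.binomial(1, 0.15, n),
    "snp":    rng.binomial(1, 0.30, n),
})
# Simulate disease status with main effects only (no true interaction)
logit_p = -1.0 + 0.6 * df.ptpn22 + 0.2 * df.snp
df["case"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# Multiplicative interaction test: significance of the ptpn22:snp product term
model = smf.logit("case ~ ptpn22 * snp", data=df).fit(disp=0)
print("interaction odds ratio:", np.exp(model.params["ptpn22:snp"]))
print("interaction p-value:   ", model.pvalues["ptpn22:snp"])
```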

    Mathematical and physical techniques of modeling and simulation of pattern recognition in the stock market

    This article presents the analysis of large databases using mathematical and physical techniques. Such databases are now very common, given the large number of variables involved (especially in the information and physics industries) and the amount of information that results from a process, so an analysis is needed that supports responsible decision-making based on scientific criteria; in our case, a database from the forex system is used. Initially, different measures between the samples and their characteristics are studied and computed in order to obtain a good prediction of the data and their behavior, using different classification methods inspired by the basic sciences. We then explain the techniques based on the analysis of data components and the correlations that exist between the variables, a technique widely used in physical processes to determine the correlations between variables.
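    A standard way to analyze the components and correlations mentioned above is principal component analysis (PCA). The following sketch applies scikit-learn's PCA to hypothetical daily returns of a few forex pairs; the data and the number of series are invented for illustration and are not drawn from the database used in the article.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical daily returns for five forex pairs (rows = days, columns = pairs)
rng = np.random.default_rng(7)
common = rng.normal(0.0, 0.004, (500, 1))              # shared "market" factor
returns = common + rng.normal(0.0, 0.002, (500, 5))    # five correlated currency pairs

# Principal component analysis of the correlation structure
pca = PCA()
pca.fit(returns)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("loadings of the first component:", np.round(pca.components_[0], 3))
```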

    Unlocking legal validity. Some remarks on the artificial ontology of law

    Following Kelsen’s influential theory of law, the concept of validity has been used in the literature to refer to different properties of law (such as existence, membership, bindingness, and more), and so it is inherently ambiguous. More importantly, Kelsen’s equivalence between the existence and the validity of law prevents us from accounting satisfactorily for relevant aspects of our current legal practices, such as the phenomenon of ‘unlawful law’. This chapter addresses this ambiguity and argues that the most important function of the concept of validity is constituting the complex ontological paradigm of modern law as an institutional-normative practice. In this sense, validity is an artificial ontological status that supervenes on the existence of legal norms, thus allowing law to regulate its own creation and creating the logical space for the occurrence of ‘unlawful law’. This function, I argue in the last part, is crucial to understanding the relationship between the ontological and epistemic dimensions of the objectivity of law. For, given the necessary practice-independence of legal norms, it is the epistemic accessibility of their creation that enables the law to fulfill its general action-guiding (and thus coordinating) function.