1,355 research outputs found

    Sistemas caóticos y su aplicación a la encriptación de señales

    Get PDF
    Synchronization and control of chaotic signals is an active research area because of its applications in telecommunications and secure signal transmission [1, 2, 3, 4]. In this work, a communication system based on the synchronization of two chaotic nonlinear systems is presented; each system is modeled by the equations of motion of a driven damped pendulum, and both operate at the same point of the parameter space. Two communication channels were used: the first for the synchronizing signal and the second for the transmitted message. Using two channels solves the sensitivity-to-initial-conditions problem. In the receiver, a feedback loop acting as a proportional controller quickly drives the error between the decoder and encoder states to zero. These two features make the system robust to external perturbing signals such as noise in the communication channels.
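    The scheme described above can be sketched numerically: two identical driven damped pendulums, with the receiver corrected by a proportional term on the transmitted angle. This is a minimal sketch; the parameter values (b, A, ω) and the gain K are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    # Driven damped pendulum: theta'' = -b*theta' - sin(theta) + A*cos(omega*t).
    # Transmitter and receiver share parameters; the receiver adds a proportional
    # correction K*(theta_tx - theta_rx) from the synchronization channel.
    # All numeric values below are illustrative assumptions.
    b, A, omega = 0.5, 1.2, 2.0 / 3.0   # damping, drive amplitude, drive frequency
    K = 5.0                              # proportional controller gain (assumed)
    dt, steps = 0.002, 100000            # simple Euler integration, t_end = 200

    def step(state, t, coupling=0.0):
        th, w = state
        acc = -b * w - np.sin(th) + A * np.cos(omega * t) + coupling
        return np.array([th + w * dt, w + acc * dt])

    tx = np.array([0.1, 0.0])   # transmitter (encoder) initial condition
    rx = np.array([2.0, 0.5])   # receiver (decoder) starts far away
    for n in range(steps):
        t = n * dt
        err = tx[0] - rx[0]     # error fed back through the proportional controller
        tx, rx = step(tx, t), step(rx, t, coupling=K * err)

    # After the transient, the coupling has driven the error to (near) zero.
    print(abs(tx[0] - rx[0]))
    ```

    The error dynamics are approximately a damped oscillator with stiffness K ± 1, so any gain K > 1 forces exponential convergence despite both trajectories being individually chaotic.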

    Rapid bottom-water circulation changes during the last glacial cycle in the coastal low-latitude NE Atlantic

    Get PDF
    Previous paleoceanographic studies along the NW African margin focused on the dynamics of surface and intermediate waters, whereas little attention has been devoted to deep-water masses. Currently, these deep waters consist mainly of North Atlantic Deep Water, part of the Atlantic Meridional Overturning Circulation (AMOC). However, this configuration was altered during periods of AMOC collapse. We present a high-resolution reconstruction of bottom-water ventilation and current evolution off Mauritania from the Last Glacial Maximum into the early Holocene. Applying redox proxies (Mo, U, and Mn) measured on sediments off Mauritania, we describe changes in deep-water oxygenation and infer the evolution of deep-water conditions during millennial-scale climate/oceanographic events in the area. The second half of Heinrich Event 1 and the Younger Dryas were identified as periods of reduced ventilation, coinciding with events of AMOC reduction. We propose that these circulation-weakening events induced deficient deep-water oxygenation in the Mauritanian upwelling region, which, together with increased productivity, promoted reducing conditions and enhanced organic-matter preservation. This is the first time the effect of AMOC collapse in the area has been described at high resolution, broadening our knowledge of basin-wide oceanographic changes associated with rapid climate variability during the last deglaciation.

    Experimental estimation of the dimension of classical and quantum systems

    Full text link
    An overwhelming majority of experiments in classical and quantum physics make a priori assumptions about the dimension of the system under consideration. However, would it be possible to assess the dimension of a completely unknown system only from the results of measurements performed on it, without any extra assumption? The concept of a dimension witness answers this question, as it allows one to bound the dimension of an unknown classical or quantum system in a device-independent manner, that is, only from the statistics of measurements performed on it. Here, we report the experimental demonstration of dimension witnesses in a prepare-and-measure scenario. We use pairs of photons entangled in both polarization and orbital angular momentum to generate ensembles of classical and quantum states of dimensions up to 4. We then use a dimension witness to certify their dimensionality as well as their quantum nature. Our results open new avenues for the device-independent estimation of unknown quantum systems and for applications in quantum information science.

    Processing and characterisation of cermet/hardmetal laminates with strong interfaces

    Get PDF
    Cemented carbides and cermets are potential materials for high-speed machining tools. However, cemented carbides are not chemically stable at high temperature, and cermets present poor fracture toughness. Novel cermet/hardmetal multilayer systems show great potential for this intended application: the right balance of the required thermomechanical properties could be achieved by using cermet as temperature-protective outer layers and hardmetal as reinforcement layers. In this work, preliminary results on the microstructural and mechanical characterisation of a multilayer TiCxN1-x-Co/WC-Co composite densified by hot pressing are presented, with special attention to the properties of the interface. Microstructural observations revealed strong bonding interfaces between cermet and hardmetal layers due to chemical interaction during the sintering process. As a consequence of the different coefficients of thermal expansion of cermet and hardmetal, tensile and compressive biaxial residual stresses of σres,cermet ≈ +260 ± 50 MPa and σres,WC-Co ≈ -350 ± 70 MPa were estimated in the corresponding layers. Microindentation cracks introduced in the cermet layers (the less tough material) and propagated transversely to the layers were arrested at the interface, showing the combined effect of toughness and compressive stresses on crack shielding.
    Gobierno de España No. MAT2011-22981; Junta de Andalucía No. P12-TEP-262

    The OTELO survey. A case study of [O III] λ4959,5007 emitters at <z> = 0.83

    Full text link
    The OTELO survey is a very deep, blind exploration of a selected region of the Extended Groth Strip and is designed for finding emission-line sources (ELSs). The survey design, observations, data reduction, astrometry, and photometry, as well as the correlation with ancillary data used to obtain a final catalogue, including photo-z estimates and a preliminary selection of ELSs, were described in a previous contribution. Here, we aim to determine the main properties and luminosity function (LF) of the [O III] ELS sample of OTELO as a scientific demonstration of its capabilities, advantages, and complementarity with respect to other surveys. The selection and analysis procedures for ELS candidates obtained using tunable filter (TF) pseudo-spectra are described. We performed simulations in the parameter space of the survey to obtain emission-line detection probabilities. Relevant characteristics of the [O III] emitters and of the LF([O III]), including the main selection biases and uncertainties, are presented. A total of 184 sources were confirmed as [O III] emitters at a mean redshift z = 0.83. The minimum detectable line flux and equivalent width (EW) in this ELS sample are ~5 × 10^-19 erg s^-1 cm^-2 and ~6 Å, respectively. We are able to constrain the faint-end slope (α = -1.03 ± 0.08) of the observed LF([O III]) at z = 0.83. This LF reaches values approximately ten times lower than those from other surveys. The vast majority (84%) of the morphologically classified [O III] ELSs are disc-like sources, and 87% of this sample is comprised of galaxies with stellar masses of M⋆ < 10^10 M⊙.
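    Luminosity functions such as the one constrained above are commonly parametrised with a Schechter function, whose faint end behaves as a power law with slope α. A minimal sketch: only α = -1.03 comes from the abstract; the normalisation phi_star and characteristic luminosity L_star are illustrative placeholders, not OTELO fit values.

    ```python
    import numpy as np

    # Schechter parametrisation of a luminosity function phi(L).
    # Only alpha is taken from the quoted faint-end slope; phi_star and
    # L_star below are placeholder values for illustration.
    def schechter(L, phi_star=1e-3, L_star=1e42, alpha=-1.03):
        """Number density per unit luminosity."""
        x = L / L_star
        return (phi_star / L_star) * x**alpha * np.exp(-x)

    # At the faint end (L << L_star) the exponential is ~1, so the LF
    # behaves as a pure power law L**alpha:
    r = schechter(1e39) / schechter(2e39)
    print(r)  # close to (1/2)**alpha ~ 2.04
    ```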

    Big Data: Un nuevo problema computacional

    Get PDF
    The increase in computing power allows tasks that were previously unfeasible: simulation of natural processes; storage of geographic, economic, and multimedia data; social intelligence; etc. As a counterpart, the volume of data generated by this kind of application can grow until the computational cost of processing and analysing it with current tools becomes so high that it turns back into an intractable problem. This problem is known by the term 'Big Data'. According to a recent IBM study: "More than 2.5 quintillion bytes are produced every day, to the point that 90% of the world's data has been created during the last 2 years." If this growth rate continues, more data will soon be generated than can be analysed, reducing its usefulness while the cost and the risk of information loss increase. The problem is even more acute in applications that process information in real time, where the value of the data lies in its currency. In this kind of application, efficient data management and processing are of vital importance. The most common difficulties lie in capture, analysis, storage, search, sharing, and visualisation. The techniques currently available for tackling this problem are not effective, so the 'Big Data' problem remains open. The problem has a great impact in many domains: social networks, video applications, mobile devices, websites, astrophysics laboratories, scientific simulation, customer data capture (electricity and telephone companies, etc.).
    Therefore, the potential benefits of efficient processing and management of large volumes of data, and its broad applicability across many fields, make this a very attractive topic on which numerous companies and research groups are currently working. From a research standpoint it is a relatively recent problem, so solutions and literature are still scarce compared with other fields. This article reviews the techniques currently in use and the main lines being followed to develop future solutions.

    Galaxy classification: deep learning on the OTELO and COSMOS databases

    Get PDF
    Context. The accurate classification of hundreds of thousands of galaxies observed in modern deep surveys is imperative if we want to understand the universe and its evolution. Aims. Here, we report the use of machine learning techniques to classify early- and late-type galaxies in the OTELO and COSMOS databases using optical and infrared photometry and available shape parameters: either the Sérsic index or the concentration index. Methods. We used three classification methods for the OTELO database: 1) u-r color separation, 2) linear discriminant analysis using u-r and a shape parameter, and 3) a deep neural network using the r magnitude, several colors, and a shape parameter. We analyzed the performance of each method by sample bootstrapping and tested the performance of our neural network architecture using COSMOS data. Results. The accuracy achieved by the deep neural network is greater than that of the other classification methods, and it can also operate with missing data. Our neural network architecture is able to classify both the OTELO and COSMOS datasets regardless of small differences in the photometric bands used in each catalog. Conclusions. In this study we show that the use of deep neural networks is a robust method to mine the cataloged data.
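    Method 2 above (linear discriminant analysis on u-r color plus a shape parameter) can be illustrated in a few lines. This sketch uses invented synthetic feature distributions as stand-ins for OTELO photometry (early types assumed redder with higher Sérsic index), not real catalog data.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Synthetic stand-in features (assumption: early types are redder in u-r
    # and have higher Sersic indices than late types). Not OTELO data.
    rng = np.random.default_rng(0)
    n = 500
    early = np.column_stack([rng.normal(2.6, 0.3, n),    # u-r color
                             rng.normal(4.0, 1.0, n)])   # Sersic index
    late = np.column_stack([rng.normal(1.4, 0.3, n),
                            rng.normal(1.2, 0.5, n)])
    X = np.vstack([early, late])
    y = np.array([0] * n + [1] * n)   # 0 = early type, 1 = late type

    # LDA finds the linear boundary that best separates the two classes.
    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(f"training accuracy: {clf.score(X, y):.2f}")
    ```

    With well-separated classes like these, LDA is nearly as good as a deep network; the paper's point is that the network wins on real data with overlapping classes and missing entries.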

    Measurement of the broadband complex permittivity of soils in the frequency domain with a low-cost vector network analyzer and an open-ended coaxial probe

    Get PDF
    The performance of the nanoVNA, a low-cost, open-source handheld vector network analyzer (VNA), was evaluated. The instrument measures the complex permittivity of dielectric media from 1-port reflection parameters over the 1–900 MHz bandwidth. We manufactured an open-ended coaxial probe from an SMA-N coaxial adapter to perform dielectric measurements. The accuracy of the nanoVNA was comparable to that of a commercial VNA between 1 and 500 MHz according to tests in reference organic liquids, while a lack of stability was found beyond 700 MHz. The self-manufactured open-ended coaxial probe was subjected to a finite element method (FEM) analysis, and its electromagnetic (EM) field penetration depth was determined to be 1.5 mm at 100 MHz, decreasing to 1.3 mm at 900 MHz, thus demonstrating a frequency-dependent support volume. The broadband complex permittivity of three mineral soils of varied textures was obtained for a range of bulk densities and water contents from dry to water-saturated conditions. The dielectric response of the soils approximated the well-known Topp et al. (1980) equation at high frequencies. At lower frequencies, however, higher permittivities were exhibited due to dielectric dispersion, which emphasizes the importance of the operating frequency of EM-based soil moisture sensors when considering sensor calibration or comparing the responses of different sensors.
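    Converting a 1-port reflection parameter (S11) into complex permittivity is often done with the lumped-capacitance model of an open-ended coaxial probe. A minimal sketch of that inversion follows; whether the paper uses this exact model is an assumption, and the aperture capacitance C0 is a placeholder value, not a calibrated one.

    ```python
    import numpy as np

    # Lumped-capacitance model of an open-ended coaxial probe: the fringing
    # field is modeled as a capacitance C0*eps_r terminating the line, so
    # Y = j*omega*C0*eps_r and Gamma = (1 - Z0*Y) / (1 + Z0*Y).
    Z0 = 50.0       # line characteristic impedance, ohms
    C0 = 0.03e-12   # probe aperture capacitance in air, farads (assumed)

    def permittivity(s11, f_hz):
        """Complex relative permittivity from a 1-port reflection coefficient."""
        w = 2 * np.pi * f_hz
        return (1 - s11) / (1j * w * Z0 * C0 * (1 + s11))

    # Round trip: take an assumed permittivity (roughly water near 100 MHz),
    # compute the reflection the model predicts, then invert it back.
    eps_true = 78.4 - 2.0j
    f = 100e6
    Y = 1j * 2 * np.pi * f * C0 * eps_true
    s11 = (1 - Z0 * Y) / (1 + Z0 * Y)
    print(permittivity(s11, f))  # recovers eps_true
    ```

    In practice the probe is first calibrated with open, short, and a reference liquid, which absorbs C0 and connector effects into the model.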

    Development of a LAMP assay for detection of Leishmania infantum infection in dogs using conjunctival swab samples

    Get PDF
    Background: Leishmania infantum infections in dogs play a crucial role in the transmission of pathogens causing visceral leishmaniasis to humans in Gansu province, northwest China. To be able to control zoonotic transmission of the parasite to humans, a non-invasive loop-mediated isothermal amplification (LAMP) assay to specifically detect L. infantum infections in dogs was developed. Methods: The primers used in the LAMP assay were designed to target kinetoplast DNA minicircle sequences of the L. infantum isolate MCAN/CN/90/SC and were tested using DNA isolated from promastigotes of different Leishmania species. The LAMP assay was evaluated with conjunctival swab samples obtained from 111 dogs living in a region of Gansu province endemic for zoonotic visceral leishmaniasis and 33 dogs from a non-endemic region. The LAMP assay was also compared with conventional PCR, ELISA, and microscopy using conjunctival swab, serum, and bone marrow samples from the dogs, respectively. Results: The LAMP assay detected 1 fg of L. infantum DNA purified from cultured promastigotes, which was 10-fold more sensitive than a conventional PCR test using Leishmania genus-specific primers. No cross-reaction was observed with DNA isolated from promastigotes of L. donovani, L. major, L. tropica, and L. braziliensis, or with the L. infantum reference strain MHOM/TN/80/IPT1. The L. infantum-positive rates obtained for field-collected samples were 61.3%, 58.6%, 40.5%, and 10.8% by LAMP, PCR, ELISA, and microscopy, respectively. As only one of the 33 samples from control dogs from the non-endemic region was positive by the LAMP assay and the PCR test, the observed true negative rate (specificity) was 97% for both methods. Conclusion: This study has shown that the non-invasive, conjunctival swab-based LAMP assay developed here was more sensitive in the detection of leishmaniasis in dogs than PCR, ELISA, and microscopy. The findings indicate that the LAMP assay is a sensitive and specific method for the field surveillance of domestic dogs, particularly of asymptomatic animals, in ZVL-endemic areas of western China.

    LeishVet guidelines for the practical management of canine leishmaniosis

    Get PDF
    The LeishVet group has formulated recommendations designed primarily to help the veterinary clinician in the management of canine leishmaniosis. The complexity of this zoonotic infection and the wide range of its clinical manifestations, from inapparent infection to severe disease, make the management of canine leishmaniosis challenging. The recommendations were constructed by combining a comprehensive review of evidence-based studies, extensive clinical experience, and critical consensus opinion discussions. The guidelines, presented here in a short version with graphical topic displays, suggest standardized and rational approaches to the diagnosis, treatment, follow-up, control, and prevention of canine leishmaniosis. A staging system that divides the disease into four stages is aimed at assisting the clinician in determining the appropriate therapy, forecasting prognosis, and implementing the follow-up steps required for the management of the leishmaniosis patient.