
    Psicología social del trabajo y renta básica universal: Renovando la comprensión de la precariedad laboral

    The advanced technological society entails a profound change in the labor market and in the dynamics of the relationship between the population and work organizations, all of which conditions the current psychological landscape. This article reflects on two interacting elements. The first is the contributions, implications, and impacts that the social psychology of work, as a branch of social psychology, has had and continues to have on the labor and social dynamics described. The second situates universal basic income as an object of study in social psychology, on the assumption that this policy has the potential to radically transform the person-world relationship and thereby call the current labor order into question. In this context, universal basic income emerges as a fundamental object of study for social psychology, contributing new analytical lenses to the discipline.

    Sociedad naturaleza y turismo

    Nature and tourism may seem like the past, but everything is relative: nature as such is not applied directly in tourism. Tourism transforms it, making it friendlier and less risky, which is why some have proposed the opposite approach: designing it from the outset to create a made-to-measure tourism product. Sociedad, naturaleza y turismo brings together a collection of essays presented at the IV Seminario Internacional Nuevas Alternativas del Turismo by a group of social science experts who put forward a set of ideas on the creation and transformation of tourism products.

    Chromatin regulation by Histone H4 acetylation at Lysine 16 during cell death and differentiation in the myeloid compartment

    Mass Reproducibility and Replicability: A New Hope

    This study pushes our understanding of research reliability by reproducing and replicating claims from 110 papers in leading economic and political science journals. The analysis involves computational reproducibility checks and robustness assessments. It reveals several patterns. First, we uncover a high rate of fully computationally reproducible results (over 85%). Second, excluding minor issues like missing packages or broken pathways, we uncover coding errors for about 25% of studies, with some studies containing multiple errors. Third, we test the robustness of the results to 5,511 re-analyses. We find a robustness reproducibility of about 70%. Robustness reproducibility rates are relatively higher for re-analyses that introduce new data and lower for re-analyses that change the sample or the definition of the dependent variable. Fourth, 52% of re-analysis effect size estimates are smaller than the original published estimates and the average statistical significance of a re-analysis is 77% of the original. Lastly, we rely on six teams of researchers working independently to answer eight additional research questions on the determinants of robustness reproducibility. Most teams find a negative relationship between replicators' experience and reproducibility, while finding no relationship between reproducibility and the provision of intermediate or even raw data combined with the necessary cleaning codes

    Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora

    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.

    Separation of track- and shower-like energy deposits in ProtoDUNE-SP using a convolutional neural network

    Liquid argon time projection chamber detector technology provides high spatial and calorimetric resolutions on the charged particles traversing liquid argon. As a result, the technology has been used in a number of recent neutrino experiments, and is the technology of choice for the Deep Underground Neutrino Experiment (DUNE). In order to perform high precision measurements of neutrinos in the detector, final state particles need to be effectively identified, and their energy accurately reconstructed. This article proposes an algorithm based on a convolutional neural network to perform the classification of energy deposits and reconstructed particles as track-like or arising from electromagnetic cascades. Results from testing the algorithm on experimental data from ProtoDUNE-SP, a prototype of the DUNE far detector, are presented. The network identifies track- and shower-like particles, as well as Michel electrons, with high efficiency. The performance of the algorithm is consistent between experimental data and simulation.
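    The core idea of convolution-based track/shower classification can be illustrated with a minimal sketch. This is not the ProtoDUNE-SP network (the abstract does not specify its architecture or weights); the grid, the hand-written kernels, and the normalization below are all illustrative assumptions standing in for learned convolutional filters.

```python
# Toy sketch: score a charge-deposit grid as track-like or shower-like
# by comparing the responses of two small convolution kernels.
# Kernels are hand-picked for illustration; a real CNN learns them.

def conv2d(image, kernel):
    """Valid (no-padding) 2D convolution over nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    s += image[i + di][j + dj] * kernel[di][dj]
            row.append(s)
        out.append(row)
    return out

def ksum(kernel):
    """Sum of kernel weights, used to normalize responses."""
    return sum(sum(row) for row in kernel)

# A straight "track": unit charge along one row of a 5x5 grid.
track = [[1.0 if i == 2 else 0.0 for _ in range(5)] for i in range(5)]

# Line-shaped kernel (track hypothesis) vs isotropic kernel (shower hypothesis).
line_k = [[0, 0, 0], [1, 1, 1], [0, 0, 0]]
blob_k = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]

# Normalized responses at the position centred on the deposit.
line_score = conv2d(track, line_k)[1][1] / ksum(line_k)
blob_score = conv2d(track, blob_k)[1][1] / ksum(blob_k)
label = "track-like" if line_score > blob_score else "shower-like"
```

    The linear deposit fills the line kernel completely but only a third of the isotropic kernel, so the track hypothesis wins; a diffuse cascade would reverse the comparison.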

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
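    The Numba workflow described above amounts to writing a per-pixel accumulation loop in plain Python and compiling it into a CUDA kernel. The sketch below shows the shape of such a loop; the inverse-square charge-sharing model, the function name, and the grid are illustrative assumptions, not the DUNE microphysical model. In the real chain a loop like this would carry a `@numba.cuda.jit` decorator, with one GPU thread handling each pixel; here it runs serially in pure Python.

```python
# Toy sketch of induced-charge accumulation on a pixel grid.
# In the GPU version, the (i, j) loops disappear: each pixel is
# mapped to one CUDA thread by Numba's @cuda.jit machinery.

def accumulate_charge(pixels, depositions, radius=2.0):
    """Spread each ionization deposition (x, y, charge) onto nearby
    pixels with a 1/(1 + r^2) weight inside a cutoff radius.
    The weighting is an illustrative stand-in for the detector response."""
    for (x, y, q) in depositions:
        for i in range(len(pixels)):
            for j in range(len(pixels[0])):
                dx, dy = i - x, j - y
                r2 = dx * dx + dy * dy
                if r2 <= radius * radius:
                    pixels[i][j] += q / (1.0 + r2)
    return pixels

# One unit-charge deposition at the centre of a 5x5 grid.
grid = [[0.0] * 5 for _ in range(5)]
accumulate_charge(grid, [(2, 2, 1.0)])
```

    Because every pixel's update depends only on the deposition list, the iterations are independent, which is exactly what makes the computation embarrassingly parallel and a good fit for one-thread-per-pixel CUDA execution.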

    DUNE Offline Computing Conceptual Design Report

    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate, and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.