
    Proyecto Educativo de Centro: primer borrador (School Educational Project: first draft)

    This publication gathers a set of concepts, reflections, and action guidelines for applying and reviewing the Proyecto Educativo de Centro (PEC, the School Educational Project). Its parts are: 1. Introduction. 2. Reflections. 2.1. The school and institutional planning. 3. The School Educational Project. 3.1. Concept. 3.2. The PEC and curricular development. 4. Constraints on the PEC. 4.1. Legal requirements. 4.2. The socioeconomic and cultural situation of the area in which the school is located. 4.3. School typology. 4.4. Indicators of the school's structure and functioning. 5. Sections that make up the PEC. 5.1. Identity markers. 5.2. Formulation of objectives. 5.3. Specification of a structure: a proposed organization. 6. Credits. (Cantabria, ES)

    Analysis of pallial/cortical interneurons in key vertebrate models of Testudines, Anurans and Polypteriform fishes


    Mortality after surgery in Europe: a 7 day cohort study

    Background: Clinical outcomes after major surgery are poorly described at the national level. Evidence of heterogeneity between hospitals and health-care systems suggests potential to improve care for patients, but this potential remains unconfirmed. The European Surgical Outcomes Study was an international study designed to assess outcomes after non-cardiac surgery in Europe.

    Methods: We did this 7 day cohort study between April 4 and April 11, 2011. We collected data describing consecutive patients aged 16 years and older undergoing inpatient non-cardiac surgery in 498 hospitals across 28 European nations. Patients were followed up for a maximum of 60 days. The primary endpoint was in-hospital mortality. Secondary outcome measures were duration of hospital stay and admission to critical care. We used χ² and Fisher's exact tests to compare categorical variables and the t test or the Mann-Whitney U test to compare continuous variables. Significance was set at p<0·05. We constructed multilevel logistic regression models to adjust for differences in mortality rates between countries.

    Findings: We included 46 539 patients, of whom 1855 (4%) died before hospital discharge. 3599 (8%) patients were admitted to critical care after surgery, with a median length of stay of 1·2 days (IQR 0·9–3·6). 1358 (73%) patients who died were not admitted to critical care at any stage after surgery. Crude mortality rates varied widely between countries (from 1·2% [95% CI 0·0–3·0] for Iceland to 21·5% [16·9–26·2] for Latvia). After adjustment for confounding variables, important differences remained between countries when compared with the UK, the country with the largest dataset (OR range from 0·44 [95% CI 0·19–1·05; p=0·06] for Finland to 6·92 [2·37–20·27; p=0·0004] for Poland).

    Interpretation: The mortality rate for patients undergoing inpatient non-cardiac surgery was higher than anticipated. Variations in mortality between countries suggest the need for national and international strategies to improve care for this group of patients.

    Funding: European Society of Intensive Care Medicine, European Society of Anaesthesiology.
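As a sanity check on the headline figure, the crude in-hospital mortality rate can be recomputed from the reported counts (1855 deaths among 46 539 patients). The sketch below uses a normal-approximation (Wald) interval; the study itself may have used a different interval method, so this is illustrative only.

```python
import math

def crude_rate_ci(deaths, n, z=1.96):
    """Crude event rate with a normal-approximation (Wald) 95% CI."""
    p = deaths / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(p - z * se, 0.0), p + z * se

# Cohort-level figures reported in the abstract.
rate, lo, hi = crude_rate_ci(1855, 46539)
print(f"in-hospital mortality: {rate:.1%} (95% CI {lo:.1%}-{hi:.1%})")
# → in-hospital mortality: 4.0% (95% CI 3.8%-4.2%)
```

This reproduces the reported crude rate of 4%; the country-level comparisons in the study additionally required multilevel adjustment, which a single-proportion interval does not capture.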


    NEOTROPICAL CARNIVORES: a data set on carnivore distribution in the Neotropics

    Mammalian carnivores are considered a key group in maintaining ecological health and can indicate potential ecological integrity in the landscapes where they occur. Carnivores also hold high conservation value, and their habitat requirements can guide management and conservation plans. The order Carnivora has 84 species from 8 families in the Neotropical region: Canidae, Felidae, Mephitidae, Mustelidae, Otariidae, Phocidae, Procyonidae, and Ursidae. Herein, we include published and unpublished data on native terrestrial Neotropical carnivores (Canidae, Felidae, Mephitidae, Mustelidae, Procyonidae, and Ursidae). NEOTROPICAL CARNIVORES is a publicly available data set that includes 99,605 data entries from 35,511 unique georeferenced coordinates. Detection/non-detection and quantitative data were obtained from 1818 to 2018 by researchers, governmental agencies, non-governmental organizations, and private consultants. Data were collected using several methods, including camera trapping, museum collections, roadkill, line transects, and opportunistic records. Peer-reviewed and grey literature in Portuguese, Spanish, and English was incorporated in this compilation. Most of the data set consists of detection data entries (n = 79,343; 79.7%), but it also includes non-detection data (n = 20,262; 20.3%). Of all entries, 43.3% also include count data (n = 43,151). The information available in NEOTROPICAL CARNIVORES will contribute to macroecological, ecological, and conservation questions from multiple spatio-temporal perspectives. As carnivores play key roles in trophic interactions, a better understanding of their distribution and habitat requirements is essential to establish conservation management plans and safeguard the future ecological health of Neotropical ecosystems. Our data paper, combined with other large-scale data sets, has great potential to clarify species distribution and related ecological processes within the Neotropics.
    There are no copyright restrictions and no restrictions on using data from this data paper, as long as the data paper is cited as the source of the information used. We also request that users inform us of how they intend to use the data.
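Detection/non-detection tables of this kind are straightforward to filter with standard tools. The miniature below is a hypothetical stand-in: the column names (species, longitude, latitude, detection, count) are invented for illustration and are not the data paper's actual schema.

```python
import csv
import io

# Hypothetical three-row miniature of a detection/non-detection data set.
raw = """species,longitude,latitude,detection,count
Leopardus pardalis,-55.1,-15.3,1,3
Chrysocyon brachyurus,-47.9,-22.0,1,1
Puma concolor,-70.6,-33.4,0,0
"""

rows = list(csv.DictReader(io.StringIO(raw)))
detections = [r for r in rows if r["detection"] == "1"]
print(f"{len(detections)}/{len(rows)} entries are detections")
# → 2/3 entries are detections
```

The same split on the full data set yields the 79.7% detection / 20.3% non-detection proportions quoted above.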

    Review of the book Paremias e indumentaria en Refranes y Proverbios en Romance (1555) de Hernán Núñez. Análisis paremiológico, etnolingüístico y lingüístico


    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate updated guidelines for monitoring autophagy in different organisms on a regular basis. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways, including apoptosis, not all of them can be used as specific markers for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.

    DUNE Offline Computing Conceptual Design Report

    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves, and to provide computing that achieves the physics goals of the DUNE experiment.
