Cardiac damage biomarkers and heart rate variability following a 118-km mountain race: relationship with performance and recovery
This study aimed to assess the release of cardiac damage biomarkers jointly with cardiac autonomic modulation after a mountain ultramarathon. Such knowledge, and the possible relationship of these markers with race time, is of primary interest for establishing recommendations on athletes’ recovery and return to training following these competitions. Forty-six athletes enrolled in the Penyagolosa Trails CSP115 race (118 km and a total positive elevation of 5439 m) took part in the study. N-terminal pro-brain natriuretic peptide (NT-proBNP) and high-sensitive cardiac troponin T (hs-TNT) concentrations, as well as linear and nonlinear heart rate variability (HRV), were evaluated before and after the race. NT-proBNP and hs-TNT significantly increased post-race; fifty percent of the finishers surpassed the Upper Reference Limit (URL) for hs-TNT, while 87% exceeded the URL for NT-proBNP. Overall and vagally-mediated HRV were diminished, and cardiac autonomic modulation became less complex and more predictable following the race. More pronounced decreases in vagal modulation were associated with higher levels of post-exertional NT-proBNP. Moreover, the rise in hs-TNT and NT-proBNP was greater among faster runners, while pre-race overall and vagally-mediated HRV were correlated with finishing time. Participation in a 118-km ultratrail induces an acute release of cardiac damage biomarkers and a large alteration of cardiac autonomic modulation. Furthermore, faster runners were those who exhibited a greater rise in those cardiac damage biomarkers. In light of these findings, an appropriate recovery period after ultraendurance races appears prudent, and particularly important among better-performing athletes. At the same time, HRV analysis emerges as a promising tool to assess athletes’ readiness to perform at their maximum level in an ultraendurance race.
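Two of the standard linear HRV indices the study relies on (SDNN for overall variability, RMSSD for vagally-mediated variability) can be computed directly from a series of RR intervals. A minimal sketch with made-up interval values (the RR series below is purely illustrative):

```python
import math

def sdnn(rr_ms):
    """Standard deviation of RR intervals: an index of overall HRV."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR differences: vagally-mediated HRV."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR series in milliseconds; a post-race recording would
# typically yield lower RMSSD than a pre-race one
rr = [812, 790, 845, 830, 805, 798, 820]
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```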
The validity of Rhetorical categories in audiovisual culture
Communication, society, and art have evolved as a consequence of cultural change and the development of new technologies. However, in today's dominant audiovisual culture, Classical Rhetoric remains valid and has adapted itself to the new communicative models that integrate visual, linguistic, and acoustic elements, which can be analysed and understood through Cultural Rhetoric.
This work results from research carried out within the METAPHORA project (reference FFI2014-53391-P), funded by the Secretaría de Estado de Investigación, Desarrollo e Innovación.
On the calculation of the Kalker's creep coefficients for non-elliptical contact areas
[EN] FastSim is the most widely used tangential contact method due to its accuracy
and computational efficiency. However, its use is limited to elliptic contact areas, as it needs
results from Kalker’s Linear Theory, a Hertzian contact theory, to obtain the so-called elastic
parameters. This makes FastSim unable to face some of the current railway challenges, such as
wear, corrugation, Rolling Contact Fatigue (RCF), wheel flats, etc. Taking this limitation into
account, in the present work, an alternative methodology to Kalker’s Linear Theory is proposed,
which will enable FastSim to deal with non-Hertzian conditions.
The authors gratefully acknowledge the financial support of the Agencia Estatal de Investigación and the European Regional Development Fund (grant PRE2018-084067 and project TRA2017-84701-R).
Gómez-Bosch, J.; Giner-Navarro, J.; Carballeira, J. (2022). On the calculation of the Kalker's creep coefficients for non-elliptical contact areas. In Proceedings of the YIC 2021 - VI ECCOMAS Young Investigators Conference. Editorial Universitat Politècnica de València. 288-294. https://doi.org/10.4995/YIC2021.2021.12313
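For context, the elastic parameters FastSim needs come from Kalker's Linear Theory, in which the tangential creep forces on an elliptical contact patch are linear in the creepages. A minimal sketch of those linear relations, with purely illustrative (not tabulated) values for the Kalker coefficients Cij, which in reality depend on the ellipse ratio a/b and Poisson's ratio:

```python
import math

def kalker_linear_forces(G, a, b, xi_x, xi_y, phi, C11, C22, C23):
    """Tangential creep forces from Kalker's Linear Theory for an
    elliptical contact patch with semi-axes a, b: the forces are linear
    in the longitudinal (xi_x), lateral (xi_y) and spin (phi) creepages.
    The Cij are Kalker's tabulated creep coefficients."""
    c = math.sqrt(a * b)
    Fx = -G * a * b * C11 * xi_x                          # longitudinal force
    Fy = -G * a * b * C22 * xi_y - G * c**3 * C23 * phi   # lateral force incl. spin
    return Fx, Fy

# Illustrative values in SI units (G: shear modulus of steel; a, b in metres)
Fx, Fy = kalker_linear_forces(G=8.0e10, a=6e-3, b=4e-3,
                              xi_x=1e-4, xi_y=5e-5, phi=0.0,
                              C11=4.0, C22=3.7, C23=1.5)
```

The limitation the paper addresses is precisely that the Cij are only defined for Hertzian (elliptical) patches, which is why an alternative to this theory is needed for non-elliptical contact areas.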
Centralized Log Server
This project consists of a centralized log server, i.e., a machine capable of collecting, storing, and displaying the various 'events' or logs generated by other machines, in order to monitor the general operation of services and analyse possible incidents.
The project runs on a machine with GNU/Linux Ubuntu Server 12, on which an Apache web server was configured with PHP support and MySQL for creating and manipulating databases.
The Rsyslog tool is installed on this machine, allowing both remote sending and receiving of logs. Rsyslog is message-logging software that implements the basic Syslog protocol and extends it with filtering and a flexible configuration; it is a powerful tool that works over both TCP and UDP (the latter used in this project). The service also creates its own MySQL tables, in which the logs are stored and classified.
Finally, a web application built with the Yii framework makes the logs viewable from any browser.
Gómez Navarro, JC. (2014). Servidor de Logs Centralizado. http://hdl.handle.net/10251/43428.
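A minimal sketch of the kind of rsyslog configuration described, in the legacy syntax of the rsyslog versions shipped with Ubuntu Server 12: the `imudp` module receives remote logs over UDP and the `ommysql` module writes every message into the MySQL schema created with rsyslog's bundled `createDB.sql`. Host, database, user, and password are illustrative placeholders:

```
# /etc/rsyslog.conf — minimal sketch (illustrative values)

# Receive remote logs over UDP on the standard syslog port
$ModLoad imudp
$UDPServerRun 514

# Forward every message (*.*) into the 'Syslog' MySQL database
$ModLoad ommysql
*.* :ommysql:localhost,Syslog,rsyslog_user,password
```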
Event selection for dynamical downscaling: a neural network approach for physically-constrained precipitation events
This study presents a new dynamical downscaling strategy for extreme events. It is based on a combination of statistical downscaling of coarsely resolved global model simulations and dynamical downscaling of specific extreme events constrained by the statistical downscaling part. The method is applied to precipitation extremes over the upper Aare catchment, an area in Switzerland characterized by complex terrain. The statistical downscaling part consists of an Artificial Neural Network (ANN) framework trained in a reference period. Thereby, dynamically downscaled precipitation over the target area serves as predictand, and large-scale variables obtained from the global model simulation serve as predictors. Applying the ANN to long-term global simulations produces a precipitation series that acts as a surrogate of the dynamically downscaled precipitation for a longer climate period and is therefore used in the selection of events. These events are then dynamically downscaled with a regional climate model to 2 km. The results show that this strategy is suitable for constraining extreme precipitation events, although some limitations remain: e.g., the method has lower efficiency in identifying extreme events in summer, and the sensitivity of extreme events to climate change is underestimated.
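The statistical part of such a strategy can be sketched as follows, with synthetic data standing in for the real GCM predictors and RCM-downscaled precipitation (all shapes, names, and hyperparameters here are illustrative assumptions, not the study's actual setup):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-ins: 40 "years" of daily large-scale predictors
# (e.g. pressure, humidity, winds) and a precipitation predictand
n_days, n_predictors = 40 * 365, 8
X = rng.normal(size=(n_days, n_predictors))
y = np.maximum(0.0, X @ rng.normal(size=n_predictors) + rng.normal(size=n_days))

# Calibrate the ANN on the reference period
scaler = StandardScaler().fit(X)
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=200, random_state=0)
ann.fit(scaler.transform(X), y)

# Apply it to a long "climate period": the resulting surrogate series
# is used only to select the extreme days, which would then be
# dynamically downscaled with the regional climate model
X_long = rng.normal(size=(500 * 365, n_predictors))
surrogate = ann.predict(scaler.transform(X_long))
extreme_days = np.argsort(surrogate)[-20:]  # indices of the 20 largest events
```

The point of the design is that only the few selected events pay the cost of a 2-km dynamical simulation, while the cheap ANN surrogate screens the full climate period.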
The Role of Aerosol Concentration on Precipitation in a Winter Extreme Mixed-Phase System: The Case of Storm Filomena
Aerosol concentration, size, and composition are fundamental in hydrometeor formation processes. Meteorological models often use prescribed aerosol concentrations and a single substance. In this study, we analyze the role of aerosol concentration, acting both as CCN and IN, in the development of precipitation in a mixed-phase system in numerical weather simulations. To this end, Storm Filomena was selected as the case study. In such a mixed-phase system, the coexistence of supercooled water with ice crystals, as well as the particular existence of a thermal inversion, led to the formation of precipitation in the form of rain, snow, and graupel. Several high-resolution experiments varying the fixed background aerosol concentration, as well as a simulation with an interactive aerosol calculation, were performed by means of the WRF-Chem model, using the same physics suite, domain, and driving conditions. Results show that the total precipitation remains basically unaltered, with maximum changes of 5%; however, the production of snow is heavily modified. The simulation with the maximum prescribed aerosol concentration produced 27% more snow than the interactive aerosol simulation and diminished the graupel (74%) and rain production (28%). This redistribution of precipitation is mainly linked to the fact that, under a fixed ice crystal population, the variation of aerosol concentration translates into changes in the liquid water content and in droplet size and number concentration, thus altering the efficiency of precipitation production. In addition, while modifying the prescribed aerosol concentration produces the same precipitation pattern, with the concentration modulating the precipitation amount, the interactive aerosol calculation leads to a different precipitation pattern due to the spatial and temporal variability induced in the dynamical aerosol distribution.
Pseudo-proxy tests of the analogue method to reconstruct spatially resolved global temperature during the Common Era
This study addresses the possibility of carrying out spatially resolved global reconstructions of annual mean temperature using a worldwide network of proxy records and a method based on the search for analogues. Several variants of the method are evaluated, and their performance is analysed. As a test bed for the reconstruction, the PAGES 2k proxy database (version 1.9.0) is employed as a predictor, the HadCRUT4 dataset is the set of observations used as the predictand and target, and a set of simulations from the PMIP3 ensemble is used as a pool from which to draw analogues and carry out pseudo-proxy experiments (PPEs). The performance of the variants of the analogue method (AM) is evaluated through a series of PPEs of growing complexity, from a perfect-proxy scenario to a realistic one where the pseudo-proxy records are contaminated with noise (white and red) and missing values, mimicking the limitations of actual proxies. Additionally, the method is tested by reconstructing the real observed HadCRUT4 temperature based on the calibration of real proxies. The reconstructed fields reproduce the observed decadal temperature variability. From all the tests, we can conclude that the analogue pool provided by the PMIP3 ensemble is large enough to reconstruct global annual temperatures during the Common Era. Furthermore, the search for analogues based on a metric that minimises the RMSE in real space outperforms the other evaluated metrics, including the search for analogues in the reduced space spanned by the leading empirical orthogonal functions (EOFs). These results show how the AM is able to spatially extrapolate the information of a network of local proxy records to produce a homogeneous, gap-free climate field reconstruction with valuable information in areas barely covered by proxies, and make the AM a suitable tool to produce valuable climate field reconstructions for the Common Era.
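The core of the best-performing variant, an analogue search minimising the RMSE in real space, reduces to a simple nearest-field lookup. A minimal sketch with synthetic data standing in for the model pool and the proxy network (all sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: a pool of simulated temperature fields (playing
# the role of the PMIP3 ensemble) and one target field observed only
# at a handful of proxy locations
n_pool, n_gridpoints = 5000, 200
pool = rng.normal(size=(n_pool, n_gridpoints))                # candidate analogues
proxy_idx = rng.choice(n_gridpoints, size=15, replace=False)  # proxy sites
target = pool[123] + 0.05 * rng.normal(size=n_gridpoints)     # "observed" field

# Analogue search: pick the pool member minimising the RMSE at the
# proxy locations, then use its full field as the reconstruction,
# spatially extrapolating the local proxy information
rmse = np.sqrt(np.mean((pool[:, proxy_idx] - target[proxy_idx]) ** 2, axis=1))
best = int(np.argmin(rmse))
reconstruction = pool[best]
```

Because the selected analogue is a complete, physically consistent model field, the reconstruction is gap-free even where no proxies exist, which is the property the abstract highlights.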
Comparing dynamical and statistical downscaling in palaeoclimate applications
Paper presented at the XI Congreso de la Asociación Española de Climatología, held in Cartagena on 17-19 October 2018.
The coarse spatial resolution of General Circulation Models (GCMs) is a bottleneck that limits their applicability in studies focused on regional to local scales, demanding the implementation of downscaling techniques. In this work, we use a 500-year simulation with a Regional Climate Model (RCM) nested in a GCM to compare dynamical and statistical downscaling, thus evaluating the ability of the latter to mimic the main characteristics of the simulated variables at a lower computational cost. To carry out the statistical downscaling, we use 40 years of data to calibrate a model based on Artificial Neural Networks to reproduce two daily variables of interest: maximum temperature and precipitation. We then use this calibration to extend the series to span the entire 500-year period. The results show a high temporal correlation between the variables obtained by both methods, as well as similar spatial covariability across locations. Specifically, for maximum temperature the correlation is 0.89, albeit with a marked annual cycle and higher values in summer than in winter. For precipitation, the correlation diminishes to 0.62, mainly due to limitations of the neural network in reproducing the most extreme events. These results indicate that, although dynamical downscaling cannot be completely substituted by a statistical approach, the latter is still useful to obtain a first approximation to the behaviour of certain variables, avoiding or minimizing the computational cost that the former entails.
Attributing trends in extremely hot days to changes in atmospheric dynamics
This paper presents a method for attributing regional trends in the frequency of extremely hot days (EHDs) to changes in the frequency of the atmospheric patterns that characterize such extraordinary events. The study is applied to mainland Spain and the Balearic Islands for the extended summers of the period 1958–2008, where significant and positive trends in maximum temperature (Tx) have been reported during the second half of the past century.
This study was supported by the Spanish government and the Fondo Europeo de Desarrollo Regional (FEDER) through the projects SPEQTRES (CGL2011-29672-C02-02) and REPAIR (CGL2014-59677-R). J. P. Montavez also acknowledges the financial support from Fundación Séneca (Ref. 19640/EE/14).