4,450 research outputs found

    Debates—Stochastic subsurface hydrology from theory to practice: why stochastic modeling has not yet permeated into practitioners?

    This is the peer reviewed version of the following article: [Sanchez-Vila, X., and D. Fernàndez-Garcia (2016), Debates—Stochastic subsurface hydrology from theory to practice: Why stochastic modeling has not yet permeated into practitioners?, Water Resour. Res., 52, 9246–9258, doi:10.1002/2016WR019302], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/2016WR019302/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
    We address modern topics of stochastic hydrogeology in terms of their potential relevance to real modeling efforts at the field scale. While stochastic hydrogeology and numerical modeling have become routine in hydrogeological studies, nondeterministic models have not yet permeated into practice. We point out a number of limitations of stochastic modeling in real applications and comment on the reasons why stochastic models fail to become an attractive alternative for practitioners. We treat separately issues corresponding to flow, conservative transport, and reactive transport. The topics addressed are the emphasis on process modeling, the need for upscaling parameters and governing equations, the relevance of properly accounting for detailed geological architecture in hydrogeological modeling, and the specific challenges of reactive transport. We conclude that the main responsibility for nondeterministic models not yet having permeated into industry can be fully attributed to researchers in stochastic hydrogeology.
    Peer Reviewed. Postprint (author's final draft).

    In vitro study of the effect of RSV in the expression of antioxidant enzyme superoxide dismutase-2 in cultured rat sinoviocites

    Final degree project (UDC.CIE). Biology. Academic year 2016/2017.
    [Abstract] Rheumatoid arthritis is a chronic, autoimmune, inflammatory, and systemic disease affecting 1% of the world's population. It is promoted by an excess production of reactive oxygen species induced by, among others, cytokines such as tumor necrosis factor (TNF-α) or interleukin-1β (IL-1β), which generate oxidative stress in the cells of the joint. In response, these cells increase their levels of enzymes such as superoxide dismutase-2 (SOD2) to combat this stress. Resveratrol is a natural polyphenol credited with numerous benefits, including anti-inflammatory and antioxidant effects. The aim of the present study was to evaluate the effect of resveratrol on SOD2 levels in cultured rat synoviocytes. For this purpose, a culture of rat synoviocytes was prepared and subjected to different conditions, and western blot and immunofluorescence techniques were used to measure SOD2 levels under each condition. The results showed a decrease in enzyme levels in cells preincubated with resveratrol before exposure to proinflammatory cytokines, demonstrating the anti-inflammatory and antioxidant activity of resveratrol.

    Stochastic estimation of hydraulic transmissivity fields using flow connectivity indicator data

    This is the peer reviewed version of the following article: [Freixas, G., D. Fernàndez-Garcia, and X. Sanchez-Vila (2017), Stochastic estimation of hydraulic transmissivity fields using flow connectivity indicator data, Water Resour. Res., 53, 602–618, doi:10.1002/2015WR018507], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/2015WR018507/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
    Most methods for hydraulic test interpretation rely on a number of simplifying assumptions regarding the homogeneity and isotropy of the underlying porous media. This way, the actual heterogeneity of any natural parameter, such as transmissivity (T), is transferred to the corresponding estimates in a way heavily dependent on the interpretation method used. An example is a long-term pumping test interpreted by means of the Cooper-Jacob method, which implicitly assumes a homogeneous, isotropic, confined aquifer. The estimates obtained from this method are not local values, but still have a clear physical meaning: the estimated transmissivity represents a regional-scale effective value, while the log-ratio of the normalized estimated storage coefficient is an indicator of flow connectivity, representative of the scale given by the distance between the pumping and observation wells. In this work we propose a methodology that uses this connectivity indicator, together with sampled local measurements of transmissivity at selected points, to map the expected value of local T using a technique based on cokriging. Since the interpolation involves two variables measured at different support scales, a critical point is the estimation of the covariance and cross-covariance matrices.
    The method is applied to a synthetic field displaying statistical anisotropy, showing that the inclusion of connectivity indicators in the estimation method provides maps that effectively display preferential flow pathways, with direct consequences for solute transport.
    Peer Reviewed. Postprint (published version).
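    The Cooper-Jacob straight-line interpretation mentioned in this abstract can be sketched in a few lines. This is our own minimal illustration (variable names and synthetic numbers are ours, not from the paper): late-time drawdown in a homogeneous, isotropic, confined aquifer follows s(t) = (2.303 Q / 4πT) log10(2.25 T t / r²S), so a semilog fit of drawdown versus log10(t) yields the effective T from the slope and S from the zero-drawdown intercept.

    ```python
    # Minimal sketch of the Cooper-Jacob straight-line method: fit
    #   s(t) = a + b*log10(t)
    # to late-time drawdown data, then T from the slope b and S from the
    # zero-drawdown time t0. Synthetic numbers below are hypothetical.
    import math

    def cooper_jacob(Q, r, times, drawdowns):
        """Estimate (T, S) from late-time pumping-test data.

        Q: pumping rate [m^3/s], r: distance to observation well [m],
        times [s] and drawdowns [m]: matched late-time observations.
        """
        xs = [math.log10(t) for t in times]
        n = len(xs)
        xbar = sum(xs) / n
        ybar = sum(drawdowns) / n
        # ordinary least-squares slope/intercept of s = a + b*log10(t)
        b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, drawdowns)) \
            / sum((x - xbar) ** 2 for x in xs)
        a = ybar - b * xbar
        T = 2.303 * Q / (4.0 * math.pi * b)   # from drawdown per log cycle
        t0 = 10.0 ** (-a / b)                 # time of zero drawdown
        S = 2.25 * T * t0 / r ** 2
        return T, S

    # synthetic check: data generated with known T and S, then recovered
    Q, r, T_true, S_true = 1e-2, 30.0, 5e-3, 1e-4
    times = [600.0 * 2 ** k for k in range(8)]
    drawdowns = [2.303 * Q / (4 * math.pi * T_true)
                 * math.log10(2.25 * T_true * t / (r ** 2 * S_true))
                 for t in times]
    T_est, S_est = cooper_jacob(Q, r, times, drawdowns)
    ```

    Because the synthetic data are exactly linear in log10(t), the fit recovers T and S to machine precision; with real data the estimates inherit the aquifer's heterogeneity, which is precisely the point the paper exploits.
    
    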

    Refactoring software to heterogeneous parallel platforms

    In summary, the papers included in this special issue are representative of the progress achieved by the research community at various levels, from the very high level using parallel patterns to lower levels using, for example, software transactional memory. The integration of GPUs and FPGAs into the landscape is also essential to achieving better performance in different categories of applications. All these innovative research directions will contribute to the long-term goal of refactoring existing applications to new and evolving heterogeneous parallel architectures

    Adaptive Comfort Models Applied to Existing Dwellings in Mediterranean Climate Considering Global Warming

    Comfort analysis of existing naturally ventilated buildings located in mild climates, such as those in Mediterranean zones, offers room for a reduction in present and future energy consumption. In Spain, most of the present building stock was built before energy standards were mandatory, let alone considerations of global warming or adaptive comfort. In this context, this research assesses the adaptive thermal comfort of inhabitants of existing apartment buildings in the south of Spain per EN 15251:2007 and ASHRAE 55-2013. The case study is statistically representative housing built in 1973. On-site monitoring of comfort conditions and computer simulations for present conditions were carried out, clarifying the degree of adaptive comfort at the present time. Additional simulations for 2020, 2050, and 2080 were then performed to check whether these dwellings will be able to provide comfort under changed climate conditions. The study concludes that levels of adaptive comfort can be considered satisfactory in these dwellings at present, but not in the future, when discomfort associated with hot conditions will be recurrent. These results help foresee how existing dwellings, and also dwellers, should adapt to changing environmental conditions
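    The adaptive criterion behind the EN 15251:2007 standard cited above can be sketched briefly. This is our own simplification, not the paper's model: for free-running buildings, EN 15251 relates the comfort temperature to the exponentially weighted running mean of the outdoor temperature as T_comf = 0.33·T_rm + 18.8 °C, with a Category II acceptability band of ±3 K.

    ```python
    # Minimal sketch of the EN 15251 adaptive comfort criterion (our
    # simplification, not the authors' simulation model): the comfort
    # temperature tracks the running-mean outdoor temperature T_rm as
    #   T_comf = 0.33 * T_rm + 18.8   (free-running buildings)
    # and an indoor temperature is acceptable (Category II) within +/- 3 K.

    def comfort_temperature(t_rm):
        """EN 15251 adaptive comfort temperature [deg C] for a
        running-mean outdoor temperature t_rm [deg C]."""
        return 0.33 * t_rm + 18.8

    def is_acceptable(t_indoor, t_rm, band=3.0):
        """Category II acceptability check (+/- 3 K around T_comf)."""
        return abs(t_indoor - comfort_temperature(t_rm)) <= band

    t_comf = comfort_temperature(20.0)   # 0.33*20 + 18.8 = 25.4 deg C
    ok_mild = is_acceptable(26.0, 20.0)  # inside the +/- 3 K band
    ok_hot = is_acceptable(31.0, 20.0)   # hot discomfort: outside the band
    ```

    The key design point of adaptive models is that the acceptability band shifts upward with warmer outdoor conditions, which is why they are more forgiving under the warming scenarios the paper simulates.
    
    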

    A Highly Available Cluster of Web Servers with Increased Storage Capacity

    Proceedings of the Decimoséptimas Jornadas de Paralelismo (17th Workshops on Parallelism) of the Universidad de Castilla-La Mancha, held on 18-20 September 2006 in Albacete.
    Web server scalability has traditionally been addressed by improving software elements or increasing the hardware resources of the server machine. Another approach has been the use of distributed architectures. In such architectures, the file allocation strategy has usually been either full replication or full distribution. In previous works we showed that partial replication offers a good balance between storage capacity and reliability: it offers much higher storage capacity, while reliability may be kept at a level equivalent to that of fully replicated solutions. In this paper we present the architectural details of Web cluster solutions adapted to partial replication. We also show that partial replication does not imply a performance penalty over classical fully replicated architectures. For evaluation purposes we used a simulation model under the OMNeT++ framework, with mean service time as the performance comparison metric.
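    The storage trade-off claimed in this abstract can be made concrete with a back-of-the-envelope calculation (our own illustration with hypothetical numbers, not figures from the paper): with N servers of capacity C each and every file stored on r replicas, the distinct content the cluster can hold is N·C/r, so full replication (r = N) holds only C, while partial replication with a small r > 1 holds far more yet still tolerates r−1 server failures per file.

    ```python
    # Back-of-the-envelope illustration (hypothetical numbers, not from
    # the paper) of why partial replication balances storage against
    # reliability: N servers of capacity C with each file on r servers
    # store N*C/r of distinct content, and any file survives the loss
    # of r-1 servers.

    def distinct_capacity(n_servers, capacity_each, replicas):
        """Distinct content a cluster can hold under r-way replication."""
        return n_servers * capacity_each / replicas

    N, C = 8, 100.0  # e.g. 8 servers, 100 GB each (hypothetical)
    full_replication = distinct_capacity(N, C, N)   # 100.0 GB, survives N-1 failures
    partial = distinct_capacity(N, C, 2)            # 400.0 GB, survives 1 failure
    full_distribution = distinct_capacity(N, C, 1)  # 800.0 GB, no redundancy
    ```
    
    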

    Improving performance and maintainability through refactoring in C++11

    Abstraction-based programming has traditionally been seen as an approach that improves software quality at the cost of losing performance. In this paper, we explore the cost of abstraction by transforming the PARSEC benchmark fluidanimate application from low-level, hand-optimized C to a higher-level and more general C++ version that is a more direct representation of the algorithms. We eliminate global variables and constants, use vectors of a user-defined particle type rather than vectors of built-in types, and separate the concurrency model from the application model. The result is a C++ program that is smaller, less complex, and measurably faster than the original. The benchmark was chosen to be representative of many applications, and our transformations are systematic and based on principles. Consequently, our techniques can be used to improve the performance, flexibility, and maintainability of a large class of programs. The handling of concurrency issues has been collected into a small new library, YAPL.
    J. Daniel Garcia's work was partially supported by Fundación CajaMadrid through their grant programme for Madrid University Professors. Bjarne Stroustrup's work was partially supported by NSF grant #083319

    Adaptive comfort control implemented model (ACCIM) for energy consumption predictions in dwellings under current and future climate conditions: A case study located in Spain

    Knowledge of the energy consumption of new and existing dwellings is essential to control consumption and to propose energy conservation measures. Most predictions of energy consumption in buildings are based on fixed values for the indoor thermal environment and on pre-established operation hypotheses, which do not reflect the dynamic use of buildings or users' requirements. Spain is a clear example of this situation. This study suggests using an adaptive thermal comfort model as a predictive method for the indoor thermal environment, together with several operation hypotheses, combining both in a simulation model: the Adaptive Comfort Control Implemented Model (ACCIM). The behavior of ACCIM is studied in a representative case of the residential building stock, located in three climate zones with different characteristics (warm, cold, and mild). The analyses were conducted for both current and future scenarios with the aim of identifying the advantages and limitations in each climate zone. The results show that, using this predictive method, average consumption for the current, 2050, and 2080 scenarios decreased between 23% and 46% in warm climates, between 19% and 25% in mild climates, and between 10% and 29% in cold climates. The method is also shown to be more resilient to climate change than the current standard. This research can be a starting point for understanding users' climate adaptation in order to predict energy consumption

    Mathematical equivalence between time-dependent single-rate and multirate mass transfer models

    Get PDF
    The often-observed tailing of tracer breakthrough curves is caused by a multitude of mass transfer processes taking place over multiple scales. Yet, in some cases, it is convenient to fit a transport model with a single-rate mass transfer coefficient that lumps all of the observed non-Fickian behavior. Since mass transfer processes take place at all characteristic times, the single-rate mass transfer coefficient derived from measurements in the laboratory or in the field varies with time, w(t). The literature review and tracer experiments compiled by Haggerty et al. (2004) from a number of sites worldwide suggest that the characteristic mass transfer time, which is proportional to w(t)^{-1}, scales as a power law of the advective time and the experiment duration. This paper studies the mathematical equivalence between the multirate mass transfer model (MRMT) and a time-dependent single-rate mass transfer model (t-SRMT). In doing so, we provide new insights into the previously observed scale dependence of mass transfer coefficients. The memory function, g(t), which is the most salient feature of the MRMT model, determines the influence of past concentrations on the present state. We found that the t-SRMT model can also be expressed by means of a memory function \phi(t,\tau). In this case, though, the memory function is nonstationary, meaning that in general it cannot be written as \phi(t-\tau). Nevertheless, the full behavior of the concentrations using a single time-dependent rate w(t) is approximately analogous to that of the MRMT model provided that the equality w(t) = -d\ln g(t)/dt holds and the capacity coefficient is properly chosen. This relationship suggests that when the memory function is a power law, g(t) \approx t^{1-k}, the equivalent mass transfer coefficient scales as w(t) \approx t^{-1}, nicely fitting, without calibration, the mass transfer coefficients compiled by Haggerty et al. (2004)
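    The scaling claimed at the end of this abstract follows in one step: for a power-law memory function g(t) = t^{1-k}, we have ln g(t) = (1-k) ln t, so w(t) = -d ln g(t)/dt = (k-1)/t, which is exactly the t^{-1} scaling. A small numeric sketch (our own illustration; the exponent value is hypothetical):

    ```python
    # Numeric check of the relation quoted above: for a power-law memory
    # function g(t) = t**(1 - k), the equivalent single rate is
    #   w(t) = -d ln g(t) / dt = (k - 1) / t,
    # which scales as t**-1. Illustration only; k below is hypothetical.
    import math

    def g(t, k):
        """Power-law MRMT memory function, g(t) ~ t^(1-k)."""
        return t ** (1.0 - k)

    def w_numeric(t, k, h=1e-6):
        """-d ln g / dt via a central finite difference."""
        return -(math.log(g(t + h, k)) - math.log(g(t - h, k))) / (2.0 * h)

    k = 1.6  # typical late-time power-law exponent (hypothetical value)
    for t in (1.0, 10.0, 100.0):
        analytic = (k - 1.0) / t  # (k - 1)/t, i.e. ~ t^-1
        assert abs(w_numeric(t, k) - analytic) < 1e-6 * (1.0 + analytic)

    w1, w10 = (k - 1.0) / 1.0, (k - 1.0) / 10.0  # a decade apart in t
    ```

    The ratio w(1)/w(10) = 10 makes the t^{-1} scaling explicit: each decade of elapsed time reduces the apparent single-rate coefficient by a factor of ten, consistent with the time-dependence compiled by Haggerty et al. (2004).
    
    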