A Practical Data-Driven Multi-Model Approach to Model Predictive Control: Results from Implementation in an Institutional Building
Model Predictive Control (MPC) is an effective solution to improve building controls. It uses weather and occupancy forecasts along with a control-oriented model to predict the behaviour of the building a few hours or days ahead, and thus optimize the operation of its systems. Although the potential of MPC is widely recognized and plentiful operational data is often available, developing a model requires considerable effort, significant technical expertise and knowledge of building systems. This modelling hurdle makes on-site implementation of MPC in buildings relatively rare. This study develops a multi-model approach to optimize the operation of electric and natural gas boilers in an institutional building, reducing greenhouse gas (GHG) emissions while maintaining the required level of comfort. The methodology leverages machine learning techniques to rapidly develop and calibrate control-oriented models using a limited number of input variables (indoor air temperature and temperature set-points, weather conditions, power meter data). The proposed multi-model approach consists of five models used to estimate the building's total heating demand, the electric baseload, the natural gas boiler power, and the indoor air temperature under free-floating conditions and during morning warm-up periods. The models are calibrated and validated with operational data and then used to optimize the transition between nighttime and daytime indoor air temperatures. Since these are black-box models that require only a basic understanding of the building system and a few inputs, model development effort is considerably reduced, while the modularity of the proposed method makes it flexible. Such an approach could therefore be easily replicated in other buildings equipped with similar equipment.
This methodology has been implemented in a Canadian institutional building located in Varennes (QC). Results in 2020-21 showed that the COVID-19 pandemic significantly impacted building performance and reduced energy use, creating a new baseline. The MPC strategy achieved an additional 20.2% GHG emission reduction compared to this new baseline while improving thermal comfort. Energy costs nevertheless increased, mainly because the pandemic made the pre-COVID-19 model and optimization parameters outdated; lower costs are expected with model recalibration, which is currently ongoing.
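The nighttime-to-daytime transition optimized above can be illustrated with a deliberately simplified sketch. This is not the authors' five-model formulation: it assumes a hypothetical first-order room model and picks the shortest warm-up lead time that still reaches the daytime set-point by occupancy, which minimizes boiler run time and hence emissions. All parameter names and values are illustrative.

```python
def simulate_warmup(t_room, t_out, heater_gain_k, tau_h, lead_h, dt=0.1):
    """Hypothetical first-order room model: dT/dt = (t_out + gain - T)/tau.
    Returns the indoor temperature after lead_h hours of heating (Euler steps)."""
    for _ in range(int(round(lead_h / dt))):
        t_room += dt * (t_out + heater_gain_k - t_room) / tau_h
    return t_room

def shortest_feasible_lead(t_night, t_day_setpoint, t_out,
                           heater_gain_k, tau_h, candidate_leads_h):
    """Shortest warm-up lead time (hours before occupancy) that reaches the
    daytime set-point; fall back to the longest candidate if none succeeds."""
    feasible = [lead for lead in candidate_leads_h
                if simulate_warmup(t_night, t_out, heater_gain_k, tau_h,
                                   lead) >= t_day_setpoint]
    return min(feasible) if feasible else max(candidate_leads_h)
```

For example, starting from a 18 °C night set-point with a 21 °C daytime target, 0 °C outdoors, a 30 K heater gain and a 2 h time constant, the shortest feasible lead among 0.25, 0.5, 1, 2 and 3 hours is 1 hour.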
Fractal and compositional analysis of soil aggregation
A soil aggregate is made of closely packed sand, silt, clay, and organic particles building up soil structure. Soil aggregation is a soil quality index integrating the chemical, physical, and biological processes involved in the genesis of soil structure and tilth. Aggregate size distribution is determined by sieving a fixed amount of soil mass under mechanical stress and is commonly summarized by the mean weight diameter (MWD) and by fractal dimensions such as the fragmentation fractal dimension (D_f). A fractal is a rough object that can be broken down into a number of reduced-size copies of the original object. Equations have been developed to compute bounded and unbounded scaling factors as measures of fractal dimensions, based on assumptions about the average diameter, bulk density, shape and probability of failure of sieved particles. The log-log relationship between particle diameter and the cumulative number or mass of aggregates or soil particles above a given diameter often shows more or less uniform fractal patterns. Multifractal patterns (slopes showing several D_f values ≤ 3) and non-fractal patterns or incomplete fragmentation (D_f > 3) have been reported. Scaling factors are curve-fitting parameters that are very sensitive to the choice of the fractal domain about breakpoints. Compositional data analysis using sequential binary partitions for isometric log-ratio (ilr) coordinates with an orthonormal basis provides a novel approach that avoids the assumptions and dimensional constraints of fractal analysis. Our objective was to compare MWD, fractal scaling factors and ilr coordinates using published data. In the first dataset, MWD was found to be biased by the excessively high weight given to the largest aggregate-size fraction. Eight ilr coordinates contrasting micro- vs. macro-aggregates were related to fragmentation fractal dimensions, most of which were below 2 or above 3, a theoretical impossibility for geometric fractals. The critical ilr value separating scaling factors ≤ 3 from those > 3 was close to zero. In a second dataset, the Aitchison distance computed across ilr coordinates proved to be a useful measure of the degree of soil aggregation, aggradation or degradation against a reference composition such as that of primary particles, bare fallow or permanent grass. The individual contributions of ilr coordinates to the Aitchison distance can be interpreted in terms of sign and amplitude and related to soil properties and processes mediated by soil aggregation.
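The three summary statistics compared above can be written down directly. The sketch below is illustrative, not the paper's exact computations: it assumes the mass-based Tyler-Wheatcroft scaling M(<d)/M_tot ∝ d^(3 − D_f) for the fragmentation fractal dimension, and a user-supplied sequential binary partition for the ilr coordinates.

```python
import numpy as np

def mwd(mean_diam, mass_frac):
    """Mean weight diameter: mass-weighted mean of the fraction diameters."""
    return float(np.sum(np.asarray(mean_diam) * np.asarray(mass_frac)))

def fragmentation_fractal_dim(diam, cum_mass_frac):
    """Estimate D_f from the log-log slope of cumulative mass finer than d:
    M(<d)/M_tot proportional to d**(3 - D_f) (assumed mass-based scaling)."""
    slope, _ = np.polyfit(np.log(diam), np.log(cum_mass_frac), 1)
    return 3.0 - slope

def ilr(x, sbp):
    """ilr balances from a sequential binary partition.
    sbp: one row of +1/-1/0 codes per balance (D-1 rows for D parts)."""
    x = np.asarray(x, dtype=float)
    coords = []
    for row in np.asarray(sbp):
        plus, minus = x[row == 1], x[row == -1]
        r, s = len(plus), len(minus)
        g_plus = np.exp(np.mean(np.log(plus)))    # geometric means
        g_minus = np.exp(np.mean(np.log(minus)))
        coords.append(np.sqrt(r * s / (r + s)) * np.log(g_plus / g_minus))
    return np.array(coords)

def aitchison_distance(x, y, sbp):
    """Euclidean distance between ilr coordinate vectors."""
    return float(np.linalg.norm(ilr(x, sbp) - ilr(y, sbp)))
```

Because the ilr basis is orthonormal, the Aitchison distance is independent of the particular partition chosen; the partition only changes how the total distance is split into interpretable balance contributions.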
From dual to unified employment protection: Transition and steady state
Three features of real-life reforms of dual employment protection legislation (EPL) systems are particularly hard to study through the lens of standard labor-market search models: (i) the excess job turnover implied by dual EPL, (ii) the nonretroactive nature of EPL reforms, and (iii) the transition dynamics from a dual to a unified EPL system. In this paper, we develop a computationally tractable model addressing these issues. Our main finding is that the welfare gains of reforming a dual EPL system are sizeable and achieved mostly through a decrease in turnover at short job tenures. This conclusion continues to hold in more general settings featuring wage rigidities, heterogeneity in productivity upon matching, and human capital accumulation. We also find substantial cross-sectional heterogeneity in welfare effects along the transition to a unified EPL scheme. Given that the model is calibrated to data from Spain, often considered the epitome of a labor market with dual EPL, our results should provide guidance for a wide range of reforms of dual EPL systems.
Boron concentration profiling by high angle annular dark field-scanning transmission electron microscopy in homoepitaxial delta-doped diamond layers
To further develop diamond-related devices, the concentration and spatial location of dopants should be controlled down to the nanometer scale. Scanning transmission electron microscopy using the high angle annular dark field mode is shown to be sensitive to boron doping in diamond epilayers. An analytical procedure is described whereby local boron concentrations above 10²⁰ cm⁻³ were quantitatively derived down to nanometer resolution from the dependence of the signal on thickness and boron content. Experimental local boron doping profiles measured on diamond p-/p++/p- multilayers are compared to macroscopic profiles obtained by secondary ion mass spectrometry, avoiding reported artefacts.
Double-walled carbon nanotubes trigger IL-1β release in human monocytes through Nlrp3 inflammasome activation
Because of their outstanding physical properties, carbon nanotubes (CNTs) are promising new materials in the field of nanotechnology. It is therefore imperative to assess their adverse effects on human health. Monocytes/macrophages, which recognize and eliminate inert particles, constitute the main target of CNTs. In this article, we report our finding that double-walled CNTs (DWCNTs) synergize with Toll-like receptor agonists to enhance IL-1β release in human monocytes. We show that DWCNT-induced IL-1β secretion is exclusively linked to caspase-1 and to Nlrp3 inflammasome activation in human monocytes. We also establish that this activation requires DWCNT phagocytosis and potassium efflux, but not reactive oxygen species (ROS) generation. Moreover, inhibition of lysosomal acidification or cathepsin-B activation reduces DWCNT-induced IL-1β secretion, suggesting that Nlrp3 inflammasome activation occurs via lysosomal destabilization. Thus, DWCNTs present a health hazard due to their capacity to activate the Nlrp3 inflammasome, recalling the inflammation caused by asbestos and hence demonstrating that they should be used with caution. From the Clinical Editor: This is a very important biosafety/toxicity study regarding double-walled carbon nanotubes. The investigators demonstrate that such nanotubes do represent a health hazard due to their capacity to activate the Nlrp3 inflammasome, resembling the inflammation caused by asbestos. While further study of this phenomenon is definitely needed, the above findings clearly suggest that special precautions need to be taken when applying these nanoparticles in human disease research.
Programming stiff inflatable shells from planar patterned fabrics
Lack of stiffness often limits thin shape-shifting structures to small scales. The large in-plane transformations required to distort the metric are indeed commonly achieved using soft hydrogels or elastomers. We introduce here a versatile single-step method to shape-program stiff inflated structures, opening the door to numerous large-scale applications, ranging from space-deployable structures to emergency shelters. This technique relies on channel patterns obtained by heat-sealing superimposed flat quasi-inextensible fabric sheets. Inflating the channels induces an anisotropic in-plane contraction and thus a possible change of Gaussian curvature. The seam lines, which act as a director field for the in-plane deformation, encode the shape of the deployed structure. We present three patterning methods to quantitatively and analytically program shells with non-Euclidean metrics. In addition to shapes, we describe the mechanical properties of the inflated structures with scaling laws. Large deployed structures can resist their own weight, substantially broadening the palette of applications. Comment: 6 pages, 4 figures and Supplementary Information (14 pages, 3 figures).
Analytical validation of the new plasma calibrated Accu-Chek (R) Test Strips (Roche Diagnostics)
Background: The Accu-Chek Inform glucose monitor is a point-of-care system for testing blood glucose. New test strips, calibrated to deliver plasma-like glucose values, were launched on the market in May 2005. The aim of our study was to perform an analytical validation of these new strips. Methods: We compared the new plasma strips with whole blood strips; the results for the plasma strips with plasma values obtained using a clinical analyzer and with whole blood values given by the glucose electrode of a blood gas analyzer; and the influence of the type of blood (capillary or venous) on the results obtained by the glucose monitor with the plasma-calibrated strips. Results: Plasma strips give on average 7% higher results than the previous whole blood strips. However, the results given by the plasma strips on capillary whole blood, even if well correlated, are not completely comparable with those given by an analyzer for venous plasma. Nevertheless, these plasma strips and the glucose electrode of a blood gas analyzer give comparable results. Conclusions: Accu-Chek Inform plasma strips are a good method for monitoring blood glucose in patients with diabetes.
Stability Evaluation of Overtopped Concrete Hydraulic Structures Using Computational Fluid Dynamics
A particularly challenging aspect of gravity dam stability assessment is the estimation of the hydrodynamic water pressure induced when water with significant velocity overtops gravity dams and flows in or over spillway components. The water flow conditions, including the related pressure fields and resultant forces, are difficult to quantify accurately. Computational fluid dynamics (CFD) is an attractive alternative to physical models for quantifying the hydrodynamic forces acting on gravity structures. Herein, existing dam safety guidelines for estimating the weight of the overflowing water nappe on gravity dams with rectangular crests are first reviewed. Then, a CFD methodology is developed to improve the simplified estimation of hydrodynamic pressure fields acting on the rectangular crests of overtopped gravity dams. The CFD pressures are used as input data to classical structural stability analyses based on the gravity method to more adequately quantify dam sliding stability during overtopping. A back analysis is also performed on the stability of an existing gated spillway that was overtopped during the 1996 Saguenay flood in Québec.
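The gravity-method check into which the CFD pressures feed can be sketched with a generic shear-friction sliding safety factor, where the vertical nappe load derived from crest pressures is added to the normal forces. This is an illustration of the standard formula, not the paper's specific analysis; all variable names and values are placeholders.

```python
import math

def sliding_safety_factor(cohesion_kpa, contact_area_m2, self_weight_kn,
                          uplift_kn, nappe_load_kn, horizontal_load_kn,
                          friction_angle_deg):
    """Shear-friction sliding safety factor (gravity method):
    SSF = (c*A + (sum_V - U) * tan(phi)) / sum_H.
    The vertical nappe load obtained from CFD crest pressures
    increases the stabilizing normal force sum_V."""
    normal_kn = self_weight_kn + nappe_load_kn - uplift_kn
    resisting_kn = (cohesion_kpa * contact_area_m2
                    + normal_kn * math.tan(math.radians(friction_angle_deg)))
    return resisting_kn / horizontal_load_kn
```

Underestimating the nappe load (or overestimating uplift) directly lowers the computed safety factor, which is why a better estimate of the crest pressure field matters for the stability verdict.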
Systemic plant protection substances and products: how to assess the risk for bees? A beekeeper's point of view
Contribution to Session II: Test and risk assessment
Background: The current plant protection product (PPP) assessment is no longer suitable when applied to systemic substances, since systemic chemicals can contaminate nectar and pollen over a long period of time. Largely focused on acute toxicity, the current assessment scheme does not take into account several elements, i.e. chronic toxicity, possible synergies between substances, and synergies between pathogens and PPPs. Possible bee contamination through nectar and pollen leads to a specific, mainly oral, exposure of the hive bees, including larvae, drones and queens, which may also be delayed through consumption of stored honey and pollen. Moreover, regarding long-term exposure, sublethal chronic effects should be taken into account.
Results: For such substances, both the chronic and the acute toxicity measurements should be taken into consideration. The TER should therefore be calculated based on the lowest LD50 and, where a risk is identified, the PEC/PNEC ratio should be measured and calculated for various behaviours. A larvae test should also be performed. Tunnel tests may be helpful, but exposure to the PPP cannot be proven and observation of bee behaviour is currently inaccurate. Further research on the effect of small doses of PPPs on the bee immune system seems more than necessary.
Conclusion: A new assessment scheme taking these parameters into account is the core of our contribution.
Keywords: Assessment scheme, chronic toxicity, sublethal toxicity, synergies, larvae test, PEC, PNEC, TER
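The two ratios proposed in the Results section can be made concrete with a minimal sketch. Function names are illustrative and the abstract itself fixes no trigger values: the toxicity-exposure ratio (TER) divides the lowest available LD50 by the predicted exposure, and the PEC/PNEC quotient flags a potential risk when it exceeds 1.

```python
def ter(ld50_values, predicted_exposure):
    """Toxicity-Exposure Ratio based on the lowest LD50 across the
    available acute and chronic endpoints, as argued above."""
    return min(ld50_values) / predicted_exposure

def pec_pnec_ratio(pec, pnec):
    """Predicted Environmental Concentration over Predicted No-Effect
    Concentration: values above 1 indicate a potential risk."""
    return pec / pnec
```

For instance, with acute and chronic LD50 endpoints of 100, 40 and 250 (arbitrary units) and a predicted exposure of 2, the TER is 20; a PEC of 3 against a PNEC of 1.5 gives a quotient of 2, which would trigger further assessment.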
Tactile perception in the sensory comfort of fabric samples
This investigation presents the process of developing the attributes that will form the textile lexicon of the northeast region of Brazil for the assessment of textile tactile comfort. For this purpose, two standards from the areas of food and cosmetics were adapted: ISO 11035:1994, Sensory analysis – Identification and selection of descriptors for establishing a sensory profile by a multidimensional approach; and ISO 8586:2012, Sensory analysis – General guidelines for the selection, training and monitoring of selected assessors and expert sensory assessors. Three panels of naive assessors from three different cities in northeastern Brazil were invited to touch 8 samples of fabrics with different textures and compositions. Initially, they generated 322 terms. In a second phase, they qualitatively eliminated terms with the same meaning. In the third phase, they grouped the terms through similarity analysis, which resulted in 23 terms. Then, the terms most cited by the assessors were analyzed and, as a result, 4 terms were eliminated. Finally, similarities were evidenced in the haptic perceptions of the different panels, owing to their cultural proximity and the fact that they come from the same region of a country with continental dimensions; the textile lexicon thus turns human subjectivity into objective parameters.
The authors gratefully acknowledge the funding by the project UID/CTM/00264/2019 of 2C2T – Centro de Ciência e Tecnologia Têxtil, funded by National Funds through FCT/MCTES.