
    The potential implications of reclaimed wastewater reuse for irrigation on the agricultural environment: the knowns and unknowns of the fate of antibiotics and antibiotic resistant bacteria and resistance genes – a review

    The use of reclaimed wastewater (RWW) for the irrigation of crops may result in the continuous exposure of the agricultural environment to antibiotics, antibiotic resistant bacteria (ARB) and antibiotic resistance genes (ARGs). In recent years, evidence has indicated that antibiotics and resistance genes may become disseminated in agricultural soils as a result of amendment with manure and biosolids and irrigation with RWW. Antibiotic residues and other contaminants may undergo sorption/desorption and transformation processes (both biotic and abiotic) and have the potential to affect the soil microbiota. Antibiotics found in the soil pore water (the bioavailable fraction) as a result of RWW irrigation may be taken up by crop plants, bioaccumulate within plant tissues and subsequently enter food webs, potentially resulting in detrimental public health implications. It can also be hypothesized that ARGs can spread among soil and plant-associated bacteria, which may have serious human health implications. The majority of studies dealing with these environmental and social challenges related to the use of RWW for irrigation were conducted in the laboratory or under more or less controlled conditions. This critical review discusses the state of the art on the fate of antibiotics, ARB and ARGs in agricultural environments where RWW is applied for irrigation. The implications associated with the uptake of antibiotics by plants (uptake mechanisms) and the potential risks to public health are highlighted. Additionally, knowledge gaps as well as challenges and opportunities are addressed, with the aim of boosting future research towards an enhanced understanding of the fate and implications of these contaminants of emerging concern in the agricultural environment. These are key issues in a world where increasing water scarcity and the continuous appeal of the circular economy demand answers for the long-term safe use of RWW for irrigation.

    Grammatical Evolution-Based Feature Extraction for Hemiplegia Type Detection

    Hemiplegia is a condition caused by brain injury that affects a significant percentage of the population. Patients suffering from this condition exhibit varying degrees of weakness, spasticity, and motor impairment on the left or right side of the body. This paper proposes an automatic feature selection and construction method based on grammatical evolution (GE) for radial basis function (RBF) networks that can classify the hemiplegia type between patients and healthy individuals. The proposed algorithm is tested on a dataset containing entries from the accelerometer sensors of the RehaGait mobile gait analysis system, which are placed on various parts of the patients' bodies. The collected data were split into 2-second windows and underwent a manual pre-processing and feature extraction stage. The extracted data are then presented as input to the proposed GE-based method to create new, more efficient features, which are in turn introduced as input to an RBF network. The experimental part of the paper involved comparing the proposed method with four classification methods: an RBF network, a multi-layer perceptron (MLP) trained with the Broyden–Fletcher–Goldfarb–Shanno (BFGS) training algorithm, a support vector machine (SVM), and a GE-based parallel tool for data classification (GenClass). The test results revealed that the proposed solution had the highest classification accuracy (90.07%) compared with these four methods.
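    As a rough illustration of the windowing and base-feature stage described above, the Python sketch below splits a single accelerometer channel into 2-second windows and computes a handful of per-window statistics that a GE-constructed expression could later combine. The 100 Hz sampling rate and the particular statistics are assumptions for illustration, not details taken from the paper, and the GE step itself is omitted.

    import numpy as np

    def window_features(signal, fs=100, window_s=2.0):
        """Split a 1-D accelerometer signal into fixed-length windows and
        compute simple per-window statistics (hypothetical feature set)."""
        win = int(fs * window_s)                 # samples per 2-second window
        n_windows = len(signal) // win
        feats = []
        for i in range(n_windows):
            w = signal[i * win:(i + 1) * win]
            feats.append([w.mean(), w.std(), w.min(), w.max(),
                          np.sqrt(np.mean(w ** 2))])   # RMS of the window
        return np.asarray(feats)                 # shape: (n_windows, 5)

    # Example: 60 s of synthetic accelerometer data sampled at 100 Hz
    rng = np.random.default_rng(0)
    acc = rng.normal(size=60 * 100)
    X = window_features(acc)                     # base features fed to the GE stage
    print(X.shape)                               # (30, 5)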

    Advanced Simulation of Uncertain Quantities in Civil Engineering with Applications to Regional Hazards

    During a natural extreme event, such as an earthquake or a hurricane, the socioeconomic losses due to inefficient disaster response, or to the long-term reduction of the functionality of infrastructure systems, are comparable to, if not higher than, the immediate losses due to the extreme event itself. Therefore, the scientific community has recognized the need to accurately predict the performance of lifelines and infrastructure when an extreme event occurs, and to build structures and infrastructure that can successfully withstand extreme events, effectively satisfy the needs of the post-event emergency response, and have their functionality restored as soon as possible. One of the first challenges in performing such tasks is to accurately assess the load that an extreme event imposes on the geographic region in which the infrastructure or lifeline is located. Hazard maps and probability-of-exceedance curves are very popular tools, used initially for probabilistic seismic hazard analysis and later extended to other hazards such as hurricanes. These tools provide the probability of exceeding any given value of an Intensity Measure (IM) of choice (e.g., 1-minute sustained wind speed) at any location. They are an integral part of the performance-based design approach and are essential for the probabilistic analysis of individual structures. However, they are not appropriate for the analysis of distributed infrastructure systems because they do not account for the correlation among the values of the IM at different locations. Engineers have recognized that the various network components cannot be studied independently, because the performance of the entire network depends on the combination of the conditions of all members. Thus, considering the joint probabilities of observing certain values of the IM at the locations of interest is required. The most widely accepted approach in the scientific community to address these issues is through simulation-based techniques. Based on this idea, a set of representative extreme scenarios is selected, and the impact on the network components and, in turn, the performance of the network itself is predicted. The main issue in this approach is the computational cost, which constrains the number of extreme event scenarios to be as small as possible, while the set must still capture the probabilistic characteristics of the intensity of the investigated natural extreme event over a region. A new framework is presented for the selection of an optimal set of stochastic intensity measure maps representing the regional hazard over a geographic area. This set of IM maps can subsequently be used for the analysis of spatially distributed infrastructure systems. The proposed methodology results in a versatile multihazard tool that accounts for spatial correlation through the optimal sampling of IM maps. Its key characteristic is that it embraces the nature of the regional IM maps as two-dimensional random fields. The representation of the regional hazard is supported by proofs of optimality, ensuring mean-square convergence of the ensemble of representative IM maps to the complete portfolio of possible hazard events, which is a particularly important property for risk analysis. A detailed comparison of the proposed technique with other popular methodologies in the same field is presented.
Before applying the proposed technique, or any other hazard representation technique, it is necessary to accurately study and characterize the regional hazard in a probabilistic way. Two types of natural phenomena were considered in the conducted research for the regional hazard analysis: the earthquake and the hurricane hazard. In the case of earthquakes, the seismic characterization of the Charleston, South Carolina region was studied and a seismic modeling procedure was developed that includes spatial and temporal information and descriptions of fault geometry and style, as well as other parameters. With the complete probabilistic description of the regional seismic hazard, ground-motion prediction equations (GMPEs) are implemented. The GMPEs account for between-earthquake and within-earthquake variability and yield the ground-shaking acceleration over the region. In the case of hurricanes, a more holistic approach was adopted by stochastically modeling the hurricane's track. Historical hurricane events, originating in the Atlantic basin, the Caribbean Sea or the Gulf of Mexico, have been shown to significantly affect geographic regions located in the Southern and Eastern U.S. Therefore, a simulation framework is developed for the prediction of hurricane wind intensity and direction over any geographic region in the Southern and Eastern U.S. The proposed framework generates synthetic hurricane directional wind speeds by utilizing historical data, simulating the hurricane's track and intensity, simulating key characteristic parameters such as the central pressure and the radius to maximum wind for both offshore and overland locations of the track, performing a wind field analysis, calculating the 10-meter wind intensity using oceanic- and land-based boundary layer models, and finally simulating offshore and overland wind directions.
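    As a minimal sketch of what treating regional IM maps as correlated random fields can look like in practice, the Python snippet below draws spatially correlated lognormal IM samples at a few sites using an isotropic exponential correlation model. The correlation model, parameter values and site layout are illustrative assumptions, not the models developed in this work.

    import numpy as np

    def correlated_im_samples(coords, median_im, sigma_ln, corr_length_km, n_maps, seed=0):
        """Draw spatially correlated lognormal IM samples at a set of sites,
        assuming an isotropic exponential correlation of the ln(IM) residuals."""
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # site-to-site distances (km)
        rho = np.exp(-3.0 * d / corr_length_km)   # illustrative exponential correlation model
        cov = (sigma_ln ** 2) * rho
        rng = np.random.default_rng(seed)
        eps = rng.multivariate_normal(np.zeros(len(coords)), cov, size=n_maps)
        return median_im * np.exp(eps)            # each row is one simulated IM map

    # Example: 5 sites on a 10 km line, median PGA 0.3 g, ln-std 0.6, 20 km correlation length
    sites = np.column_stack([np.linspace(0.0, 10.0, 5), np.zeros(5)])
    maps = correlated_im_samples(sites, median_im=0.3, sigma_ln=0.6, corr_length_km=20.0, n_maps=1000)
    print(np.corrcoef(np.log(maps).T)[0, 1])      # sample correlation between the two closest sites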

    Evolution of a heterogeneous hybrid extreme learning machine

    Hybrid optimization algorithms have gained popularity as it has become apparent that there cannot be a universal optimization strategy that is globally more beneficial than any other. Despite their popularity, hybridization frameworks require more detailed categorization regarding the nature of the problem domain, the constituent algorithms, the coupling schema and the intended area of application. This thesis proposes a hybrid algorithm named heterogeneous hybrid extreme learning machine (He-HyELM) for finding the optimal multi-layer perceptron (MLP) with one hidden layer that solves a specific problem. This is achieved by combining the extreme learning machine (ELM) training algorithm with an evolutionary computing (EC) algorithm. The research process is complemented by a series of preliminary experiments prior to hybridization that explore in depth the characteristics of the ELM algorithm. He-HyELM uses a pool of custom-created neurons which are then embedded in a series of ELM-trained MLPs. A genetic algorithm (GA) evolves these homogeneous networks into heterogeneous networks according to a fitness criterion. The GA utilizes a proposed novel crossover operator with a mechanism that ranks each hidden-layer node in order to guide the evolution process. Having analysed the proposed He-HyELM algorithm in Chapter 5, an enhanced version of the algorithm is presented in Chapter 6. This enhanced version makes the mutation operator self-adaptive, with the aim of reducing the number of parameters that need tuning. Both the He-HyELM and SA-He-HyELM approaches are tested on three regression and three classification real-world datasets to evaluate their performance. These experiments showed that both versions improved generalization when compared with the best homogeneous network found during the ELM empirical study in Chapter 3. Finally, Chapter 7 summarizes the key findings and contributions of this work.
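    For readers unfamiliar with the ELM step that He-HyELM builds on, the Python sketch below shows the standard procedure: draw random hidden-layer weights, then solve the output weights in closed form by least squares. The sigmoid activation, single-output regression setting and toy data are illustrative assumptions and do not reproduce the experiments of the thesis.

    import numpy as np

    def elm_train(X, y, n_hidden=50, seed=0):
        """Standard extreme learning machine: random hidden-layer weights,
        output weights obtained in closed form via the pseudo-inverse."""
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights
        b = rng.normal(size=n_hidden)                 # random hidden biases
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden activations
        beta = np.linalg.pinv(H) @ y                  # least-squares output weights
        return W, b, beta

    def elm_predict(X, W, b, beta):
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return H @ beta

    # Toy regression example
    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, size=(200, 2))
    y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1]
    W, b, beta = elm_train(X, y)
    print(np.mean((elm_predict(X, W, b, beta) - y) ** 2))   # training MSE

    A GA such as the one proposed in the thesis would then vary the hidden-layer neurons of many such ELM-trained networks and retain the fittest heterogeneous configurations.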

    Efficient computational models for the optimal representation of correlated regional hazard

    In this paper, a methodology is presented for the generation of an optimal set of maps representing the intensity of a natural disaster over a region. In regional hazard and loss analysis, maps like these are commonly used to compute the probability of exceeding certain levels of intensity at all sites, while also providing information on the correlation between the intensity at any pair of sites. The information on the spatial correlation between two locations is of utmost importance for the accurate disaster performance assessment of lifeline components and of distributed systems. However, traditional hazard maps (such as those provided by USGS) do not provide this essential information, but only the probability of exceedance of a specific intensity at the various sites, considered individually. Therefore, many researchers have attempted to address this problem and incorporate correlation in their models, mainly through two basic approaches. The first approach includes analytic or computational methodologies to assess the correlation directly; the second approach is adopted by techniques for the selection of a representative set of intensity maps, often referred to as “regional hazard-consistent maps”. The methodology presented herein, which branches out from the previous two approaches, considers the intensity maps as random fields. By adopting this abstract perspective, the new methodology is particularly appropriate for a multi-hazard approach, and it can take advantage of tools for the optimal sampling of multi-dimensional stochastic functions. These tools ensure that the weighted ensemble of generated samples (i.e., intensity maps) tends to match all the probabilistic properties of the field, including the correlation. In fact, the samples generated by the proposed methodology fully capture the marginal hazard at each location and the correlated regional hazard. After the technique is presented, an application is provided for the case of seismic ground motion intensity maps. The paper appears in the proceedings of ICASP12, the 12th International Conference on Applications of Statistics and Probability in Civil Engineering, held in Vancouver, Canada, on July 12-15, 2015.
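    To make the role of the weighted ensemble concrete, the Python sketch below computes, from a hypothetical set of weighted IM maps, the marginal probability of exceedance at each site and the correlation of the intensity between a pair of sites. The toy maps and equal weights are assumptions for illustration, not the output of the proposed sampling method.

    import numpy as np

    def weighted_exceedance(im_maps, weights, threshold):
        """Probability of exceeding `threshold` at each site, estimated from a
        weighted ensemble of IM maps (rows = maps, columns = sites)."""
        return np.average(im_maps > threshold, axis=0, weights=weights)

    def weighted_correlation(im_maps, weights, i, j):
        """Weighted correlation of the intensity at sites i and j across the ensemble."""
        mu = np.average(im_maps, axis=0, weights=weights)
        d = im_maps - mu
        cov = np.average(d[:, i] * d[:, j], weights=weights)
        var_i = np.average(d[:, i] ** 2, weights=weights)
        var_j = np.average(d[:, j] ** 2, weights=weights)
        return cov / np.sqrt(var_i * var_j)

    # Toy ensemble: 500 equally weighted maps over 4 sites
    rng = np.random.default_rng(2)
    maps = rng.lognormal(mean=-1.2, sigma=0.6, size=(500, 4))
    w = np.full(500, 1.0 / 500)
    print(weighted_exceedance(maps, w, threshold=0.3))   # marginal hazard at each site
    print(weighted_correlation(maps, w, 0, 1))           # correlation between sites 0 and 1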

    Concept mapping to measure mathematical experts’ number sense

    The purpose of this study is to test whether concept mapping can be used to measure levels of number sense, and to use it to measure mathematical experts' level of number sense. The sample included 39 undergraduate and postgraduate students of Departments of Mathematics in Greece. A pencil-and-paper test was administered to assess the level of number sense in different mathematical domains. Additionally, the participants were asked to create a concept map with 1/2 as the central term. The results showed low levels of number sense, with the majority of the participants responding to the number sense test by applying rules and algorithms rather than the more holistic approaches that would indicate higher levels of number sense. Additionally, participants' performance in concept mapping was strongly related to their performance in the number sense test. Specifically, participants with low number sense scores tended to produce poor concept maps.

    Safeguarding food security: Hormesis-based plant priming to the rescue

    Accumulating evidence suggests that the biphasic phenomenon of hormesis may provide the means for ensuring agricultural sustainability under changing climate scenarios. The adaptive responses induced in plants exposed to hormetic, low doses of various stressors can result in enhanced tolerance upon their subsequent exposure to adverse environmental stimuli. Hormesis-based priming is highly generalizable, as it can be induced by a series of effectors applied at different growth stages in a plethora of plant species. This review aims to highlight the most promising hormesis-based priming approaches, based on natural as well as artificial environmental factors, including chemicals, radiation, and nanomaterials. Furthermore, we discuss research gaps, future perspectives and recommended actions for reaching specific milestones on the roadmap towards commercializing the benefits of this platform, in order to secure agricultural sustainability.

    Probabilistic simulation of power transmission systems affected by hurricane events based on fragility and AC power flow analyses

    This paper presents a technique for the probabilistic simulation of power transmission systems under hurricane events. The study models the power transmission system as a network of connected individual components, which are subject to wind-induced mechanical failure and power flow constraints. The mechanical performance of the transmission conductors is evaluated using an efficient modal superposition method and extreme value analysis. The fragility model is then developed using first-order reliability theory. The assumptions of the method are discussed, and its accuracy is thoroughly investigated. The component fragilities are used to map the damage of hurricane events to failure probabilities. The electrical performance of the components is modeled through an AC-based power flow cascading failure model, to capture the unique phenomena affecting power systems, such as line overflow and load shedding. The methodology is demonstrated by a case study involving a hurricane moving across the IEEE 30-bus transmission network. This technique aims at helping decision makers gain fundamental insights into the modeling and quantification of power system performance during hurricane events. This work is part of the Probabilistic Resilience Assessment of Interdependent Systems (PRAISys) project (www.praisys.org). The support from the National Science Foundation through grant CMS-1541177 is gratefully acknowledged.
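    As a rough sketch of how component fragilities can map local hurricane wind speeds to failure states, the Python snippet below evaluates a lognormal fragility curve and samples failure indicators by Monte Carlo. The lognormal form, capacity parameters and wind speeds are illustrative assumptions; the paper itself derives its fragilities from modal superposition, extreme value analysis and first-order reliability theory, and feeds the resulting failure states into the AC power flow cascading failure model.

    import numpy as np
    from scipy.stats import norm

    def failure_probability(wind_speed, median_capacity, beta):
        """Lognormal fragility curve: probability of wind-induced failure of a
        component given the local wind speed (parameters are illustrative)."""
        return norm.cdf(np.log(np.asarray(wind_speed) / median_capacity) / beta)

    def sample_failures(wind_speeds, median_capacity=55.0, beta=0.25, n_samples=1000, seed=0):
        """Monte Carlo sampling of component failure states for one hurricane scenario."""
        rng = np.random.default_rng(seed)
        p_fail = failure_probability(wind_speeds, median_capacity, beta)
        return rng.random((n_samples, len(wind_speeds))) < p_fail   # True = component failed

    # Example: local wind speeds (m/s) at five transmission-line components
    winds = [38.0, 44.0, 51.0, 47.0, 60.0]
    states = sample_failures(winds)
    print(states.mean(axis=0))   # empirical failure probability per component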

    Systemic mitigation of salt stress by hydrogen peroxide and sodium nitroprusside in strawberry plants via transcriptional regulation of enzymatic and non-enzymatic antioxidants

    Nitric oxide (NO) and hydrogen peroxide (H2O2) have a pivotal role in plant development and stress responses, thus rendering them key molecules for priming approaches. In this study, a hydroponic experiment was employed in order to investigate the effects of root pretreatment with the NO donor sodium nitroprusside (SNP; 100 μM) or with H2O2 (10 mM) on major components of redox homeostasis and signaling in strawberry plants (Fragaria × ananassa cv. ‘Camarosa’) exposed to salt stress (100 mM NaCl, 8 d) either immediately or 7 d after root pretreatment. Plants stressed immediately after root pretreatment with either reactive species demonstrated increased chlorophyll fluorescence, photosynthetic pigment content and leaf relative water content, as well as lower lipid peroxidation and electrolyte leakage levels, in comparison with plants directly subjected to salt stress, suggesting a systemic mitigating effect of NO/H2O2 pretreatment on the cellular damage derived from abiotic stress factors. In addition, primed plants managed to mitigate the oxidative and nitrosative secondary stress and the disturbances of redox homeostasis, since H2O2 and NO were quantified at lower levels, whereas the ascorbate and glutathione redox states in leaves were sustained at higher levels, compared with the NaCl treatment. Gene expression analysis revealed that the priming effects of both H2O2 and NO root pretreatment correlated with increased transcript levels of enzymatic antioxidants (cAPX, CAT, GR, MnSOD, MDHAR and DHAR), as well as of ascorbate (GaIUR, GLDH, GDH, MIOX) and glutathione biosynthesis (GCS, GS) genes in leaves, in contrast with the general transcriptional suppression observed in plants stressed without pretreatment or 7 d after root pretreatment. Overall, pretreated plants displayed redox-regulated defense responses leading to systemic tolerance to subsequent salt stress exposure.