32 research outputs found

    Understanding the underlying mechanism of HA subtyping at the level of physico-chemical characteristics of proteins

    The evolution of the influenza A virus to increase its host range is a major concern worldwide. The molecular mechanisms of increasing host range are largely unknown. Influenza surface proteins play determining roles in the recognition of host sialic acid receptors and in host range. In an attempt to uncover the physico-chemical attributes that govern HA subtyping, we performed a large-scale functional analysis of over 7000 sequences of 16 different HA subtypes. A large number (896) of physico-chemical protein characteristics were calculated for each HA sequence. Then, 10 different attribute weighting algorithms were used to find the key characteristics distinguishing HA subtypes. Furthermore, to discover machine learning models that can predict HA subtypes, various Decision Tree, Support Vector Machine, Naïve Bayes, and Neural Network models were trained on the calculated protein characteristics dataset as well as on 10 trimmed datasets generated by the attribute weighting algorithms. The prediction accuracies of the machine learning methods were evaluated by 10-fold cross-validation. The results highlighted the frequency of Gln (selected by 80% of attribute weighting algorithms), the percentage/frequency of Tyr, the percentage of Cys, and the frequencies of Trp and Glu (selected by 70% of attribute weighting algorithms) as the key features associated with HA subtyping. The Random Forest tree induction algorithm and the RBF-kernel SVM (with grid-search parameter tuning on scaled attributes) showed a high accuracy of 98% in clustering and predicting HA subtypes based on protein attributes. Decision tree models were successful in tracing the short mutation/reassortment paths by which the influenza virus can gain the key protein structure of another HA subtype and increase its host range in a short period of time with less energy consumption. Extracting and mining a large number of amino acid attributes of HA subtypes of influenza A virus through supervised algorithms represents a new avenue for understanding and predicting the possible future structure of influenza pandemics.
    Mansour Ebrahimi, Parisa Aghagolzadeh, Narges Shamabadi, Ahmad Tahmasebi, Mohammed Alsharifi, David L. Adelson, Farhid Hemmatzadeh, Esmaeil Ebrahimi
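    As an illustration of the kind of workflow described above (not the authors' actual code), the sketch below runs 10-fold cross-validation of a Random Forest and a grid-searched RBF-kernel SVM with scikit-learn; the feature matrix is a synthetic stand-in for the 7000 × 896 table of physico-chemical HA attributes, and all parameter values are assumptions.

```python
# Hedged sketch: 10-fold CV of Random Forest and RBF SVM on a synthetic stand-in
# for the physico-chemical HA feature table (real data: ~7000 sequences x 896 attributes).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic placeholder for the real feature matrix (16 HA subtypes as classes).
X, y = make_classification(n_samples=2000, n_features=100, n_informative=30,
                           n_classes=16, n_clusters_per_class=1, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
print("Random Forest 10-fold accuracy:", cross_val_score(rf, X, y, cv=10).mean())

# RBF-kernel SVM with scaled features and grid-searched C/gamma; the hyper-parameter
# grid is an assumption, not taken from the paper.
svm = make_pipeline(
    StandardScaler(),
    GridSearchCV(SVC(kernel="rbf"), {"C": [1, 10, 100], "gamma": ["scale", 0.01]}, cv=3),
)
print("RBF SVM 10-fold accuracy:", cross_val_score(svm, X, y, cv=10).mean())
```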

    Regulation of sensory nerve conduction velocity (SCV) of human bodies responding to annual temperature variations in natural environments

    Extensive research interest in environmental temperature is linked to human productivity and performance as well as comfort and health, yet the mechanisms by which physiological indices respond to temperature variations remain incompletely understood. This study adopted sensory nerve conduction velocity (SCV) as a temperature-sensitive physiological biomarker to explore the thermoregulatory mechanisms of humans responding to annual temperature variations. Measurements of subjects’ SCV (over 600 samples) were conducted in a naturally ventilated environment over all four seasons. The results showed a positive correlation between SCV and annual temperatures, and a Boltzmann model was adopted to depict the S-shaped trend of SCV against operative temperatures from 5 °C to 40 °C. SCV increased linearly with operative temperature from 14.28 °C to 20.5 °C, responded sensitively over 10.19 °C to 24.59 °C, and tended to be stable beyond that range. The subjects’ thermal sensations were linearly related to SCV, clarifying the relation between human physiological regulation and variations in subjective thermal perception. The findings reveal the body's SCV regulatory characteristics in different operative temperature intervals, thereby giving deeper insight into human autonomic thermoregulation, benefiting built environment design, and minimizing temperature-induced risks to human health and well-being
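    A minimal sketch of fitting the Boltzmann model mentioned above is shown below; the sigmoid form and all numeric values are assumptions, and the data points are synthetic placeholders rather than the study's measurements.

```python
# Hedged sketch: fit a Boltzmann sigmoid SCV(T) = a2 + (a1 - a2) / (1 + exp((T - t0)/dt))
# to (operative temperature, SCV) pairs. Data below are synthetic, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(t, a1, a2, t0, dt):
    """Lower/upper plateaus a1, a2; midpoint t0; slope parameter dt."""
    return a2 + (a1 - a2) / (1.0 + np.exp((t - t0) / dt))

temps = np.linspace(5, 40, 30)                      # degC, spanning the 5-40 degC range
rng = np.random.default_rng(0)
scv = boltzmann(temps, 45.0, 60.0, 17.0, 3.0) + rng.normal(0.0, 0.5, temps.size)

params, _ = curve_fit(boltzmann, temps, scv, p0=[45.0, 60.0, 17.0, 3.0])
print("fitted a1, a2, t0, dt:", params)
```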

    Emergent realities for social wellbeing: environmental, spatial and social pathways

    A book published by the Department of Criminology, Faculty for Social Wellbeing, University of Malta, focusing on environmental, spatial and social pathways for social wellbeing.

    Energy flexibility for heating and cooling based on seasonal occupant thermal adaptation in mixed-mode residential buildings

    The energy flexibility of heating and cooling has not been fully explored, even though human thermal adaptation is acknowledged as a source of energy savings in buildings. The aim of this study is to explore the capacity for heating/cooling flexibility in residential buildings in the hot summer and cold winter zone of China by investigating the year-round dynamic changes in the thermal adaptation of occupants. A data set of 13,005 records was extracted from a nationwide field survey database. The results showed that measured indoor temperatures were linearly related to outdoor temperature in the transition seasons but were discrete in summer and winter due to the mixed-mode operation of heating/cooling devices. The occupants’ neutral temperatures varied with outdoor temperatures in step with seasonal changes. Flexibility of temperature settings over the whole heating and cooling periods was demonstrated by incorporating occupants’ dynamic thermal adaptation; such an implementation was estimated to have substantial energy saving potential (e.g. 34.4% in Nanjing). This work contributes to the quantitative understanding of the role of human thermal adaptation in the smart control of residential energy management. It provides evidence for policy-making on flexible thermal design codes in buildings, to discourage excessive cooling/heating demand
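    The kind of adaptive relation this work relies on can be sketched as a simple linear fit of neutral temperature against outdoor temperature; the numbers below are invented placeholders, not values from the survey database, and the set-point logic is only an assumption about how such a relation could be used.

```python
# Hedged sketch: linear relation between occupants' neutral temperature and outdoor
# temperature, the basis for flexible (adaptive) heating/cooling set-points.
import numpy as np

outdoor_t = np.array([2.0, 8.0, 15.0, 22.0, 28.0, 33.0])    # hypothetical seasonal means (degC)
neutral_t = np.array([18.5, 19.5, 21.0, 24.0, 26.0, 27.5])  # hypothetical neutral temps (degC)

slope, intercept = np.polyfit(outdoor_t, neutral_t, 1)
print(f"neutral_T ~ {slope:.2f} * outdoor_T + {intercept:.2f}")

# A flexible set-point that tracks this relation (instead of a fixed set-point) is where
# the heating/cooling energy flexibility would come from.
def adaptive_setpoint(outdoor_temp):
    return slope * outdoor_temp + intercept

print("set-point at 30 degC outdoors:", round(adaptive_setpoint(30.0), 1), "degC")
```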

    Development of an integrated decision support system for supporting offshore oil spill response in harsh environments

    Offshore oil spills can lead to significant negative socio-economic impacts and constitute a direct hazard to the marine environment and human health. The response to an oil spill usually consists of a series of dynamic, time-sensitive, multi-faceted and complex processes subject to various constraints and challenges. In past decades, many models have been developed, mainly focusing on individual processes including oil weathering simulation, impact assessment, and clean-up optimization. However, to date, research on the integration of offshore oil spill vulnerability analysis, process simulation and operation optimization is still lacking. Such a deficiency is even more consequential in harsh environments. It is therefore critical and urgent to develop new methodologies and improve the technical capacities of offshore oil spill response. This research therefore aims at developing an integrated decision support system for supporting offshore oil spill response, especially in harsh environments (DSS-OSRH). The DSS consists of offshore oil spill vulnerability analysis, response technology screening, and simulation-optimization coupling. Uncertainties and/or dynamics have been quantitatively reflected throughout the modeling processes. First, a Monte Carlo simulation based two-stage adaptive resonance theory mapping (MC-TSAM) approach has been developed; a real-world case study of offshore oil spill vulnerability index (OSVI) classification on the south coast of Newfoundland was used to demonstrate it. Furthermore, a Monte Carlo simulation based integrated rule-based fuzzy adaptive resonance theory mapping (MC-IRFAM) approach has been developed for screening and ranking spill response and clean-up technologies; its feasibility was tested with a case of screening and ranking response technologies in an offshore oil spill event. A novel Monte Carlo simulation based dynamic mixed integer nonlinear programming (MC-DMINP) approach has also been developed for simulation-optimization coupling in offshore oil spill response; to demonstrate it, a case study was conducted on device allocation and oil recovery in an offshore oil spill event. Finally, the DSS-OSRH has been developed based on the integration of MC-TSAM, MC-IRFAM, and MC-DMINP; to demonstrate its feasibility, a case study was conducted on decision support during offshore oil spill response on the south coast of Newfoundland. The developed approaches and DSS are the first of their kind to date targeting offshore oil spill response. Their novelty is reflected in the following aspects: 1) an innovative MC-TSAM approach for offshore OSVI classification under complexity and uncertainty; 2) a new MC-IRFAM approach for oil spill response technology classification and ranking with uncertain information; 3) a novel MC-DMINP simulation-optimization coupling approach for offshore oil spill response operation and resource allocation under uncertainty; and 4) an innovative DSS-OSRH integrating MC-TSAM, MC-IRFAM and MC-DMINP to support decision making throughout the offshore oil spill response process. These methods are particularly suitable for offshore oil spill response in harsh environments such as the offshore areas of Newfoundland and Labrador (NL). The research will also promote understanding of the processes of oil transport and fate and of the impacts on affected offshore and shoreline areas. The methodologies will also provide modeling tools for other related areas that require timely and effective decisions under complexity and uncertainty
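    None of MC-TSAM, MC-IRFAM or MC-DMINP is reproduced here, but the simulation-optimization coupling idea can be illustrated with a toy Monte Carlo loop that selects the recovery-device allocation with the best expected outcome under an uncertain spill volume; every distribution, capacity and cost figure below is a hypothetical placeholder.

```python
# Hedged sketch: Monte Carlo simulation-optimization under uncertainty. A spill volume is
# sampled from an assumed distribution, and the skimmer count with the best expected net
# recovery is chosen. This is not the thesis' MC-DMINP formulation.
import numpy as np

rng = np.random.default_rng(42)
spill_volume = rng.lognormal(mean=7.0, sigma=0.4, size=5000)   # uncertain spill size (m^3)

def expected_net_recovery(n_skimmers, capacity=150.0, cost_per_unit=20.0):
    """Mean recovered volume across Monte Carlo samples minus a simple deployment cost."""
    recovered = np.minimum(spill_volume, n_skimmers * capacity)
    return recovered.mean() - cost_per_unit * n_skimmers

best = max(range(1, 21), key=expected_net_recovery)
print("skimmer count with best expected net recovery:", best)
```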

    Data cleaning techniques for software engineering data sets

    Data quality is an important issue which has been recognised and addressed in research communities such as data warehousing, data mining and information systems. It is generally agreed that poor data quality impacts the quality of the results of analyses and therefore the decisions made on the basis of those results. Empirical software engineering has neglected the issue of data quality to some extent. This poses the question of how researchers in empirical software engineering can trust their results without addressing the quality of the analysed data. One widely accepted definition describes data quality as 'fitness for purpose', and the issue of poor data quality can be addressed either by introducing preventative measures or by applying means to cope with data quality issues. The research presented in this thesis addresses the latter, with a special focus on noise handling. Three noise handling techniques, which utilise decision trees, are proposed for application to software engineering data sets. Each technique represents a noise handling approach: robust filtering, where training and test sets are the same; predictive filtering, where training and test sets are different; and filtering and polish, where noisy instances are corrected. The techniques were first evaluated in two different investigations by applying them to a large real-world software engineering data set. The first investigation tested the techniques' ability to improve predictive accuracy at differing noise levels. All three techniques improved predictive accuracy in comparison to the do-nothing approach, with filtering and polish the most successful. The second investigation, using the same large real-world data set, tested the techniques' ability to identify instances with implausible values; these instances were flagged for the purpose of evaluation before applying the three techniques. Robust filtering and predictive filtering decreased the number of instances with implausible values, but substantially decreased the size of the data set too. The filtering and polish technique actually increased the number of implausible values, but it did not reduce the size of the data set. Since the data set contained historical software project data, it was not possible to know the real extent of the noise detected. This led to the production of simulated software engineering data sets, which were modelled on the real data set used in the previous evaluations to ensure domain-specific characteristics. These simulated versions of the data set were then injected with noise, such that the real extent of the noise was known, and the three noise handling techniques were applied afterwards to allow evaluation. This procedure of simulating software engineering data sets combined the domain-specific characteristics of the real world with control over the simulated data, which is seen as a particular strength of this evaluation approach. The results of the simulation-based evaluation showed that none of the techniques performed well. Robust filtering and filtering and polish performed very poorly and, based on these results, would not be recommended for the task of noise reduction. The predictive filtering technique was the best performing technique in this evaluation, but it did not perform particularly well either. An exhaustive systematic literature review was carried out to investigate to what extent the empirical software engineering community has considered data quality; the findings showed that the issue has been largely neglected. The work in this thesis highlights an important gap in empirical software engineering. It provides a clarification of and distinction between the terms noise and outliers: the two overlap but are fundamentally different, and since they are often treated the same in noise handling techniques, a clarification of the two terms was necessary. To investigate the capabilities of noise handling techniques, a single investigation was deemed insufficient, because the distinction between noise and outliers is not trivial and the investigated noise cleaning techniques are derived from traditional noise handling techniques in which noise and outliers are combined. Therefore three investigations were undertaken to assess the effectiveness of the three presented noise handling techniques, each of which should be seen as part of a multi-pronged approach. This thesis also highlights possible shortcomings of current automated noise handling techniques. The poor performance of the three techniques led to the conclusion that noise handling should be integrated into a data cleaning process in which the input of domain knowledge and the replicability of the data cleaning process are ensured.
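    The robust filtering idea described in this abstract (train and test on the same data, then discard instances the decision tree cannot reproduce) can be sketched as follows; this is not the thesis implementation, and the data set and noise injection are synthetic.

```python
# Hedged sketch of decision-tree "robust filtering": fit a tree on the full data set and
# keep only the instances whose labels the tree reproduces. Synthetic data, not the thesis'.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
rng = np.random.default_rng(0)
flip = rng.choice(len(y), size=50, replace=False)
y_noisy = y.copy()
y_noisy[flip] = 1 - y_noisy[flip]            # inject label noise so there is something to filter

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y_noisy)
keep = tree.predict(X) == y_noisy            # robust filtering: training set == test set
X_clean, y_clean = X[keep], y_noisy[keep]
print(f"kept {keep.sum()} of {len(y_noisy)} instances after filtering")
```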

    Study of Aspergillus fumigatus pathogenicity and identification of putative virulence genes

    Aspergillus fumigatus is considered to be the most prevalent airborne pathogenic fungus. Its infection begins with conidial germination in the lungs, where different types of aspergillosis can develop. In profoundly immunocompromised individuals this pathogen can even disseminate via the bloodstream, resulting in a disseminated disease in which other organs might be infected. With the aim of improving our understanding of the pathogenicity mechanisms, a new expression microarray covering the entire genome of A. fumigatus was designed to analyze its transcriptome at the first steps of germination and over the course of a disseminated infection. According to the transcriptomic results, essential elements such as nitrogen, iron and zinc seemed to be sufficiently available during the infection, raising doubts about the potential of metal uptake systems as therapeutic or diagnostic targets. However, genes involved in carbohydrate and secondary metabolism, and even those encoding unclassified and hypothetical proteins, stood out as indispensable for extending the infection. This study also allowed the identification of genes whose expression was enhanced during the infectious process, and the implication of two of them in virulence was studied. Deletion of the abr1/brown1 gene yielded an isolate causing increased cell damage, suggesting that this knockout might produce and secrete a toxic intermediate. In contrast, when a transcription factor was deleted, a decrease in cell damage was observed, highlighting its implication in A. fumigatus pathogenicity. Nevertheless, more studies need to be done to assess whether any of them show attenuated virulence and to shed light on the biological functions in which they are involved, which could lead to the identification of potential targets for diagnostic or treatment strategies

    Specialized translation at work for a small expanding company: my experience with the internationalization of Bioretics© S.r.l. into Chinese

    Global markets are currently immersed in two all-encompassing and unstoppable processes: internationalization and globalization. While the former pushes companies to look beyond the borders of their country of origin to forge relationships with foreign trading partners, the latter fosters standardization across countries by reducing spatiotemporal distances and breaking down geographical, political, economic and socio-cultural barriers. In recent decades, another domain has emerged to propel these unifying drives: Artificial Intelligence, together with the high technologies that aim to implement human cognitive abilities in machines. The “Language Toolkit – Le lingue straniere al servizio dell’internazionalizzazione dell’impresa” project, promoted by the Department of Interpreting and Translation (Forlì Campus) in collaboration with the Romagna Chamber of Commerce (Forlì-Cesena and Rimini), seeks to help Italian SMEs make their way into the global market. It is precisely within this project that this dissertation was conceived. Its purpose is to present the translation and localization project from English into Chinese of a series of texts produced by Bioretics© S.r.l.: an investor deck, the company website and part of the installation and use manual of the Aliquis© framework software, its flagship product. The dissertation is structured as follows: Chapter 1 presents the project and the company in detail; Chapter 2 outlines the internationalization and globalization processes and the Artificial Intelligence market in both Italy and China; Chapter 3 provides the theoretical foundations for every aspect related to specialized translation, including website localization; Chapter 4 describes the resources and tools used to perform the translations; Chapter 5 proposes an analysis of the source texts; Chapter 6 is a commentary on translation strategies and choices

    Searching for the origin of samples of geological, environmental and archaeological interest: organic biomarkers as key evidences

    Organic biomarkers are compounds characterized by their high stability over time, and they are therefore used in many fields of application because they can provide very valuable information about the origin of samples. Owing to the low concentrations at which biomarkers occur in samples, especially those related to geochemistry, the environment and archaeology, specific analytical methods must be developed for their determination. In this regard, the main objectives of this thesis were to develop a series of sensitive, precise and selective analytical methods based on gas chromatography-mass spectrometry (GC-MS) detection for the determination of organic biomarkers in samples of geological, environmental and archaeological interest. Once optimized and validated, the methods were applied to real samples from the aforementioned fields. First, beachrock samples were analysed, and the detection of a series of polycyclic aromatic hydrocarbons (PAHs) and other specific biomarkers helped clarify the unusual formation of this phenomenon at more temperate latitudes. Second, whether the industrial activities of the port of Bilbao influence the conservation of built heritage, specifically the Punta Begoña galleries (Getxo), was assessed through the analysis of dicarboxylic acids in marine aerosol samples and in the building's mortars. Finally, a series of archaeological ceramics related to wine production (2nd-1st centuries BC) and the storage of whale fat (16th-17th centuries) were analysed to detect the specific biomarkers confirming these uses.