55 research outputs found

    Clinical and molecular biological studies in recurrent Aphthous Stomatitis

    Get PDF
    The aim of these studies was to investigate aspects of the pathogenesis and therapeutic features of recurrent aphthous stomatitis. To examine the involvement of viruses in the aetiology of recurrent aphthous stomatitis, nested PCR, ELISA and IFA assays were employed. PCR investigations showed that HHV-6 DNA was present in 29 per cent of aphthous lesions. Using ELISA, specific IgG antibodies against HHV-6 were detected in 96.7 per cent of all serum samples, with no significant difference between aphthous patients, oral lichen planus patients or control subjects. Specific IgM antibodies against HHV-6 were found at a higher prevalence in aphthous samples than in the two other groups: a significant difference (p = 0.01) was found between sera of aphthous patients and healthy controls. HCMV and VZV DNA were not detected in aphthous samples, and serological findings showed no significant increase in the prevalence of specific IgG antibodies against these two viruses. Serum IgM antibodies against HCMV were positive in a small number of samples, with no difference between groups, and IgM antibody against VZV was not positive in any serum sample. These data fail to show that recurrent aphthous stomatitis can be a manifestation of VZV or HCMV infection or reactivation; however, the possibility of HHV-6 involvement is raised by the present studies. The possible involvement of Mycobacterium paratuberculosis was examined by nested PCR. Although mycobacterial DNA was detected in four biopsy samples from aphthous patients and in none of the oral lichen planus patients or controls, this difference was not significant and more research is necessary to confirm such involvement.
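    The group comparison reported above (IgM seroprevalence, p = 0.01) is the kind of result a 2x2 contingency test produces. As a minimal sketch, with wholly hypothetical counts since the abstract does not report raw numbers, such a comparison could be run as follows:

        # Hypothetical 2x2 contingency test for IgM seroprevalence.
        # Counts are invented for illustration, not taken from the study.
        from scipy.stats import fisher_exact

        # rows: IgM-positive / IgM-negative
        # columns: aphthous patients / healthy controls
        table = [[12, 3],    # hypothetical IgM-positive counts
                 [18, 27]]   # hypothetical IgM-negative counts

        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")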

    Towards Personalized and Human-in-the-Loop Document Summarization

    Full text link
    The ubiquitous availability of computing devices and the widespread use of the internet continuously generate large amounts of data. The amount of available information on any given topic is therefore far beyond humans' capacity to process it properly, causing what is known as information overload. To cope efficiently with large amounts of information and generate content of significant value to users, we need to identify, merge and summarise information. Summaries gather related information into a shorter form that enables answering complicated questions, gaining new insight and discovering conceptual boundaries. This thesis focuses on three main challenges in alleviating information overload with novel summarisation techniques, and further aims to facilitate the analysis of documents to support personalised information extraction. The research issues fall into four areas, covering (i) feature engineering in document summarisation, (ii) traditional static and inflexible summaries, (iii) traditional generic summarisation approaches, and (iv) the need for reference summaries. We propose novel approaches to tackle these challenges by: (i) enabling automatic intelligent feature engineering, (ii) enabling flexible and interactive summarisation, and (iii) utilising intelligent and personalised summarisation approaches. The experimental results demonstrate the efficiency of the proposed approaches compared with state-of-the-art models. We further propose solutions to the information overload problem in different domains through summarisation, covering network traffic data, health data and business process data. Comment: PhD thesis
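    As a concrete, deliberately simple illustration of extractive summarisation (not the thesis's own personalised methods), the sketch below scores sentences by their mean TF-IDF weight and keeps the top-ranked ones in document order:

        # Minimal extractive summariser: rank sentences by mean TF-IDF weight.
        # Illustrative only; the thesis proposes richer, personalised methods.
        from sklearn.feature_extraction.text import TfidfVectorizer
        import numpy as np

        def summarize(text: str, n_sentences: int = 2) -> str:
            sentences = [s.strip() for s in text.split(".") if s.strip()]
            tfidf = TfidfVectorizer().fit_transform(sentences)
            scores = np.asarray(tfidf.mean(axis=1)).ravel()  # mean weight per sentence
            top = sorted(np.argsort(scores)[-n_sentences:])  # keep original order
            return ". ".join(sentences[i] for i in top) + "."

        doc = ("Information overload makes long reports hard to digest. "
               "Summaries condense related content into a shorter form. "
               "They help users answer questions and gain insight quickly.")
        print(summarize(doc, n_sentences=2))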

    Summarization and Evaluation; Where are we today?!

    Get PDF
    PACLIC 21 / Seoul National University, Seoul, Korea / November 1-3, 2007

    Dynamics, Control and Extremum Seeking of the Rectisol Process

    Get PDF
    Over the past decade, gasification-based biorefineries have been studied as part of a global effort to replace fossil fuels in the production of energy and value-added chemicals. An important part of these biorefineries is the acid gas removal unit, which removes CO2 and H2S from the synthesis gas produced by partial oxidation. One of the acid gas removal processes considered in these studies is Rectisol. Rectisol was chosen because it is environmentally friendly and requires lower capital and operating costs than comparable processes. To carry out a dynamic study of the process, a steady-state simulation was first performed in Aspen Plus.
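    Extremum seeking, the technique named in the title, adds a small sinusoidal dither to a plant input and correlates the perturbation with the measured objective to estimate and climb its gradient. A minimal sketch on a hypothetical static quadratic map (not the Rectisol model itself) shows the mechanism:

        # Perturbation-based extremum seeking on a hypothetical static map
        # J(u) = -(u - 2)^2, whose maximum lies at u* = 2. Illustration only;
        # the actual Rectisol study uses a dynamic Aspen Plus model.
        import numpy as np

        def plant(u):
            return -(u - 2.0) ** 2               # objective to maximise (hypothetical)

        dt, a, omega, k = 0.01, 0.1, 5.0, 0.8    # step, dither amplitude/frequency, gain
        u_hat = 0.0                              # initial parameter estimate
        for step in range(20000):
            t = step * dt
            dither = a * np.sin(omega * t)
            y = plant(u_hat + dither)            # perturbed measurement
            grad_est = y * np.sin(omega * t)     # demodulation ~ gradient estimate
            u_hat += k * grad_est * dt           # integrate toward the optimum

        print(f"estimated optimum u = {u_hat:.3f} (true optimum: 2.0)")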

    Production planning in industrial townships modeled as hub location-allocation problems considering congestion in manufacturing plants

    No full text
    In this paper, we develop optimal production plans in industrial townships modeled as hub location-allocation problems (HLAP) that take congestion into account. In the proposed model, hub nodes represent industrial townships in which manufacturing plants and a central distribution warehouse are located, and two objectives are targeted. The first is to minimize total costs, including the costs of hub deployment, factories and warehouses, transportation, and so forth. The second is to minimize the total time products spend in manufacturing plants and warehouses, which are modeled as queues. Because of the ambiguity in estimating the model's parameters, they are treated as fuzzy parameters to bring the model closer to reality. The fuzzy model is then converted into an equivalent crisp model by combining the expected value (EV) and fuzzy chance-constrained programming (FCCP) approaches. Subsequently, the bi-objective crisp model is converted into a single aggregated-objective model. To validate the proposed model, six numerical examples are solved and the sensitivity of the model to changes in its parameters is investigated.
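    Congestion enters the second objective through queueing delay: if, for instance, a plant is modeled as an M/M/1 queue with arrival rate lambda and service rate mu, the expected sojourn time is W = 1/(mu - lambda). A small sketch with hypothetical rates (the paper's model is richer, with fuzzy parameters) shows how such terms would be accumulated:

        # Expected sojourn time at hub plants modeled as M/M/1 queues.
        # Rates are hypothetical; the paper treats parameters as fuzzy.

        def mm1_sojourn(arrival_rate: float, service_rate: float) -> float:
            """W = 1 / (mu - lambda) for a stable M/M/1 queue."""
            if arrival_rate >= service_rate:
                raise ValueError("queue is unstable: lambda must be < mu")
            return 1.0 / (service_rate - arrival_rate)

        # hypothetical hub data: (assigned demand lambda, plant capacity mu)
        hubs = {"hub_A": (8.0, 10.0), "hub_B": (4.0, 5.0)}

        total_wait = sum(mm1_sojourn(lam, mu) for lam, mu in hubs.values())
        print(f"total expected sojourn time across hubs: {total_wait:.2f}")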

    A bi-objective hub location-allocation model considering congestion

    No full text
    In this paper, a new hub location-allocation model is developed that considers congestion and production scheduling. The model assumes that the manufacture and distribution of goods, including raw materials and semi-finished or finished goods, take place only in hubs (such as industrial townships). The main objectives of this study are to minimize total costs and to minimize the sum of waiting times for processing goods in factories and warehouses. To solve the bi-objective model, the goal attainment and LP-metric techniques are combined into a more effective multi-objective technique. Given the exponential complexity of the proposed approach and the nonlinearity of the mathematical model, a number of small and medium-sized problems are solved to demonstrate the effectiveness of the solution methodology.
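    The LP-metric technique mentioned above scalarizes the two objectives by minimizing the weighted distance of a solution's objective vector from the ideal point formed by each objective's individual optimum. A minimal sketch with hypothetical values:

        # LP-metric scalarization of two objectives (hypothetical values).
        # Each objective is normalized by its individual optimum f*_i, then
        # the weighted L_p distance from the ideal point is minimized.

        def lp_metric(f, f_star, weights, p=2):
            """Weighted L_p distance of objective vector f from ideal point f_star."""
            return sum(w * abs((fi - fs) / fs) ** p
                       for w, fi, fs in zip(weights, f, f_star)) ** (1.0 / p)

        f_star = (1200.0, 35.0)          # hypothetical individual optima (cost, wait)
        candidates = {                   # hypothetical solutions: (total cost, total wait)
            "solution_1": (1500.0, 40.0),
            "solution_2": (1300.0, 55.0),
        }

        best = min(candidates,
                   key=lambda s: lp_metric(candidates[s], f_star, (0.5, 0.5)))
        print("preferred compromise:", best)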