
    A pivotal-based approach for enterprise business process and IS integration

    A company must be able to describe and react to any endogenous or exogenous event. Such flexibility can be achieved through business process management (BPM). Nevertheless, a BPM approach highlights complex relations between the business and IT domains. A non-alignment is exposed between heterogeneous models: this is the 'business-IT gap' as described in the literature. Drawing on concepts from business engineering and information systems driven by models and IT, we define a generic approach ensuring multi-view consistency. Its role is to maintain and provide all information related to the structure and semantics of models. By allowing a transformed model to be fully returned to its source, in the sense of reverse engineering, our platform enables synchronisation between the analysis model and the implementation model.

    A framework for integrating and transforming between ontologies and relational databases

    Bridging the gap between ontologies, expressed in the Web Ontology Language (OWL), and relational databases is a necessity for realising the Semantic Web vision. Relational databases are considered a good solution for storing and processing ontologies with a large amount of data. Moreover, the vast majority of current websites store data in relational databases, and therefore being able to generate ontologies from such databases is important to support the development of the Semantic Web. Most of the work concerning this topic has either (1) extracted an OWL ontology from an existing relational database that represents the relational schema as exactly as possible, using a limited range of OWL modelling constructs, or (2) extracted a relational database from an existing OWL ontology that represents the OWL ontology as closely as possible. By way of contrast, this thesis proposes a general framework for transforming and mapping between ontologies and databases, via an intermediate low-level Hyper-graph Data Model. The transformation between relational and OWL schemas is expressed using directional Both-As-View mappings, allowing a precise definition of the equivalence between the two schemas; hence data can be mapped back and forth between them. In particular, for a given OWL ontology, we interpret the expressive axioms either as triggers, conforming to the Open-World Assumption, that perform a forward-chaining materialisation of inferred data, or as constraints, conforming to the Closed-World Assumption, that perform consistency checking. With regard to extracting ontologies from relational databases, we transform a relational database into an exact OWL ontology, then enhance it with rich OWL 2 axioms, using a combination of schema and data analysis. We then apply machine learning algorithms to rank the suggested axioms based on past users' relevance judgements.
A proof-of-concept tool, OWLRel, has been implemented, and a number of well-known ontologies and databases have been used to evaluate the approach and the OWLRel tool.
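The table-to-class style of extraction described above can be sketched in a few lines. The snippet below is a hypothetical illustration, not the OWLRel implementation or its Both-As-View mappings: each table becomes an OWL class, each plain column a data property, and each foreign key an object property whose range is the referenced table's class.

```python
# Hypothetical sketch of a direct relational-to-OWL extraction:
# table -> class, column -> data property, foreign key -> object property.
def schema_to_owl(tables):
    """tables: {name: {"columns": [...], "fks": {column: target_table}}}"""
    axioms = []
    for table, meta in tables.items():
        axioms.append(f"Declaration(Class(:{table}))")
        for col in meta["columns"]:
            target = meta.get("fks", {}).get(col)
            if target:
                # A foreign key links two classes via an object property.
                axioms.append(f"ObjectPropertyDomain(:{table}_{col} :{table})")
                axioms.append(f"ObjectPropertyRange(:{table}_{col} :{target})")
            else:
                # A plain column becomes a data property of the class.
                axioms.append(f"DataPropertyDomain(:{table}_{col} :{table})")
    return axioms

axioms = schema_to_owl({
    "employee": {"columns": ["name", "dept_id"],
                 "fks": {"dept_id": "department"}},
    "department": {"columns": ["title"], "fks": {}},
})
```

A real extraction would also handle keys, nullability and column datatypes, which is where the richer OWL 2 axioms and the combined schema/data analysis mentioned above come in.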

    Applicability of climate-based daylight modelling

    This PhD thesis evaluated the applicability of Climate-Based Daylight Modelling (CBDM) as it is presently done. The objectives stated in this thesis aimed at broadly assessing applicability by looking at multiple aspects: (i) the way CBDM is used by expert researchers and practitioners; (ii) how state-of-the-art simulation techniques compare to each other and how they are affected by uncertainty in input factors; (iii) how the simulated results compare with data measured in real occupied spaces. The answers obtained from a web-based questionnaire portrayed a variety of workflows used by different people to perform similar, if not the same, evaluations. At the same time, the inter-model comparison performed to compare the existing simulation techniques revealed significant differences in the way the sky and the sun are recreated by each technique. The results also demonstrated that some of the annual daylight metrics commonly required in building guidelines are sensitive to the choice of simulation tool, as well as other input parameters, such as climate data, orientation and material optical properties. All the analyses were carried out on four case study spaces, remodelled from existing classrooms that were the subject of a concurrent research study that monitored their interior luminous conditions. A large database of High Dynamic Range images was collected for that study, and the luminance data derived from these images could be used in this work to explore a new methodology to calibrate climate-based daylight models. The results collected and presented in this dissertation illustrate how, at the time of writing, there is not a single established common framework to follow when performing CBDM evaluations. Several different techniques coexist, but each of them is characterised by a specific domain of applicability.
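One of the annual daylight metrics mentioned above can serve as a concrete example. The sketch below, with assumed occupancy hours and an assumed 300 lux target rather than the values of any specific guideline, computes Daylight Autonomy: the fraction of occupied hours in which a sensor point receives at least a target illuminance.

```python
# Illustrative sketch (assumed inputs, not a specific CBDM tool):
# Daylight Autonomy from hourly simulated illuminance values in lux.
def daylight_autonomy(illuminance_lux, threshold_lux=300.0):
    """Fraction of occupied hours at or above the target illuminance."""
    if not illuminance_lux:
        raise ValueError("no occupied hours supplied")
    hit = sum(1 for e in illuminance_lux if e >= threshold_lux)
    return hit / len(illuminance_lux)

# Toy series: half of the occupied hours meet the 300 lux target.
da = daylight_autonomy([150, 250, 300, 500, 800, 100])
```

In a real CBDM workflow the illuminance series would come from an annual simulation driven by climate data, which is precisely why the metric is sensitive to the choice of tool, climate file and material properties noted above.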

    Planning for an unknown future: incorporating meteorological uncertainty into predictions of the impact of fires and dust on US particulate matter

    Exposure to particulate matter (PM) pollution has well-documented health impacts and is regulated by the United States (U.S.) Environmental Protection Agency (EPA). In the U.S., wildfire smoke and wind-blown dust are significant natural sources of PM pollution. This dissertation shows how the environmental conditions that drive wildfires and wind-blown dust are likely to change in the future and what these changes imply for future PM concentrations. The first component of this dissertation shows how human ignitions and environmental conditions influence U.S. wildfire activity. Using wildfire burn area and ignition data, I find that in both the western and southeastern U.S., annual lightning- and human-ignited wildfire burn area have similar relationships with key environmental conditions (temperature, relative humidity, and precipitation). These results suggest that burn area for human- and lightning-ignited wildfires will be similarly impacted by climate change. Next, I quantify how the environmental conditions that drive wildfire activity are likely to change in the future under different climate scenarios. Coupled Model Intercomparison Project phase 5 (CMIP5) models agree that western U.S. temperatures will increase in the 21st century for representative concentration pathways (RCPs) 4.5 and 8.5. I find that, averaged over seasonal and regional scales, other environmental variables demonstrated to be relevant to fuel flammability and aridity, such as precipitation, evaporation, relative humidity, root zone soil moisture, and wind speed, can be used to explain historical variability in wildfire burn area as well as or better than temperature. My work demonstrates that when objectively selecting environmental predictors using Lasso regression, temperature is not always selected, but that this varies by western U.S. ecoregion.
When temperature is not selected, the sign and magnitude of future changes in burn area become less certain, highlighting that predicted changes in burn area are sensitive to the environmental predictors chosen. Smaller increases in future wildfire burn area are estimated whenever and wherever the importance of temperature as a predictor is reduced. The second component of this dissertation examines how the environmental conditions that drive fine dust emissions and concentrations in the southwestern U.S. change in the future. I examine environmental conditions that influence dust emissions, including temperature, vapor pressure deficit, relative humidity, precipitation, soil moisture, wind speed, and leaf area index (LAI). My work quantifies fine dust concentrations in the U.S. southwest dust season, March through July, using fine iron as a dust proxy, quantified with measurements from the Interagency Monitoring of PROtected Visual Environments (IMPROVE) network between 1995 and 2015. I show that the largest contribution to the spread in future dust concentration estimates comes from the choice of environmental predictor used to explain observed variability. The spread between estimates from different environmental predictors can be larger than the spread between climate scenarios or the intermodel spread. Based on linear estimates of how dust concentrations respond to changes in LAI, CMIP5-estimated increases in LAI would result in reduced dust concentrations in the future. However, when I objectively select environmental predictors of dust concentrations using Lasso regression, LAI is not selected; other variables are preferred. When using a linear combination of objectively selected environmental variables, I estimate that future southwest dust season mean concentrations will increase by 0.24 μg m−3 (12%) by the end of the 21st century for RCP 8.5. This estimated increase in fine dust concentration is driven by decreases in relative humidity, precipitation, and soil moisture, and is buffered by decreased wind speeds.
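The objective predictor selection described in both components of the dissertation can be illustrated with a minimal Lasso fit. The sketch below is illustrative only, using synthetic data in place of the burn-area and IMPROVE records: the L1 penalty drives the coefficient of an uninformative predictor exactly to zero while retaining the informative one.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso minimising (1/2n)||y - Xw||^2 + lam*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]      # residual excluding feature j
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft threshold
    return w

# Synthetic stand-ins for environmental predictors (not real data):
rng = np.random.default_rng(0)
n = 1000
temperature = rng.normal(size=n)        # informative predictor, by construction
wind_speed = rng.normal(size=n)         # uninformative predictor here
y = 2.0 * temperature + 0.1 * rng.normal(size=n)

X = np.column_stack([temperature, wind_speed])
X = (X - X.mean(axis=0)) / X.std(axis=0)
w = lasso_cd(X, y - y.mean(), lam=0.3)  # wind_speed coefficient shrinks to zero
```

Because the soft-thresholding step can set a coefficient exactly to zero, Lasso performs variable selection as a by-product of the fit, which is what makes the set of retained predictors, such as whether temperature survives, informative in itself.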

    A Development Method for the Conceptual Design of Multi-View Modeling Tools with an Emphasis on Consistency Requirements

    The main objective of this thesis is to bridge the gap between modeling method experts on the one side and tool developers on the other. More precisely, the focus is on the specification of requirements for multi-view modeling tools. In this regard, the thesis introduces a methodological approach that supports the specification of conceptual designs for multi-view modeling tools in a stepwise manner: the MuVieMoT approach. MuVieMoT utilizes generic multi-view modeling concepts and the model-driven engineering paradigm to establish an overarching specification of multi-view modeling tools with an emphasis on consistency requirements. The approach builds on and extends the theoretical foundation of metamodeling and multi-view modeling: generic multi-view modeling concepts, integrated multi-view modeling approaches, and possibilities for formalized modeling method specifications. The applicability and utility of MuVieMoT are evaluated using an illustrative scenario, thereby specifying a conceptual design for a multi-view modeling tool for the Semantic Object Model enterprise modeling method. The thesis moreover introduces the MuVieMoT modeling environment, enabling the efficient application of the approach as well as the model-driven development of initial multi-view modeling tools based on the conceptual models created with MuVieMoT. Consequently, the approach fosters an intersubjective and unambiguous understanding of the tool requirements between method experts and tool developers.

    Advanced perfusion quantification methods for dynamic PET and MRI data modelling

    The functionality of tissues is guaranteed by the capillaries, which supply the microvascular network, providing a considerable surface area for exchanges between blood and tissues. Microcirculation is affected by any pathological condition, and any change in the blood supply can be used as a biomarker for the diagnosis of lesions and the optimisation of treatment. Nowadays, a number of techniques for the study of perfusion in vivo and in vitro are available. Among the several imaging modalities developed for the study of microcirculation, the analysis of the tissue kinetics of intravenously injected contrast agents or tracers is the most widely used technique. Tissue kinetics can be studied using different modalities: the positive signal enhancement in computed tomography, in dynamic contrast-enhanced ultrasound imaging and in T1-weighted MRI; the negative enhancement of the T2*-weighted MRI signal in dynamic susceptibility contrast imaging; or, finally, the uptake of radiolabelled tracers in dynamic PET imaging. Here we will focus on the perfusion quantification of dynamic PET and MRI data. The kinetics of the contrast agent (or the tracer) can be analysed visually to define qualitative criteria but, traditionally, quantitative physiological parameters are extracted through the implementation of mathematical models. Serial measurements of the concentration of the tracer (or of the contrast agent) in the tissue of interest, together with knowledge of an arterial input function, are necessary for the calculation of blood flow or perfusion rates from the wash-in and/or wash-out kinetic rate constants. The results depend on the acquisition conditions (type of imaging device, imaging mode, frequency and total duration of the acquisition), the type of contrast agent or tracer used, the data pre-processing (motion correction, attenuation correction, conversion of signal into concentration) and the data analysis method.
As for MRI, dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a non-invasive imaging technique that can be used to measure properties of the tissue microvasculature. It is sensitive to differences in blood volume and vascular permeability that can be associated with tumour angiogenesis. DCE-MRI has been investigated for a range of clinical oncologic applications (breast, prostate, cervix, liver, lung, and rectum), including cancer detection, diagnosis, staging, and assessment of treatment response. Tumour microvascular measurements by DCE-MRI have been found to correlate with prognostic factors (such as tumour grade, microvessel density, and vascular endothelial growth factor expression) and with recurrence and survival outcomes. Furthermore, DCE-MRI changes measured during treatment have been shown to correlate with outcome, suggesting a role as a predictive marker. The accuracy of DCE-MRI relies on the ability to model the pharmacokinetics of an injected contrast agent using the signal intensity changes on sequential magnetic resonance images. DCE-MRI data are usually quantified with the application of the pharmacokinetic two-compartment Tofts model (also known as the standard model), which represents the system with the plasma and tissue (extravascular extracellular space) compartments and the contrast agent exchange rates between them. This model assumes a negligible contribution from the vascular space and considers the system in what is known as the fast exchange limit, assuming infinitely fast transcytolemmal water exchange kinetics. In general, the number of compartments, as well as any assumption about them, depends on the properties of the contrast agent used (mainly gadolinium-based) together with the tissue physiology or pathology studied. For this reason, the choice of model is crucial in the analysis of DCE-MRI data.
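For reference, the standard Tofts model summarised above expresses the tissue concentration as a convolution of the arterial input function with an exponential residue function:

```latex
C_t(t) = K^{\mathrm{trans}} \int_0^t C_p(\tau)\, e^{-k_{ep}(t-\tau)}\,\mathrm{d}\tau,
\qquad k_{ep} = \frac{K^{\mathrm{trans}}}{v_e},
```

where $C_p$ is the plasma concentration of the contrast agent, $K^{\mathrm{trans}}$ the transfer constant and $v_e$ the extravascular extracellular volume fraction; the extended Tofts model adds a vascular term $v_p\,C_p(t)$ to account for a non-negligible vascular space.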
The value of PET in clinical oncology has been demonstrated with studies in a variety of cancers, including colorectal carcinomas, lung tumours, head and neck tumours, primary and metastatic brain tumours, breast carcinoma, lymphoma, melanoma, bone cancers, and other soft-tissue cancers. PET studies of tumours can be performed for several reasons, including the quantification of tumour perfusion, the evaluation of tumour metabolism, and the tracing of radiolabelled cytostatic agents. In particular, the kinetic analysis of PET imaging has shown, in the past few years, an increasing value in tumour diagnosis, as well as in tumour therapy, by providing additional indicative parameters. Many authors have shown the benefit of kinetic analysis of anticancer drugs, after labelling with a radionuclide, in measuring the specific therapeutic effect, bringing to light the feasibility of applying kinetic analysis to the dynamic acquisition. Quantification methods can involve visual analysis together with compartmental modelling and can be applied to a wide range of different tracers. The increased glycolysis in most malignancies makes 18F-FDG-PET the most common diagnostic method used in tumour imaging. However, metabolic alterations in the target tissue can depend on many other factors. For example, most types of cancer are characterised by increased choline transport and by the overexpression of choline kinase in highly proliferating cells in response to an enhanced demand for phosphatidylcholine (prostate, breast, lung, ovarian and colon cancers). This effect can be imaged with choline-based tracers such as 18F-fluoromethylcholine (18F-FCH) or the even more stable 18F-D4-Choline. Cellular proliferation is also imaged with 18F-fluorothymidine (18F-FLT), which is trapped within the cytosol after being monophosphorylated by thymidine kinase-1 (TK1), a principal enzyme in the salvage pathway of DNA synthesis.
18F-FLT has been found to be useful for noninvasive assessment of the proliferation rate of several types of cancer and has shown high reproducibility and accuracy in breast and lung cancer tumours. The aim of this thesis is the perfusion quantification of dynamic PET and MRI data of patients with lung, brain, liver, prostate and breast lesions through the application of advanced models. This study covers a wide range of imaging methods and applications, presenting a novel combination of MRI-based perfusion measures with PET kinetic modelling parameters in oncology. It assesses the applicability and stability of perfusion quantification methods that are not currently used in routine clinical practice. The main achievements of this work include: 1) the assessment of the stability of perfusion quantification of 18F-D4-Choline and 18F-FLT dynamic PET data in lung and liver lesions, respectively (first applications in the literature); 2) the development of a model selection procedure in the analysis of DCE-MRI data of primary brain tumours (first application of the extended shutter speed model); 3) the multiparametric analysis of PET- and MRI-derived perfusion measurements of primary brain tumour and breast cancer, together with the integration of immunohistochemical markers in the prediction of breast cancer subtype (analysis of data acquired on a hybrid PET/MRI scanner). The thesis is structured as follows: - Chapter 1 is an introductory chapter on cancer biology. Basic concepts, including the causes of cancer, cancer hallmarks and available cancer treatments, are described in this first chapter. It also introduces the basics of brain, breast, prostate and lung cancers (the lesions analysed in this work). - Chapter 2 is about Positron Emission Tomography. After a brief introduction to the basics of PET imaging, together with data acquisition and reconstruction methods, the chapter focuses on PET in the clinical setting.
In particular, it shows the quantification techniques for static and dynamic PET data and my results from the application of graphical methods, spectral analysis and compartmental models to dynamic 18F-FDG, 18F-FLT and 18F-D4-Choline PET data of patients with breast cancer, lung cancer and hepatocellular carcinoma. - Chapter 3 is about Magnetic Resonance Imaging. After a brief introduction to the basics of MRI, the chapter focuses on the quantification of perfusion-weighted MRI data. In particular, it shows the pharmacokinetic models for the quantification of dynamic contrast-enhanced MRI data and my results from the application of the Tofts, extended Tofts, shutter speed and extended shutter speed models to a dataset of patients with brain glioma. - Chapter 4 introduces multiparametric imaging techniques, in particular the combined PET/CT and hybrid PET/MRI systems. The last part of the chapter shows the application of perfusion quantification techniques in a multiparametric study of breast tumour patients, who simultaneously underwent DCE-MRI and 18F-FDG PET on a hybrid PET/MRI scanner. It then presents the results of a predictive study on the same dataset of breast tumour patients, integrated with immunohistochemical markers, and the results of a multiparametric study of DCE-MRI and 18F-FCM brain data acquired separately on a PET/CT scanner and an MR scanner. Finally, it shows the application of kinetic analysis in a radiomic study of patients with prostate cancer.
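As an illustration of the compartmental modelling used for dynamic PET quantification, the sketch below integrates a one-tissue compartment model, the simplest kinetic model, with illustrative (not patient-derived) rate constants and input function: the tissue concentration follows dC_t/dt = K1·C_p(t) − k2·C_t(t).

```python
# Illustrative one-tissue compartment model for dynamic PET, integrated
# with a simple Euler scheme from an assumed arterial input function C_p.
def one_tissue_model(cp, k1, k2, dt):
    """Return the tissue time-activity curve for input function samples cp."""
    ct, out = 0.0, []
    for cp_t in cp:
        ct += dt * (k1 * cp_t - k2 * ct)  # dCt/dt = K1*Cp - k2*Ct
        out.append(ct)
    return out

# Toy input function: a brief bolus followed by a lower washout plateau.
dt = 1.0  # seconds per sample (assumed)
cp = [10.0] * 30 + [2.0] * 270
ct = one_tissue_model(cp, k1=0.1, k2=0.05, dt=dt)
```

Fitting K1 and k2 to a measured time-activity curve, given a measured arterial input function, is what yields the perfusion-related parameters discussed above; richer two-tissue models add exchange with a second (for example, metabolised or trapped) compartment.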