
    ExTrA: Explaining architectural design tradeoff spaces via dimensionality reduction

    In software design, guaranteeing the correctness of run-time system behavior while achieving an acceptable balance among multiple quality attributes remains a challenging problem. Moreover, providing guarantees about the satisfaction of those requirements when systems are subject to uncertain environments is even more challenging. While recent developments in architectural analysis techniques can assist architects in exploring the satisfaction of quantitative guarantees across the design space, existing approaches are still limited because they do not explicitly link design decisions to the satisfaction of quality requirements. Furthermore, the amount of information they yield can be overwhelming to a human designer, making it difficult to see the forest for the trees. In this paper we present ExTrA (Explaining Tradeoffs of software Architecture design spaces), an approach to analyzing architectural design spaces that addresses these limitations and provides a basis for explaining design tradeoffs. Our approach employs dimensionality reduction techniques used in machine learning pipelines, such as Principal Component Analysis (PCA) and Decision Tree Learning (DTL), to enable architects to understand how design decisions contribute to the satisfaction of extra-functional properties across the design space. Our results show the feasibility of the approach in two case studies and provide evidence that combining complementary techniques like PCA and DTL is a viable way to facilitate comprehension of tradeoffs in poorly understood design spaces.
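    The following is a minimal, self-contained sketch (not the authors' ExTrA implementation) of how PCA and decision-tree learning can be combined to explain a design space: PCA extracts the dominant tradeoff directions among quality attributes, and a shallow decision tree relates design decisions to position along the leading tradeoff axis. The design decisions, quality attributes, and scikit-learn tooling are assumptions chosen for illustration.

        # Illustrative sketch: PCA over quality attributes + a decision tree over design decisions.
        # All data below is synthetic; names like "replication" or "cache_size" are hypothetical.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.tree import DecisionTreeRegressor, export_text

        rng = np.random.default_rng(0)
        n_configs = 500
        # Hypothetical design decisions (e.g., replication level, cache size, timeout setting).
        decisions = rng.integers(0, 4, size=(n_configs, 3))
        # Hypothetical quality attributes (latency, cost, reliability) driven by the decisions plus noise.
        latency = 10 - 2 * decisions[:, 0] + rng.normal(0, 0.5, n_configs)
        cost = 1 + 3 * decisions[:, 0] + decisions[:, 1] + rng.normal(0, 0.5, n_configs)
        reliability = 0.5 * decisions[:, 2] + rng.normal(0, 0.1, n_configs)
        qualities = np.column_stack([latency, cost, reliability])

        # PCA over the quality-attribute space: the leading components capture the
        # dominant tradeoff directions across the design space.
        pca = PCA(n_components=2)
        scores = pca.fit_transform(qualities)
        print("explained variance ratios:", pca.explained_variance_ratio_)

        # A shallow decision tree then links design decisions to position along the
        # first principal (tradeoff) axis, yielding human-readable rules.
        tree = DecisionTreeRegressor(max_depth=3).fit(decisions, scores[:, 0])
        print(export_text(tree, feature_names=["replication", "cache_size", "timeout"]))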

    Artificial intelligence and model checking methods for in silico clinical trials

    Model-based approaches to safety and efficacy assessment of pharmacological treatments (In Silico Clinical Trials, ISCT) hold the promise to decrease the time and cost of the needed experimentation, reduce the need for animal and human testing, and enable personalised medicine, where treatments tailored for each single patient can be designed before being actually administered. Research in the Virtual Physiological Human (VPH) domain is pursuing this promise by developing quantitative mechanistic models of patient physiology and drugs. Depending on many parameters, such models capture physiological differences among individuals and their different reactions to drug administration. Value assignments to model parameters can be regarded as Virtual Patients (VPs). Thus, as in vivo clinical trials test relevant drugs against suitable candidate patients, ISCT simulate the effect of relevant drugs against VPs covering the possible behaviours that might occur in vivo. Having a population of VPs representative of the whole spectrum of human patient behaviours is a key enabler of ISCT. However, VPH models of practical relevance are typically too complex to be solved analytically or to be formally analysed. Thus, they are usually solved numerically within simulators. In this setting, Artificial Intelligence and Model Checking methods are typically employed. Indeed, a VP coupled with a pharmacological treatment represents a closed-loop model where the VP plays the role of a physical subsystem and the treatment strategy plays the role of the control software. Systems with this structure are known as Cyber-Physical Systems (CPSs). Thus, simulation-based methodologies for CPSs can be employed within personalised medicine in order to compute representative VP populations and to conduct ISCT. In this thesis, we advance the state of the art of simulation-based Artificial Intelligence and Model Checking methods for ISCT in the following directions. First, we present a Statistical Model Checking (SMC) methodology based on hypothesis testing that, given a VPH model as input, computes a population of VPs which is representative (i.e., large enough to represent all relevant phenotypes, with a given degree of statistical confidence) and stratified (i.e., organised as a multi-layer hierarchy of homogeneous sub-groups). Stratification allows ISCT to adaptively focus on specific phenotypes, also supporting prioritisation of patient sub-groups in follow-up in vivo clinical trials. Second, resting on a representative VP population, we design an ISCT aiming at optimising a complex treatment for a patient digital twin, that is, the virtual counterpart of that patient's physiology, defined by means of a set of VPs. Our ISCT employs an intelligent search driving a VPH model simulator to seek the lightest but still effective treatment for the input patient digital twin. Third, to enable interoperability among VPH models defined with different modelling and simulation environments and to increase the efficiency of our ISCT, we also design an optimised simulator driver to speed up backtracking-based search algorithms that drive simulators. Finally, we evaluate the effectiveness of the presented methodologies on state-of-the-art use cases and validate our results on retrospective clinical data.
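    As an illustration of the statistical flavour of such an approach (not the thesis' actual SMC algorithm), the sketch below uses a standard hypothesis-testing stopping rule: candidate virtual patients are sampled until no new phenotype has appeared for N consecutive draws, where N is chosen so that, with confidence 1 - delta, the probability mass of unseen phenotypes is below epsilon. The phenotype function stands in for simulating the VPH model and classifying its behaviour; all names, parameters and distributions are illustrative.

        # Illustrative sketch of a hypothesis-testing stopping rule for building a
        # representative virtual-patient (VP) population. Everything here is a stand-in
        # for the real VPH simulation and phenotype classification.
        import math
        import random

        def required_consecutive_misses(epsilon: float, delta: float) -> int:
            # If each draw hits an unseen phenotype with probability >= epsilon, then
            # N misses in a row occur with probability <= (1 - epsilon)**N <= delta.
            return math.ceil(math.log(delta) / math.log(1.0 - epsilon))

        def phenotype(vp_params):
            # Hypothetical placeholder: bucket parameter values into a coarse phenotype label.
            return tuple(round(p, 1) for p in vp_params)

        def build_vp_population(epsilon=0.01, delta=0.05, dim=3, seed=0):
            rng = random.Random(seed)
            needed = required_consecutive_misses(epsilon, delta)
            seen, population, misses = set(), [], 0
            while misses < needed:
                vp = [rng.uniform(0.0, 1.0) for _ in range(dim)]  # sample a candidate VP
                label = phenotype(vp)
                if label in seen:
                    misses += 1
                else:
                    seen.add(label)
                    population.append(vp)
                    misses = 0  # a new phenotype resets the counter
            return population

        pop = build_vp_population()
        print(f"{len(pop)} representative virtual patients (distinct phenotypes) found")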

    Temporal decomposition and semantic enrichment of mobility flows

    Mobility data has increasingly grown in volume over the past decade as localisation technologies for capturing mobility flows have become ubiquitous. Novel analytical approaches for understanding and structuring mobility data are now required to support the back end of a new generation of space-time GIS systems. This data has become increasingly important as GIS is now an essential decision support platform in many domains that use mobility data, such as fleet management, accessibility analysis and urban transportation planning. This thesis applies the machine learning method of probabilistic topic modelling to decompose and semantically enrich mobility flow data. This process annotates mobility flows with semantic meaning by fusing them with geographically referenced social media data. This thesis also explores the relationship between causality and correlation, as well as the predictability of semantic decompositions obtained during a case study using a real mobility dataset.
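    A minimal sketch of the kind of pipeline described above, assuming scikit-learn as tooling: each mobility flow is paired with a bag-of-words "document" built from geo-referenced social-media posts, and Latent Dirichlet Allocation assigns each flow a topic mixture that serves as its semantic annotation. The documents, flow descriptions and topic count below are invented for illustration.

        # Illustrative sketch: topic modelling over per-flow social-media "documents".
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        flow_documents = [
            "stadium match football fans beer",         # hypothetical flow: city centre -> arena
            "office meeting commute coffee deadline",   # hypothetical flow: suburb -> business park
            "museum gallery tourists exhibition park",  # hypothetical flow: station -> old town
            "football arena tickets crowd",             # hypothetical flow: park-and-ride -> arena
        ]

        vectorizer = CountVectorizer()
        counts = vectorizer.fit_transform(flow_documents)

        # Two topics as an example; each flow gets a topic mixture (its semantic annotation).
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        flow_topic_mix = lda.fit_transform(counts)

        terms = vectorizer.get_feature_names_out()
        for k, component in enumerate(lda.components_):
            top = component.argsort()[-4:][::-1]
            print(f"topic {k}:", [terms[i] for i in top])
        print("per-flow topic mixtures:\n", flow_topic_mix.round(2))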

    From Snow to Flow: Exploring Relationships Between SNOTEL Ablation Curves and Peak Streamflow Timing

    Predictions of peak streamflow timing in snow-dominated river systems are essential for proper water management and recreational availability. This study evaluates historic snow and streamflow data from 14 river basins throughout Idaho to investigate the relationship between snowmelt timing at SNOw TELemetry (SNOTEL) sites and peak streamflow within each basin. The goal is to provide a simple operational tool that estimates the probability of peak streamflow occurring within a certain number of days as ablation progresses from 0 to 100% melted. For individual basins we evaluate meltout levels in increments of 10% from each SNOTEL site and use a probabilistic modeling approach to create cumulative distribution function (CDF) curves which illustrate the probability of peak streamflow occurring within a given number of days from the date at which the SNOTEL site reaches each meltout percentage. Results from the CDF probability model graphs also provide basic information about basin-specific anecdotal indices or “rules of thumb” for when peak streamflow will occur based on the average percent meltout at the time of peak streamflow. Compiled historical datasets with summary statistics for 54 SNOTEL-streamgage pairs, covering multiple snowmelt and streamflow metrics, add to the body of knowledge of hydrologic processes for basins throughout Idaho. In addition, our analysis reveals that melt timing has a greater influence on the timing of peak streamflow than the timing or magnitude of maximum accumulation (max SWE), and that larger snowpacks (greater max SWE) often have fewer lag days between each meltout percentage and peak streamflow.
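    A minimal sketch (not the study's code) of the CDF construction described above: for one hypothetical SNOTEL-streamgage pair, compute the lag in days between the date each meltout percentage is reached and the date of peak streamflow in each year, then build an empirical CDF of those lags per meltout level. All dates below are invented.

        # Illustrative sketch: empirical CDFs of lag days from meltout percentage to peak flow.
        import numpy as np

        # Hypothetical day-of-year values per year: date each meltout threshold is
        # reached (rows = years, columns = 10%..100% melted) and the peak-flow date.
        meltout_doy = np.array([
            [95, 102, 108, 113, 118, 123, 128, 133, 139, 146],
            [88,  96, 103, 109, 115, 120, 126, 131, 137, 144],
            [101, 107, 112, 117, 122, 127, 132, 138, 144, 152],
        ])
        peak_flow_doy = np.array([140, 133, 147])

        meltout_levels = np.arange(10, 101, 10)
        # Lag per year and threshold; may be negative once the peak precedes a meltout date.
        lags = peak_flow_doy[:, None] - meltout_doy

        def empirical_cdf(sample):
            """Return sorted lag values and P(lag <= value) for an empirical CDF curve."""
            x = np.sort(sample)
            p = np.arange(1, len(x) + 1) / len(x)
            return x, p

        for level, col in zip(meltout_levels, lags.T):
            x, p = empirical_cdf(col)
            # e.g. the value at p = 2/3 says: in 2 of 3 years the peak came within x days.
            print(f"{level:3d}% melted: P(peak within {int(x[1])} days) = {p[1]:.2f}")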

    Exploring digital, augmented and data-driven modelling tools in product engineering and design

    Tools are indispensable for all diligent professional practice. New concepts and possibilities for paradigm shifts are emerging with recent computational developments in digital tools. However, new tools built on key concepts such as “Big Data”, “Accessibility” and “Algorithmic Design” are fundamentally changing the input and position of the Product Engineer and Designer. After a contextual introduction, this dissertation begins by extracting three pivotal criteria from an analysis of the state of the art in Product Design Engineering. For each criterion, the most relevant emerging and paradigm-shifting concepts are explored and then positioned and compared within the Product Lifecycle Management wheel, where potential risks and gaps are identified to be explored in the experimental part. The empirical experiments are of two kinds: the first consists of case studies from architecture and urban planning, drawn from the student's professional experience, which served as a pretext and inspiration for the experiments carried out for Product Design Engineering; these comprise, first, a set of isolated explorations and analyses, second, a hypothetical formulation derived from them, and finally a reflective section that culminates in a list of risks and changes inferred from all the previous work. The urgency of reflecting on what will change in that role and position, and on what ethical and/or conceptual reformulations the profession needs in order to maintain its intellectual integrity and, ultimately, to survive, is abundantly clear. (Mestrado em Engenharia e Design de Produto)

    Development of crystallographic methods for phasing highly modulated macromolecular structures

    Pathologies that result in highly modulated intensities in macromolecular crystal structures pose a challenge for structure solution. To address this issue, two studies have been performed: a theoretical study of one of these pathologies, translational non-crystallographic symmetry (tNCS), and a practical study of paradigms of highly modulated macromolecular structures, coiled-coils. tNCS is a structural situation in which multiple, independent copies of a molecular assembly are found in similar orientations in the crystallographic asymmetric unit. Structure solution is problematic because the intensity modulations caused by tNCS make the intensity distribution differ from a Wilson distribution. If the tNCS is properly detected and characterized, expected intensity factors for each reflection that model the modulations observed in the data can be refined against a likelihood function to account for the statistical effects of tNCS. In this study, a curated database of 80,482 protein structures from the PDB was analysed to investigate how tNCS manifests in the Patterson function. These studies informed the algorithm for detection of tNCS, which includes a method for detecting the tNCS order in any commensurate modulation. In the context of automated structure solution pipelines, the algorithm generates a ranked list of possible tNCS associations in the asymmetric unit, which can be explored to efficiently maximize the probability of structure solution. Coiled-coils are ubiquitous protein folding motifs, present in a wide range of proteins, that consist of two or more α-helices wrapped around each other to form a supercoil. Despite the apparent simplicity of their architecture, solution by molecular replacement is challenging due to the helical irregularities found in these domains, their tendency to form fibers, the large dimensions of their typically anisometric asymmetric units, and low-resolution, anisotropic diffraction. In addition, the internal symmetry of the helices and their alignment in preferential directions give rise to systematic overlap of Patterson vectors, a Patterson map that indicates tNCS is present, and intensity modulations similar to those in true tNCS. In this study, we have explored fragment phasing on a pool of 150 coiled-coils with ARCIMBOLDO_LITE, an ab initio phasing approach that combines fragment location with Phaser and density modification and autotracing with SHELXE. The results have been used to identify limits and bottlenecks in coiled-coil phasing, which have been addressed in a specific mode for solving coiled-coils, allowing the solution of 95% of the test set and four previously unknown structures, and extending the resolution limit from 2.5 Å to 3.0 Å.
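    A toy sketch of why tNCS is visible in the Patterson function (not the detection algorithm developed in the thesis): the Patterson map is the Fourier transform of the measured intensities, i.e. the autocorrelation of the electron density, so two copies of a molecule related by a translation t produce a strong off-origin Patterson peak at t. The 1-D density, motif and grid below are invented for illustration; numpy is the assumed tooling.

        # Toy 1-D illustration: tNCS translation appears as an off-origin Patterson peak.
        import numpy as np

        n = 256
        density = np.zeros(n)
        motif = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)  # a small Gaussian "molecule"
        t = 90                                                 # hypothetical tNCS translation (grid units)
        for start in (40, 40 + t):                             # two copies related by t
            density[start:start + motif.size] += motif

        structure_factors = np.fft.fft(density)
        intensities = np.abs(structure_factors) ** 2           # what an experiment measures

        # Patterson map: the phase-free Fourier transform of the intensities,
        # equivalently the autocorrelation of the density.
        patterson = np.real(np.fft.ifft(intensities))

        # Mask the origin region (intra-molecular vectors) and report the strongest
        # remaining peak in the unique half of the map.
        patterson[:motif.size] = 0
        peak = int(np.argmax(patterson[: n // 2]))
        print(f"strongest off-origin Patterson peak at u = {peak} (tNCS translation t = {t})")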