28 research outputs found

    A Hybrid Approach to Logic Evaluation

    In this thesis, we contribute the hybrid approach: a means of combining the practical advantages of feature-rich logic evaluation in the cloud with the performance benefits of hand-written, optimized native code. In the first part of our hybrid approach, we introduce a cloud-based distribution for logic programs, which may be deployed as a service in standard cloud environments across cheap commodity hardware. Modern systems are in the cloud; while distributed logic solvers exist, they are highly specialized and require expensive, resource-intensive hardware infrastructures. Our technique achieves a fully automatic synthesis of cloud infrastructure for logic programs, and includes a range of practical features not present in existing distributed logic solvers. We show that an implementation of the distribution scales effectively in real-world cloud environments when compared against a distribution over the cores of a single machine, and that our multi-node distribution may be combined with existing multi-threaded techniques to mitigate the network communication cost incurred by distribution. In the second part of our hybrid approach, we introduce extra-logical algorithms that achieve performance for logic programs that would not be possible within a bottom-up logic evaluation. Modern systems must deliver high performance on big data; however, even the most powerful logic engines, distributed or otherwise, can be beaten by hand-written code on particular problems. We give a novel implementation of a system for the high-impact problem of sink-reachability, designed so that its algorithms may be used from logic programs. A thorough empirical evaluation across a range of large-scale, real-world datasets shows that our system outperforms the current state of the art for the sink-reachability problem in all cases. Our hybrid approach addresses the two major deficiencies of modern logic systems, providing a practical means of evaluating logic in distributed cloud-based environments, while offering performance gains for specific high-impact problems that would not be possible using logic programming alone.
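    For reference, the sink-reachability problem in the second part asks, for every vertex of a directed graph, which sinks (vertices with no outgoing edges) it can reach. The following Python sketch is our illustration of the problem only, not the thesis's system: it traverses the reversed graph from each sink, so every vertex visited can reach that sink.

        from collections import defaultdict, deque

        def sink_reachability(edges):
            """Map each vertex to the set of sinks it can reach."""
            succ, pred, vertices = defaultdict(set), defaultdict(set), set()
            for u, v in edges:
                succ[u].add(v)
                pred[v].add(u)
                vertices.update((u, v))
            sinks = [v for v in vertices if not succ[v]]
            reach = defaultdict(set)
            for s in sinks:
                seen, queue = {s}, deque([s])
                while queue:  # reverse BFS: everything seen reaches s
                    v = queue.popleft()
                    reach[v].add(s)
                    for u in pred[v] - seen:
                        seen.add(u)
                        queue.append(u)
            return dict(reach)

        # 1 -> 2 -> 4 and 1 -> 3; the sinks are 3 and 4.
        print(sink_reachability([(1, 2), (2, 4), (1, 3)]))
        # e.g. {1: {3, 4}, 2: {4}, 3: {3}, 4: {4}}

    A per-sink traversal like this is quadratic in the worst case; practical systems first condense strongly connected components, since all vertices in a component reach exactly the same sinks.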

    Concurrent Execution of Mutually Exclusive Alternatives

    We examine the task of concurrently computing alternative solutions to a problem. We restrict our interest to the case where only one solution is needed; in this case we need some rule for selecting between the solutions. We use "fastest first", where the first successful alternative is selected. For problems where the required execution time is unpredictable, this method can show substantial execution-time performance increases. These increases depend on the mean execution time of the alternatives, the fastest execution time, the overhead involved in concurrent computation, and the overhead of selecting and deleting alternatives. Rather than using the traditional approach of multiple computers cooperating on the solution to a problem, this method achieves a solution competitively. Among the problems with exploring multiple alternatives in parallel are side effects and a combinatorial explosion in the amount of state that must be preserved. These are solved by process management and an application of "copy-on-write" virtual memory management. The side effects resulting from interprocess communication are handled by a specialized message layer that interacts with process management. We show how the scheme for parallel execution can be applied to several application areas: distributed execution of recovery blocks, OR-parallelism in Prolog, and polynomial root-finding.
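    The "fastest first" rule maps naturally onto a modern futures API. The sketch below is our reconstruction in Python, not the paper's original mechanism, which relied on process management and copy-on-write virtual memory:

        import concurrent.futures as cf

        def fastest_first(alternatives, *args):
            """Run the alternatives concurrently; return the first
            successful result and discard ("delete") the rest."""
            # Separate processes approximate the paper's isolation of
            # side effects; alternatives must be module-level functions.
            with cf.ProcessPoolExecutor() as pool:
                futures = [pool.submit(alt, *args) for alt in alternatives]
                for fut in cf.as_completed(futures):
                    try:
                        result = fut.result()
                    except Exception:
                        continue  # this alternative failed; await another
                    for other in futures:
                        other.cancel()  # best effort: only queued work stops
                    return result
            raise RuntimeError("no alternative succeeded")

    Note that the pool's shutdown still waits for losing alternatives that are already running; killing them outright, as the original scheme does, requires lower-level process management.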

    Similarity Problems in High Dimensions


    A Study of Adaptation Mechanisms for Simulation Algorithms

    The performance of a program can sometimes improve greatly if the features of the input the program is supposed to process, the actual operating parameters it is supposed to work with, or the specific environment it is to run on are known in advance. However, this information is typically not available until too late in the program's operation to take advantage of it. This is especially true for simulation algorithms, which are sensitive to this late-arriving information, and whose role in the solution of decision-making, inference, and valuation problems is crucial. To overcome this limitation we need to provide the flexibility for a program to adapt its behaviour to late-arriving information once it becomes available. In this thesis, I study three adaptation mechanisms: run-time code generation, model-specific (quasi) Monte Carlo sampling, and dynamic computation offloading, and evaluate their benefits on Monte Carlo algorithms. First, run-time code generation is studied in the context of Monte Carlo algorithms for time-series filtering, in the form of the Input-Adaptive Kalman filter, a dynamically generated state estimator for non-linear, non-Gaussian dynamic systems. The second adaptation mechanism consists of the application of the functional-ANOVA decomposition to generate model-specific QMC samplers, which can then be used to improve Monte Carlo-based integration. The third adaptation mechanism treated here, dynamic computation offloading, is applied to wireless communication management, where network conditions are assessed via option-valuation techniques to determine whether a program should offload computations or carry them out locally in order to achieve higher run-time (and correspondingly battery-usage) efficiency. This ability makes the program well suited for operation in mobile environments. At their core, all these applications carry out or make use of (quasi) Monte Carlo simulations on dynamic Bayesian networks (DBNs). The DBN formalism and its associated simulation-based algorithms are of great value in the solution of problems with a large uncertainty component. This characteristic makes adaptation techniques like those studied here likely to gain relevance in a world where computers are endowed with perception capabilities and are expected to deal with an ever-increasing stream of sensor and time-series data.
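    As a toy illustration of the first mechanism (our example, not the thesis's Input-Adaptive Kalman filter), run-time code generation lets a program emit and compile a function only after the late-arriving parameters are known, baking them in as constants:

        def make_specialized_poly(coeffs):
            """Generate a polynomial evaluator specialized to the
            coefficients learned at run time."""
            terms = " + ".join(f"({c!r}) * x**{i}" for i, c in enumerate(coeffs))
            src = f"def step(x):\n    return {terms}\n"
            namespace = {}
            exec(compile(src, "<generated>", "exec"), namespace)
            return namespace["step"]

        step = make_specialized_poly([0.5, -1.25, 2.0])  # 0.5 - 1.25x + 2x^2
        print(step(3.0))  # 14.75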

    A systems biology approach to musculoskeletal tissue engineering: transcriptomic and proteomic analysis of cartilage and tendon cells

    Disorders of cartilage and tendon account for a high incidence of disability and are highly prevalent co-morbidities within the ageing population; musculoskeletal disorders therefore represent a major public health policy issue. Despite considerable efforts to characterise the biochemical and biomechanical cues that promote a stable differentiated cartilage or tendon phenotype in vitro, the benchmarks by which progress is measured are limited. Common regenerative interventions, such as autologous cartilage implantation, require a period of monolayer expansion that induces a loss of the functional phenotype, termed dedifferentiation. Dedifferentiation has no definitive mechanism yet is widely described in both regenerative and degenerative contexts; in addition to stem cell transplantation and cell-seeding in three-dimensional scaffolds, dedifferentiation represents the third approach to the development of regenerative mechanisms for mammalian tissue repair. Cartilage and tendon show a number of common features in structure, development, disease, and repair. The extracellular matrix is a dynamic and complex structure that confers the functional mechanical properties of cartilage and tendon. Dysregulation of its production and degradation is critical to the pathophysiology of musculoskeletal disorders; reparative interventions therefore require a stable, functional phenotype from the outset. Cartilage and tendon demonstrate a commonality in terms of function defining structure, both being sparsely cellular with a preponderance of collagenous matrix. Parity of functionality with the pre-injury state after healing is rarely achieved for cartilage and tendon. Cartilage and tendon also share common embryological origins. Common mesenchymal progenitor cells differentiate into many musculoskeletal tissues with diverse functions, and specialist sub-populations of tendon and cartilage progenitors enable the formation of transitional zones between these developing tissues. The development of musculoskeletal structures does not occur in isolation; however, cartilage and tendon have not previously been considered together in a systems context. An integrated understanding of the differentiation of these tissues should inform regenerative therapies and tissue engineering strategies. Systems biology is a paradigm shift in scientific thinking in which traditional reductionist strategies for complex biological problems are superseded by a holistic philosophy that seeks to understand the emergent behaviour of a system by the integrative and predictive modelling of all elements of that system. Whole transcriptome and proteome profiling studies are used to collect quantitative data about a system, which may then be exploited by systems biology methodologies, including the analysis of gene and protein networks. Gene-gene co-expression relationships, which are core regulatory mechanisms in biology, are often not part of a comprehensive gene expression analysis. Many biological networks are sparse and have a scale-free topology, which generally indicates that the majority of genes have very few connections whilst certain key regulators, or 'hubs', are highly interconnected. Co-expression networks may be used to define regulatory sub-networks and 'hubs' that have phenotypic associations.
    This approach allows all quantitative data to be used and makes no a priori assumptions about relationships in the system; it can therefore facilitate the exploration of emergent behaviour in the system and the generation of novel hypotheses. The ultimate goal of tissue engineering is the replacement of lost or damaged cells and, in vitro, the development of biomimetic (organotypic) structures to serve as experimental models. Tissues, and the strategies to functionally replicate them ex vivo, are complex and require an integrated, multi-disciplinary approach. Systems biology approaches, using data arising from multiple levels of the biological hierarchy, can facilitate the development of predictive models for bioengineered tissue. The iterative refinement, quantification, and perturbation of these models may expedite the translation of well-validated organotypic systems, through legal regulatory frameworks, into regenerative strategies for musculoskeletal disorders in humans. In this thesis the systems under consideration are the major cell populations of cartilage and tendon (chondrocytes and tenocytes, respectively). They are described in three environmental conditions: native tissue, monolayer (two-dimensional) culture, or three-dimensional models. There has been no systematic investigation of the global gene and protein profiles of cartilage and tendon in their native state relative to monolayer or three-dimensional cultures. There is no clear mechanistic description of the impact of in vitro environmental perturbations on the system, or indeed of the adequacy of these models as proxies for cartilage and tendon. A discovery approach using transcriptomic and proteomic profiling is undertaken to define a robust and consistent gene and protein profile for each condition. Differentially expressed elements are functionally annotated, and pathway topology approaches are employed to predict major signalling pathways associated with the observed phenotype. This study defines dedifferentiated chondrocytes and tenocytes in monolayer culture as expressing markers of musculoskeletal development, including scleraxis (Scx) and Mohawk (Mkx). Furthermore, there is reproducible convergence of synthetic profiles in monolayer culture between cartilage and tendon cells. Standard three-dimensional culture systems for chondrocytes and tenocytes fail to replicate the gene expression profile of cartilage and tendon. The PI-3K/Akt signalling pathway is predicted to be the predominant canonical pathway associated with de- and re-differentiation in vitro. Using novel, and publicly available, transcriptomic data sets, a meta-analysis of microarray gene expression profiles is performed using weighted gene co-expression network analysis. This is employed for transcriptome network decomposition to isolate highly correlated and interconnected gene sets (modules) from gene expression profiles of cartilage and tendon cells in different environmental conditions. Sub-networks strongly associated with de- and re-differentiation phenotypes are defined. A comparison of global transcriptome network architecture was performed to define the conservation of network modules between a model species (rat) and human data. In addition to the annotation of an osteoarthritis-associated module in the rat, a class-prediction analysis defined a minimal gene signature for the prediction of three-dimensional cultures from standard monolayer culture.
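    As a drastically simplified picture of the network step, weighted co-expression analysis soft-thresholds a gene-gene correlation matrix and ranks genes by total connection strength, the quantity by which hubs are defined. The numpy sketch below runs on synthetic data and is our simplification; the thesis applies the full WGCNA methodology, including module detection:

        import numpy as np

        def coexpression_hubs(expr, beta=6, top=5):
            """expr: genes x samples matrix; returns the indices of
            the most connected genes under soft thresholding."""
            cor = np.corrcoef(expr)            # gene-gene Pearson correlation
            adj = np.abs(cor) ** beta          # soft threshold (WGCNA-style)
            np.fill_diagonal(adj, 0.0)
            connectivity = adj.sum(axis=1)     # total connection strength
            return np.argsort(connectivity)[::-1][:top]

        rng = np.random.default_rng(0)
        expr = rng.normal(size=(100, 12))      # 100 genes, 12 samples (synthetic)
        print(coexpression_hubs(expr))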
    Finally, proteomic and transcriptomic data sets are integrated by defining common upstream regulators (TGFB and PDGF BB), and unified mechanistic networks are generated for de- and re-differentiation. The studies collected in this thesis contribute to a wider understanding of cartilage and tendon tissue engineering and organotypic culture development. A clear mechanistic understanding of the regulatory networks controlling differentiation of cartilage and tendon progenitor cells is required in order to develop improved in vitro models and bioengineered tissues that are physiologically relevant. The findings presented here provide practical outputs and testable hypotheses to drive future evidence-based research in organotypic culture development for musculoskeletal tissues.

    Numerical aerodynamic simulation facility feasibility study

    There were three major issues examined in the feasibility study. First, the ability of the proposed system architecture to support the anticipated workload was evaluated. Second, the throughput of the computational engine (the flow model processor) was studied using real application programs. Third, the availability, reliability, and maintainability of the system were modeled. The evaluations were based on the baseline systems. The results show that the implementation of the Numerical Aerodynamic Simulation Facility, in the form considered, would indeed be a feasible project with an acceptable level of risk. The technology required (both hardware and software) either already exists or, in the case of a few parts, is expected to be announced this year. Facets of the work described include the hardware configuration, software, user language, and fault tolerance.

    Ocean Remote Sensing with Synthetic Aperture Radar

    The ocean covers approximately 71% of the Earth's surface, constitutes 90% of the biosphere, and contains 97% of Earth's water. Synthetic aperture radar (SAR) can image the ocean surface in all weather conditions, day or night. SAR remote sensing for ocean and coastal monitoring has become a research hotspot in geoscience and remote sensing. This book, Progress in SAR Oceanography, provides an update on the current state of the science of ocean remote sensing with SAR. Overall, the book presents a variety of marine applications, such as oceanic surface and internal waves, wind, bathymetry, oil spills, coastline and intertidal zone classification, detection of ships and other man-made objects, as well as remotely sensed data assimilation. The book is aimed at a wide audience, ranging from graduate students, university teachers, and working scientists to policy makers and managers. Efforts have been made to highlight general principles as well as the state-of-the-art technologies in the field of SAR Oceanography.

    Separating computation and coordination in the design of parallel and distributed programs

    The remainder of this thesis is organized as follows. Chapters 2 and 3 introduce the specification formalisms that are used in this thesis. In Chapter 2 we present the computation language. We show that it facilitates the description of specifications that are not partial to a particular mode of execution. Furthermore, we present a semantics and a logic for reasoning about the correctness of programs. In Chapter 3 we present the coordination language. We define its semantics and show how it connects to the computation language. In Chapters 4 and 5 we develop a theory of refinement. This theory provides a number of proof techniques that enable us to incrementally refine the behavioural aspects of a program. These chapters form the most theoretical part of this thesis; it should be possible to get an understanding of the methods derived in them without going through all the proofs. In Chapter 7 we illustrate the method of design by considering some case studies. Comparisons with related work and conclusions are described in Chapters 8 and 9.
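    The separation the thesis develops can be caricatured in a few lines of Python (our illustration; the thesis defines dedicated computation and coordination languages with formal semantics). The computation is specified without commitment to a mode of execution, and the coordination layer alone decides how it runs:

        import multiprocessing as mp

        # Computation: a pure function, indifferent to how it is executed.
        def square(x):
            return x * x

        # Coordination: decides where and in what order computations run.
        def coordinate(inputs):
            with mp.Pool() as pool:              # could equally be sequential,
                return pool.map(square, inputs)  # threaded, or distributed

        if __name__ == "__main__":
            print(coordinate(range(5)))  # [0, 1, 4, 9, 16]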