Pipelined genetic propagation
© 2015 IEEE. Genetic Algorithms (GAs) are a class of numerical and combinatorial optimisers which are especially useful for solving complex non-linear and non-convex problems. However, the required execution time often limits their application to small-scale or latency-insensitive problems, so techniques to increase the computational efficiency of GAs are needed. FPGA-based acceleration has significant potential for speeding up genetic algorithms, but existing FPGA GAs are limited by the generational approaches inherited from software GAs. Many parts of the generational approach do not map well to hardware, such as the large shared population memory and the intrinsic loop-carried dependency. To address this problem, this paper proposes a new hardware-oriented approach to GAs, called Pipelined Genetic Propagation (PGP), which is intrinsically distributed and pipelined. PGP represents a GA solver as a graph of loosely coupled genetic operators, which allows the solution to be scaled to the available resources, and also to dynamically change topology at run-time to explore different solution strategies. Experiments show that pipelined genetic propagation is effective in solving seven different applications. Our PGP design is 5 times faster than a recent FPGA-based GA system, and 90 times faster than a CPU-based GA system.
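The abstract's core idea, a GA expressed as a graph of loosely coupled operator stages rather than a generational loop, can be illustrated with a minimal software sketch. Everything here (the OneMax toy fitness, batch sizes, operator details) is an invented illustration of the pipelining idea, not the paper's hardware design.

```python
import random

GENOME_LEN = 16

def fitness(ind):
    return sum(ind)  # OneMax toy problem: count of 1-bits

def select(batch):
    # Keep the better half of the batch (a selection stage).
    ranked = sorted(batch, key=fitness, reverse=True)
    return ranked[: max(1, (len(ranked) + 1) // 2)]

def crossover(batch):
    # Append a one-point-crossover child for each adjacent pair.
    out = list(batch)
    for a, b in zip(batch, batch[1:]):
        cut = random.randrange(1, GENOME_LEN)
        out.append(a[:cut] + b[cut:])
    return out

def mutate(batch):
    # Flip one random bit of an individual with probability 0.5.
    out = []
    for ind in batch:
        ind = list(ind)
        if random.random() < 0.5:
            i = random.randrange(GENOME_LEN)
            ind[i] ^= 1
        out.append(ind)
    return out

def run_pgp(steps=200, seed=1):
    """Propagate a batch around a ring of operator stages; return best fitness seen."""
    random.seed(seed)
    batch = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
             for _ in range(8)]
    best = max(fitness(i) for i in batch)
    pipeline = [select, crossover, mutate]  # the operator graph, here a simple ring
    for step in range(steps):
        batch = pipeline[step % len(pipeline)](batch)
        best = max(best, max(fitness(i) for i in batch))
    return best

print(run_pgp())
```

In hardware each stage would run concurrently on its own batch, which is what removes the generational loop-carried dependency; the sequential loop above only emulates the dataflow.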
A domain specific approach to high performance heterogeneous computing
Users of heterogeneous computing systems face two problems: first, understanding the trade-off relationships between the observable characteristics of their applications, such as latency and quality of the result; and second, exploiting knowledge of these characteristics to allocate work to distributed computing platforms efficiently. A domain-specific approach addresses both of these problems. By considering a subset of operations or functions, models of the observable characteristics, or domain metrics, may be formulated in advance and populated at run-time for task instances. These metric models can then be used to express the allocation of work as a constrained integer program. These claims are illustrated using the domain of derivatives pricing in computational finance, with the domain metrics of workload latency and pricing accuracy. For a large, varied workload of 128 Black-Scholes and Heston model-based option pricing tasks, running upon a diverse array of 16 multicore CPU, GPU and FPGA platforms, predictions made by models of both the makespan and accuracy are generally within 10 percent of the run-time performance. When these models are used as inputs to machine-learning and MILP-based workload allocation approaches, latency improvements of up to 24 and 270 times over a heuristic approach are seen.
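The allocation step described above, metric models feeding a constrained integer program, can be sketched at toy scale. The task names, platform set, and all latency/accuracy numbers below are invented; the paper solves this with MILP at scale, whereas exhaustive search suffices for three tasks.

```python
from itertools import product

PLATFORMS = ["cpu", "fpga"]
TASKS = ["price_bs", "price_hst", "price_bar"]

# Metric-model predictions: (latency_ms, accuracy) per task per platform.
METRICS = {
    ("price_bs",  "cpu"):  (40.0, 0.999),
    ("price_bs",  "fpga"): (10.0, 0.990),
    ("price_hst", "cpu"):  (90.0, 0.999),
    ("price_hst", "fpga"): (25.0, 0.985),
    ("price_bar", "cpu"):  (60.0, 0.999),
    ("price_bar", "fpga"): (15.0, 0.980),
}

def makespan(assignment):
    # Tasks on the same platform run sequentially; platforms run in parallel.
    per_platform = {p: 0.0 for p in PLATFORMS}
    for task, plat in zip(TASKS, assignment):
        per_platform[plat] += METRICS[(task, plat)][0]
    return max(per_platform.values())

def min_accuracy(assignment):
    return min(METRICS[(t, p)][1] for t, p in zip(TASKS, assignment))

def allocate(acc_floor):
    """Minimise makespan subject to an accuracy floor (brute-force stand-in for MILP)."""
    feasible = [a for a in product(PLATFORMS, repeat=len(TASKS))
                if min_accuracy(a) >= acc_floor]
    return min(feasible, key=makespan) if feasible else None

best = allocate(acc_floor=0.985)
print(best, makespan(best))  # → ('fpga', 'fpga', 'cpu') 60.0
```

The accuracy floor forces the least-accurate task onto the CPU while the rest exploit the faster FPGA, which is exactly the trade-off the metric models are meant to expose.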
Programming Model to Develop Supercomputer Combinatorial Solvers
© 2017 IEEE. Novel architectures for massively parallel machines offer better scalability and the prospect of achieving linear speedup for sizable problems in many domains. The development of suitable programming models and accompanying software tools for these architectures remains one of the biggest challenges towards exploiting their full potential. We present a multi-layer software abstraction model to develop combinatorial solvers on massively-parallel machines with regular topologies. The model enables different challenges in the design and optimization of combinatorial solvers to be tackled independently (separation of concerns) while permitting problem-specific tuning and cross-layer optimization. Specifically, the model decouples the issues of inter-node communication, node-level scheduling, problem mapping, mesh-level load balancing and expressing problem logic. We present an implementation of the model and use it to profile a Boolean satisfiability solver on simulated massively-parallel machines with different scales and topologies.
Pattern scaling using ClimGen: monthly-resolution future climate scenarios including changes in the variability of precipitation
Development, testing and example applications of the pattern-scaling approach for generating future climate change projections are reported here, with a focus on a particular software application called “ClimGen”. A number of innovations have been implemented, including using exponential and logistic functions of global-mean temperature to represent changes in local precipitation and cloud cover, and interpolation from climate model grids to a finer grid while taking into account land-sea contrasts in the climate change patterns. Of particular significance is a new approach for incorporating changes in the inter-annual variability of monthly precipitation simulated by climate models. This is achieved by diagnosing simulated changes in the shape of the gamma distribution of monthly precipitation totals, applying the pattern-scaling approach to estimate changes in the shape parameter under a future scenario, and then perturbing sequences of observed precipitation anomalies so that their distribution changes according to the projected change in the shape parameter. The approach cannot represent changes to the structure of climate time series (e.g. changed autocorrelation or teleconnection patterns) were they to occur, but is shown here to be more successful at representing changes in low precipitation extremes than previous pattern-scaling methods.
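The variability step described above can be illustrated with a simplified sketch: monthly precipitation is treated as gamma-distributed with shape k and scale θ, a pattern-scaled change in k is applied, and θ is re-chosen to preserve the mean so that only inter-annual variability shifts. The linear response per degree of global warming and all numbers below are illustrative assumptions, not ClimGen's fitted functions.

```python
import math

def scaled_shape(k_base, dk_per_degC, delta_T_global):
    """Pattern-scale the gamma shape parameter (floored to stay a valid shape)."""
    return max(0.05, k_base + dk_per_degC * delta_T_global)

def gamma_stats(mean_mm, k):
    """Mean-preserving gamma: pick theta so the mean is unchanged, report spread."""
    theta = mean_mm / k            # mean of a gamma is k * theta
    variance = k * theta ** 2      # variance of a gamma is k * theta^2
    cv = 1.0 / math.sqrt(k)        # coefficient of variation depends only on k
    return variance, cv

k0 = 2.0                                   # baseline shape fitted to observed totals
k_future = scaled_shape(k0, dk_per_degC=-0.3, delta_T_global=2.0)
var0, cv0 = gamma_stats(60.0, k0)          # 60 mm mean monthly total (illustrative)
var1, cv1 = gamma_stats(60.0, k_future)
print(k_future, cv0, cv1)                  # shape drops, so variability (CV) rises
```

Because the coefficient of variation of a gamma distribution is 1/√k, a projected decrease in the shape parameter translates directly into larger inter-annual variability even when the mean is held fixed, which is what makes this step useful for low precipitation extremes.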
Add-on LABA in a separate inhaler as asthma step-up therapy versus increased dose of ICS or ICS/LABA combination inhaler.
Asthma management guidelines recommend adding a long-acting β2-agonist (LABA) or increasing the dose of inhaled corticosteroid (ICS) as step-up therapy for patients with uncontrolled asthma on ICS monotherapy. However, it is uncertain which option works best, which ICS particle size is most effective, and whether LABA should be administered by separate or combination inhalers. This historical, matched cohort study compared asthma-related outcomes for patients (aged 12-80 years) prescribed step-up therapy as a ≥50% extrafine ICS dose increase or add-on LABA, via either a separate inhaler or a fine-particle ICS/LABA fixed-dose combination (FDC) inhaler. Risk-domain asthma control was the primary end-point in comparisons of cohorts matched for asthma severity and control during the baseline year. After 1:2 cohort matching, the increased extrafine ICS versus separate ICS+LABA cohorts included 3232 and 6464 patients, respectively, and the fine-particle ICS/LABA FDC versus separate ICS+LABA cohorts included 7529 and 15 058 patients, respectively (overall mean age 42 years; 61-62% females). Over one outcome year, adjusted ORs (95% CI) for achieving asthma control were 1.25 (1.13-1.38) for increased ICS versus separate ICS+LABA and 1.06 (1.05-1.09) for ICS/LABA FDC versus separate ICS+LABA. For patients with asthma, an increased dose of extrafine-particle ICS, or add-on LABA via an ICS/LABA combination inhaler, is associated with significantly better outcomes than ICS+LABA via separate inhalers.
Research in Real-Life Ltd; Teva Pharmaceutical Industries
Cost-effectiveness of HBV and HCV screening strategies: a systematic review of existing modelling techniques
Introduction:
Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches.
Methods:
A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions.
Results:
The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix, evaluating model components and methodology.
Conclusion:
When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimes and test methods, and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g. HCV and HIV at the same time) might prove important for decision makers.
Formalization of Transform Methods using HOL Light
Transform methods, like Laplace and Fourier, are frequently used for analyzing the dynamical behaviour of engineering and physical systems, based on their transfer function and frequency response, or the solutions of their corresponding differential equations. In this paper, we present an ongoing project which focuses on the higher-order-logic formalization of transform methods using the HOL Light theorem prover. In particular, we present the motivation for the formalization, followed by the related work. Next, we present the tasks completed so far while highlighting some of the challenges
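For reference, the two transforms named in this abstract have the standard integral definitions (the formalization targets these within higher-order logic):

```latex
\mathcal{L}[f](s) = \int_{0}^{\infty} f(t)\, e^{-st}\, dt, \qquad
\mathcal{F}[f](\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt
```

The transfer function of a linear system is obtained by applying the Laplace transform to its governing differential equation under zero initial conditions, and the frequency response follows by evaluating it at $s = i\omega$.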