Incremental Analysis of Programs
Algorithms used to determine the control and data flow properties of computer programs are generally designed for one-time analysis of an entire new input. Applying such algorithms when the input is only slightly modified results in an inefficient system. In this thesis, a set of incremental update algorithms is presented for data flow analysis. These algorithms update the solution from a previous analysis to reflect changes in the program; thus, extensive reanalysis of programs after each program modification can be avoided. The incremental update algorithms presented for global flow analysis are based on the Hecht/Ullman iterative algorithms. Banning's interprocedural data flow analysis algorithms form the basis for the incremental interprocedural algorithms.
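The worklist idea behind such incremental updates can be sketched as follows. This is a minimal illustration of an iterative forward may-analysis (reaching definitions) in the Hecht/Ullman style, not the thesis's algorithms; the CFG, definition names (d1–d3), and GEN/KILL sets are invented for the example:

```python
def solve(cfg, gen, kill, in_sets, out_sets, worklist):
    """Iterate to a fixpoint, revisiting only nodes reachable from the worklist."""
    preds = {n: [] for n in cfg}
    for n, succs in cfg.items():
        for s in succs:
            preds[s].append(n)
    wl = list(worklist)
    while wl:
        n = wl.pop()
        # Meet: union of predecessor OUT sets (may-analysis).
        in_sets[n] = set().union(*(out_sets[p] for p in preds[n])) if preds[n] else set()
        new_out = (in_sets[n] - kill[n]) | gen[n]
        if new_out != out_sets[n]:
            out_sets[n] = new_out
            wl.extend(cfg[n])  # only successors of a changed node are revisited
    return in_sets, out_sets

# Straight-line CFG: a -> b -> c
cfg  = {"a": ["b"], "b": ["c"], "c": []}
gen  = {"a": {"d1"}, "b": set(), "c": {"d3"}}
kill = {n: set() for n in cfg}
ins  = {n: set() for n in cfg}
outs = {n: set() for n in cfg}

# Full analysis: seed the worklist with every node.
solve(cfg, gen, kill, ins, outs, list(cfg))
assert ins["c"] == {"d1"}

# Small program edit: b now generates d2. Incremental update: seed only b;
# the previous solution is reused and only b's successors are reprocessed.
gen["b"] = {"d2"}
solve(cfg, gen, kill, ins, outs, ["b"])
assert ins["c"] == {"d1", "d2"}
```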
Cost-effectiveness of left ventricular assist devices (LVADs) for patients with advanced heart failure: analysis of the British NHS Bridge to Transplant (BTT) program
Background: A previous cost-effectiveness analysis showed that bridge to transplant (BTT) with early design left ventricular assist devices (LVADs) for advanced heart failure was more expensive than medical management while appearing less beneficial.
Older LVADs were pulsatile, but current second and third generation LVADs are continuous flow pumps. This study aimed to estimate comparative cost-effectiveness of BTT with durable implantable continuous flow LVADs compared to medical management in the British NHS.
Methods and results: A semi-Markov multi-state economic model was built using NHS cost data and patient data from the British NHS Blood and Transplant Database (BTDB). Quality-adjusted life years (QALYs) and incremental costs per QALY were calculated for patients receiving LVADs compared with those receiving inotrope-supported medical management. LVADs cost £80,569 ($84,963)/QALY (95% CI: £31,802–£94,853; $150,560) over a lifetime horizon. Estimates were sensitive to the choice of comparator population, the relative likelihood of receiving a heart transplant, the time to transplant, and LVAD costs. Reducing the device cost by 15% decreased the ICER to £50,106 ($79,533)/QALY.
Conclusions: Durable implantable continuous flow LVADs deliver greater benefits at higher costs than medical management in Britain. At the current UK threshold of £20,000 to £30,000/QALY, LVADs are not cost-effective, but the ICER now approaches that of an intervention for end-of-life care recently recommended by the British NHS. Cost-effectiveness estimates remain hampered by the lack of randomized trials.
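The incremental cost-effectiveness ratio (ICER) underlying these figures is simple arithmetic: the difference in cost divided by the difference in QALYs between the two strategies. A minimal sketch with purely illustrative numbers (not figures from the study):

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Illustrative only: an intervention costing £120,000 more while adding
# 1.5 QALYs yields an ICER of £80,000/QALY.
ratio = icer(cost_new=180_000, cost_old=60_000, qaly_new=3.0, qaly_old=1.5)
assert ratio == 80_000.0
```

An intervention is then judged against a willingness-to-pay threshold (here, the UK's £20,000–£30,000/QALY): it is deemed cost-effective only if its ICER falls below that threshold.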
Secondary Natural Gas Recovery: Reservoir Heterogeneity and Potential for Reserve Growth through Infield Drilling: An Example from McAllen Ranch Field, Hidalgo County, Texas
Integrated engineering, geological, geophysical, and petrophysical analyses of McAllen Ranch field have delineated several controls on secondary recovery of natural gas. Barriers to the flow of natural gas within laterally continuous lower Vicksburg sandstone reservoirs can be demonstrated through finite-element modeling. These barriers are probably diagenetic in origin. In the B area of McAllen Ranch field, faults are unlikely to be the primary barriers to gas flow because faults were not inferred from analysis of high-quality three-dimensional seismic images between the key wells used in this study (Hill and others, 1991). Barriers result in incremental reserve additions when some reservoir domains contain no well completions. Areas containing potential incremental gas resources, identified through this analysis, were confirmed by subsequent recompletions in 1991. Three recompletions proposed by this project have proved successful. Our analysis of public domain production data indicates that new infield wells in the Vicksburg S reservoir have increased reserves 69 percent above an estimate made from analysis of 1980 public domain data. Additionally, more than 100 barrels per day of reserves have been added through new wells drilled between 1988 and 1991. Most of the McAllen Ranch Vicksburg S reserve increases are due to a geological reinterpretation that has stimulated infield step-out development of the Vicksburg S reservoir. Distributary-channel-fill sandstones are the most likely candidates to contain incremental reserves because they are laterally discontinuous and are predominant in areas where numerous reservoir sandstones are stacked.
Bureau of Economic Geology
Data Flow Analysis and the Linear Programming Model
* This research was supported in part by INTAS project 04-77-7173, http://www.intas.be

A general discussion of data flow algorithmic models is presented, together with the linear programming problem in which the criterion (objective) function coefficients vary with the data flow. The general problem is widely known under different names: data streams, incremental and online algorithms, etc. The better studied algorithmic models include mathematical statistics and clustering, histograms and wavelets, sorting, set cover, and others. The linear programming model is an addition to this list. A large body of theory exists for it, in the form of the simplex algorithm and interior point methods, but flow analysis requires a different interpretation of optimal plans and of plan transitions as coefficients vary. An approximate model is devised that predicts the boundary stability point for the current optimal plan. This is valuable preparatory information for applications, especially when a parallel computational facility is assumed.
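The "boundary stability point" idea — how far an objective coefficient can move before the current optimal plan stops being optimal — can be illustrated on a toy LP whose feasible vertices are enumerated explicitly. This is a simplified sketch, not the paper's approximate model; the polytope and coefficients are invented:

```python
def stability_interval(c1, c2, vertices):
    """Exact interval of c1 (with c2 fixed) over which the current optimal
    vertex of max c1*x + c2*y over the listed vertices stays optimal."""
    vstar = max(vertices, key=lambda v: c1 * v[0] + c2 * v[1])
    lo, hi = float("-inf"), float("inf")
    for v in vertices:
        dx, dy = v[0] - vstar[0], v[1] - vstar[1]
        # vstar beats v iff c1*dx + c2*dy <= 0; solve for c1 where dx != 0.
        if dx > 0:
            hi = min(hi, -c2 * dy / dx)
        elif dx < 0:
            lo = max(lo, -c2 * dy / dx)
    return lo, hi

# Toy feasible polytope: corners of {x,y >= 0, x + 2y <= 8, 3x + 2y <= 12}.
corners = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0), (0.0, 4.0)]

# At c = (1, 1) the optimum is (2, 3); it stays optimal while c1 is in [0.5, 1.5].
lo, hi = stability_interval(1.0, 1.0, corners)
assert (lo, hi) == (0.5, 1.5)
```

Past either boundary the optimal plan jumps to an adjacent vertex, which is the transition a flow-analysis setting must detect as coefficients drift.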
Incremental Static Analysis of Probabilistic Programs
Probabilistic models are used successfully in a wide range of fields, including machine
learning, data mining, pattern recognition, and robotics. Probabilistic programming
languages are designed to express probabilistic models in high-level programming
languages and to conduct automatic inference to compute posterior distributions.
A key obstacle to the wider adoption of probabilistic programming languages
in practice is that general-purpose efficient inference is computationally difficult.
This thesis aims to improve the efficiency of inference through incremental analysis,
while preserving precision when a probabilistic program undergoes small changes.
For small changes to probabilistic knowledge (i.e., prior probability distributions
and observations), the probabilistic model represented by a probabilistic
program evolves. In this thesis, we first present a new approach, Icpp, which
is a data-flow-based incremental inference approach. By capturing the probabilistic
dependence of each data-flow fact and updating changed probabilities
sparsely, Icpp can incrementally compute new posterior distributions and
thus enable previously computed results to be reused.
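The sparse-update idea can be sketched as a dependency graph of probabilistic facts, where a changed prior triggers recomputation only along its transitive dependents. This is an illustrative toy, not the Icpp implementation; the model, fact names, and combination rules are invented:

```python
from collections import deque

# fact -> (parent facts, rule combining parent probabilities)
model = {
    "rain":      ([], lambda: 0.3),
    "sprinkler": ([], lambda: 0.5),
    "wet":       (["rain", "sprinkler"], lambda r, s: r + s - r * s),  # noisy-or
    "slippery":  (["wet"], lambda w: 0.9 * w),
}

cache = {}  # fact -> last computed probability

def recompute(changed):
    """Recompute the given facts and, sparsely, their transitive dependents."""
    children = {f: [] for f in model}
    for f, (parents, _) in model.items():
        for p in parents:
            children[p].append(f)
    queue = deque(changed)
    while queue:
        f = queue.popleft()
        parents, rule = model[f]
        new = rule(*(cache[p] for p in parents))
        if cache.get(f) != new:
            cache[f] = new
            queue.extend(children[f])  # propagate only actual changes

# Initial pass: seed all facts in topological order.
recompute(["rain", "sprinkler", "wet", "slippery"])
assert abs(cache["slippery"] - 0.585) < 1e-9

# Small change to probabilistic knowledge: the prior on "rain" moves to 0.6.
# Only "rain", "wet", and "slippery" are touched; "sprinkler" is reused.
model["rain"] = ([], lambda: 0.6)
recompute(["rain"])
assert abs(cache["slippery"] - 0.72) < 1e-9
```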
For small changes to the observed array data upon which probabilistic models
are conditioned, the models themselves remain unchanged. In this thesis, we
also present ISymb, a novel incremental symbolic inference framework.
By conducting an intra-procedural path-sensitive analysis, except for a "meets-over-all-paths" analysis within an iteration of a loop (conditioned on some
observed array data), ISymb captures the probability distribution for each
path and only recomputes the probability distributions for the affected paths.
Further, ISymb enables a precision-preserving incremental symbolic inference
to run significantly faster than its non-incremental counterparts.
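The path-level caching can be sketched as follows. This is an illustrative toy, not ISymb itself; the path identifiers, per-path probabilities, and variable-tracking scheme are all invented:

```python
recomputed = []  # log of which paths were (re)evaluated

def make_path(pid, prob):
    """Build a per-path probability evaluator that records when it runs."""
    def evaluate():
        recomputed.append(pid)
        return prob
    return evaluate

path_cache = {}  # path id -> cached probability

def infer(paths, changed_vars):
    """Re-evaluate only paths whose condition mentions a changed variable;
    all other per-path probabilities come from the cache."""
    for pid, (vars_used, evaluate) in paths.items():
        if pid not in path_cache or vars_used & changed_vars:
            path_cache[pid] = evaluate()
    return sum(path_cache.values())

paths = {
    "then_branch": ({"obs"}, make_path("then_branch", 0.2)),
    "else_branch": ({"x"},   make_path("else_branch", 0.5)),
}
total = infer(paths, set())  # first run evaluates both paths
assert abs(total - 0.7) < 1e-9

# The observation changes, affecting only the "then" path's probability.
paths["then_branch"] = ({"obs"}, make_path("then_branch", 0.4))
total = infer(paths, {"obs"})  # only the affected path reruns
assert abs(total - 0.9) < 1e-9
assert recomputed == ["then_branch", "else_branch", "then_branch"]
```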
In this thesis, we evaluate both Icpp and ISymb against the state-of-the-art
data-flow-based inference and symbolic inference, respectively. The results demonstrate
that both Icpp and ISymb meet their design goals. For example, Icpp
succeeds in making data-flow-based incremental inference possible in probabilistic
programs when some probabilistic knowledge undergoes small yet frequent changes.
Additionally, ISymb enables symbolic inference to perform one to two orders of
magnitude faster than non-incremental inference when some observed array data undergo small changes.
Time-Lapse Monitoring of Two-Dimensional Non-Uniform Unsaturated Flow Processes Using Ground Penetrating Radar
Unsaturated flow in the vadose zone often manifests as preferential flow, transporting water and solutes through the soil much faster than uniform matrix flow would. Time-lapse ground penetrating radar (TLGPR) shows promise as a non-invasive means to monitor unsaturated flow; here it is used to monitor lab-scale forced infiltration events, capturing evidence of non-uniform and preferential flow phenomena directly from arrivals in the GPR images while simultaneously characterizing parameters of the flow system, such as bulk water content and rates of wetting front movement. This was accomplished by (1) directly interpreting transient arrivals in GPR profiles for evidence of non-uniform flow and (2) using migration processing techniques to improve the quality of GPR images for identification and tracking of transient arrivals related to wetting in the soil. A novel method for characterizing the 2D velocity structure of a soil is described, evaluated, and used to migrate the GPR images. This method combines multi-offset measurements, which characterize the depth to a potentially unknown static reflector and the root mean square (RMS) velocity above it, with incremental changes in travel time to the static reflector and to a transient reflector (i.e., the wetting front) determined from constant-offset profiles, to determine incremental changes in velocity above and below the transient arrival. The method is applied to TLGPR data from infiltration experiments in a 60 cm deep sand-filled tank monitored with water content probes. To verify the approach, the methodology is also applied to GPR data simulated using transient water contents generated by the unsaturated flow simulator HYDRUS-2D, given lab-measured hydraulic properties of the soil.
For both the empirical and simulated data, we found that the 2D velocity analysis was effective in monitoring changes in the wetting front and that migration of the reflection profiles improved the interpretation of non-uniform flow.
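The layered-velocity step resembles the classic Dix conversion from RMS velocities and two-way travel times to an interval velocity between two reflectors. The abstract does not name Dix explicitly, so the following is a sketch under that assumption, with illustrative GPR-scale values:

```python
import math

def dix_interval_velocity(v_rms1, t1, v_rms2, t2):
    """Interval velocity between two reflectors from RMS velocities and
    two-way travel times (Dix, 1955): v_int^2 = (v2^2*t2 - v1^2*t1)/(t2 - t1)."""
    return math.sqrt((v_rms2**2 * t2 - v_rms1**2 * t1) / (t2 - t1))

# Illustrative values only (velocity in m/ns, two-way time in ns): a wetting
# front slows the deeper interval, consistent with rising water content.
v_int = dix_interval_velocity(v_rms1=0.12, t1=5.0, v_rms2=0.10, t2=12.0)
assert v_int < 0.10  # interval below the shallow reflector is slower
```

In a GPR setting, a drop in the interval velocity below the transient arrival maps directly to an increase in bulk water content behind the wetting front.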
MEMORANDUM: Calibration of Great Bay Estuary Hydrodynamic Model and Incremental Nitrogen Estimation
This technical memorandum summarizes the completion of the calibration of a hydrodynamic model of the Great Bay Estuary System (GBES), originally started as part of the Squamscott River modeling study. The Squamscott River modeling study was discontinued when it was realized that excessive levels of algae in the Exeter wastewater lagoon discharge had a significant effect on Squamscott River water quality. Because Exeter plans to upgrade its wastewater treatment system and eliminate the excessive algal levels in its effluent discharge, it was decided not to develop a hydrodynamic water quality model from Squamscott River water quality data that is so atypical of, and different from, the river water quality expected after the Exeter wastewater treatment system upgrade. However, it was recognized that completing the hydrodynamic model of the GBES would provide a useful tool for the cities of Dover, Rochester, and Portsmouth to relate present and future wastewater effluent nitrogen discharges to increases in GBES nitrogen levels. The following is a brief description of the hydrodynamic model framework and of the calibration analysis against salinity, temperature, and tidal elevation measurements at various locations throughout the GBES. Later sections of this document summarize the application of the calibrated GBES hydrodynamic model in computing incremental nitrogen levels in the Estuary under multiple effluent nitrogen scenarios.