6 research outputs found

    Development of computer model and expert system for pneumatic fracturing of geologic formations

    The objective of this study was the development of a new computer program called PF-Model to analyze pneumatic fracturing of geologic formations. Pneumatic fracturing is an in situ remediation process that involves injecting high-pressure gas into soil or rock matrices to enhance permeability, as well as to introduce liquid and solid amendments. PF-Model has two principal components: (1) Site Screening, which heuristically evaluates sites with regard to process applicability; and (2) System Design, which uses the numerical solution of a coupled algorithm to generate preliminary design parameters. Designed as an expert system, the Site Screening component is a high-performance computer program capable of simulating human expertise within a narrow domain. The reasoning process is controlled by the inference engine, which uses subjective probability theory (based on Bayes' theorem) to handle uncertainty. The expert system also contains an extensive knowledge base of geotechnical data related to field performance of pneumatic fracturing. The hierarchical order of importance established for the geotechnical properties was formation type, depth, consistency/relative density, plasticity, fracture frequency, weathering, and depth of water table. The expert system was validated by a panel of five experts who rated selected sites on the applicability of the three main variants of pneumatic fracturing. Overall, PF-Model demonstrated better than 80% agreement with the expert panel.

    The System Design component was programmed with structured algorithms to accomplish two main functions: (1) to estimate fracture aperture and radius (Fracture Prediction Mode); and (2) to calibrate post-fracture Young's modulus and pneumatic conductivity (Calibration Mode). The Fracture Prediction Mode uses numerical analysis to converge on a solution by considering the three coupled physical processes that affect fracture propagation: pressure distribution, leakoff, and deflection. The Calibration Mode regresses modulus using a modified deflection equation, and then converges on the conductivity in a method similar to the Fracture Prediction Mode. The System Design component was validated and calibrated for each of the 14 different geologic formation types supported by the program. Validation was done by comparing the results of PF-Model to the original mathematical model. For the calibration process, default values for flow rate, density, Poisson's ratio, modulus, and pneumatic conductivity were established by regression until the model simulated, in general, actual site behavior. PF-Model was programmed in Visual Basic 5.0 and features a menu-driven GUI. Three extensive default libraries are provided: probabilistic knowledge base, flownet shape factors, and geotechnical defaults. Users can conveniently access and modify the default libraries to reflect evolving trends and knowledge. Recommendations for future study are included in the work.
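    The Site Screening inference engine described above handles uncertainty with subjective probability based on Bayes' theorem. The following is a minimal Python sketch of that style of sequential Bayesian rule updating under an assumption of conditionally independent evidence; the prior, likelihoods, and evidence items are hypothetical illustrations and are not taken from PF-Model's knowledge base.

    # Sketch of Bayesian (subjective-probability) updating of the kind described
    # for the Site Screening inference engine. All probabilities and evidence
    # below are hypothetical illustrations, not values from PF-Model.

    def bayes_update(prior, p_e_given_h, p_e_given_not_h):
        """Return P(H | E) from a prior P(H) and the two likelihoods."""
        numerator = p_e_given_h * prior
        denominator = numerator + p_e_given_not_h * (1.0 - prior)
        return numerator / denominator

    # Hypothesis H: the site is amenable to pneumatic fracturing.
    posterior = 0.50  # neutral prior before any geotechnical evidence

    # Hypothetical evidence items, ordered by the importance hierarchy noted in
    # the abstract (formation type first, then depth, consistency, ...).
    # Each tuple is (description, P(E | H), P(E | not H)).
    evidence = [
        ("formation type = siltstone",  0.80, 0.30),
        ("depth = 15 ft",               0.70, 0.40),
        ("consistency = medium stiff",  0.65, 0.45),
    ]

    for name, p_e_h, p_e_not_h in evidence:
        posterior = bayes_update(posterior, p_e_h, p_e_not_h)
        print(f"after '{name}': P(applicable) = {posterior:.3f}")

    The conditional-independence assumption keeps the update rule simple; a fuller treatment would follow whatever dependency structure the actual knowledge base encodes.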

    The Geometric Mean as a Generator of Truth-Value in Heuristic Expert Systems: An Improvement over the Fuzzy Weighted Arithmetic Mean

    Many earlier expert systems that were modeled after MYCIN, the first expert system, employed truth-value factors for their rule antecedents (premises) and consequents (conclusions). These crisp truth-value factors were usually called certainty factors and attempted to provide a measure of confidence and computational capability to the analysis of rule uncertainty (Shortliffe, 1977; Kandel, 1994). However, criticism has often been expressed in the literature concerning the lack of precision a crisp truth/certainty factor value conveys (Zadeh, 1983; Turban, 1993). Zadeh (1973) and Xingui (1988) utilized the weighted fuzzy average algorithm to improve the precision of truth/certainty factor values. Kandel (1994) further extended the fuzzy weighted mean concept by introducing rule confidence, priority, and conclusion weighting factors. Later, Chen (1996) further modified the fuzzy weighted mean algorithm through the factoring of independent rule premise and consequent weights, truth-values, and certainty factors. All of these progressive variants of the fuzzy weighted mean enhanced perceived rule antecedent and consequent truth-value. This research investigated a modification of the fuzzy weighted mean algorithms of Chen and Kandel used to assess heuristic expert system rule truth-value. Their algorithms were modified to demonstrate that a more statistically precise rule truth-value can be achieved by utilizing the geometric mean to aggregate rule truth-value components.
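    To make the contrast concrete, the Python sketch below aggregates the same hypothetical rule truth-value components with a fuzzy weighted arithmetic mean and with a weighted geometric mean; the component values and weights are invented for illustration, and the full Chen/Kandel weighting factors (rule confidence, priority, and conclusion weights) are not reproduced.

    # Contrast of the fuzzy weighted arithmetic mean with a weighted geometric
    # mean for aggregating rule truth-value components. Values and weights are
    # hypothetical illustrations.
    from math import prod

    def weighted_arithmetic_mean(values, weights):
        return sum(w * v for v, w in zip(values, weights)) / sum(weights)

    def weighted_geometric_mean(values, weights):
        total_w = sum(weights)
        return prod(v ** (w / total_w) for v, w in zip(values, weights))

    # Truth-values of three rule antecedent components and their weights.
    truth_values = [0.9, 0.7, 0.4]
    weights      = [2.0, 1.0, 1.0]

    print(weighted_arithmetic_mean(truth_values, weights))  # ~0.725
    print(weighted_geometric_mean(truth_values, weights))   # ~0.690

    Because the geometric mean penalizes weak components more heavily than the arithmetic mean, the aggregate truth-value drops further when any single premise is poorly satisfied, which is the behavior the research examines as an improvement.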

    Informed Clinical Management of Acute Stroke: Use of Established Statistical Methods and Development of an Expert System

    This thesis applies several statistical techniques which aim to provide informed clinical management in acute stroke. An introduction is given to issues arising in stroke management and expert systems methodology. Three linear discriminant scoring systems (the Allen, Siriraj and Besson scores) intended for the differential diagnosis between ischaemic and haemorrhagic stroke on the basis of clinical presentation are evaluated in chapter 2. Chapter 3 explores whether angiotensin converting enzyme DD genotype is a risk factor for acute stroke or influences stroke outcome as measured by lesion size. Chapters 4 and 5 assess computed tomography, mean cerebral transit time and single-photon emission computed tomography scanning in terms of their accuracy in predicting functional outcome after acute ischaemic stroke. Chapter 6 broadens the search for prognostic factors, looking at the performance of the Guy's prognostic score and established neurological scales (Canadian neurological scale, National Institutes of Health stroke scale, middle cerebral artery neurological scale) in predicting acute stroke outcome. A linear discriminant score, based on simple clinical measurements recorded in the acute stroke unit, is also developed. Chapter 7 looks specifically at the influence of plasma glucose level on survival following acute stroke, after adjusting for other known prognostic factors using Cox's proportional hazards regression model.

    The remainder of the thesis is concerned with two aspects of acute stroke management. The first of these is the selection of an appropriate clinical trial for an individual patient. A computer program is developed to obtain, in an efficient manner, the information required to check the entry and exclusion criteria for each available clinical trial. The second aspect of stroke management considered is the choice of a suitable method for secondary prevention of stroke in individual acute ischaemic stroke patients. Candidate methods are long-term anticoagulation with warfarin, or aspirin antiplatelet therapy. Expert system methodology is used to combine positive indications for, and contraindications to, each of these therapies with clinical data available in the acute stroke unit. The annual risks of recurrent ischaemic stroke, haemorrhagic stroke, myocardial infarction, other ischaemic complications and other haemorrhagic complications are estimated to allow an informed decision on the appropriate method of secondary prevention to be made.
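    As an illustration of the trial-selection step described above, the Python sketch below checks a patient record against entry and exclusion criteria for each available trial; the trial names and criteria are hypothetical and do not reproduce those used in the thesis.

    # Sketch of entry/exclusion checking for trial selection. Trial names and
    # criteria are hypothetical illustrations only.
    from dataclasses import dataclass, field
    from typing import Callable, List

    Criterion = Callable[[dict], bool]

    @dataclass
    class Trial:
        name: str
        entry: List[Criterion] = field(default_factory=list)
        exclusion: List[Criterion] = field(default_factory=list)

        def eligible(self, patient: dict) -> bool:
            # Eligible if every entry criterion holds and no exclusion criterion does.
            return (all(c(patient) for c in self.entry)
                    and not any(c(patient) for c in self.exclusion))

    trials = [
        Trial(name="Hypothetical thrombolysis trial",
              entry=[lambda p: p["hours_since_onset"] <= 6,
                     lambda p: p["stroke_type"] == "ischaemic"],
              exclusion=[lambda p: p["on_anticoagulants"]]),
        Trial(name="Hypothetical neuroprotection trial",
              entry=[lambda p: p["hours_since_onset"] <= 24],
              exclusion=[lambda p: p["age"] > 85]),
    ]

    patient = {"hours_since_onset": 4, "stroke_type": "ischaemic",
               "on_anticoagulants": False, "age": 72}

    for trial in trials:
        print(trial.name, "->", "eligible" if trial.eligible(patient) else "not eligible")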