    Married Coal Gasification and Biomass Pyrolysis

    This project proposes marrying a coal gasification process to a biomass pyrolysis process. Coal is pyrolyzed to produce syngas and a large amount of heat. The syngas is treated and fed to a Fischer-Tropsch process, while the excess heat produces steam that is used to pyrolyze biomass. The biomass yields char, gas, and vapor; the char and gas are recovered, and the vapor is condensed to produce bio-oil. The proposed plant has a capacity of 1100 tons of biomass (550 dry tons) per day, and I assumed an operating factor of 0.9. The plant could be operational within five years, with approximately three years of construction and a year operating at lowered capacity. However, significant research is needed to confirm and optimize certain aspects of the process.

    The process has some sections at high temperatures and moderately high pressures. Certain units will be constructed of special materials to withstand the temperatures, and safety features will be installed to prevent temperatures from rising beyond normal operating levels. A strongly basic solution is also used to strip CO2 from process gases; these units will be constructed of stainless steel for structural integrity. Finally, highly combustible products are used in the process, so safe storage of these materials and strict fire safety provisions will be established to minimize the risks of fire or explosion. Safety considerations affected the capital cost of the plant because of the special materials required for safe construction.

    The process raises few environmental concerns. The gases produced by gasification and pyrolysis are collected as a byproduct to be used in a Fischer-Tropsch process. The disposal of liquid and solid waste is more of a concern: solid slag can be disposed of in a landfill, and some liquid waste contains slag and ash and can be filtered, but other liquid waste is contaminated with bio-oil, which is more expensive to remove. I assumed that this waste stream would need secondary treatment, which is fairly expensive and is a large contribution to the overall costs.

    The total capital cost of the plant is around $30 million. For a chemical plant, this is not very large, and the capital cost has a very minor effect on the production costs. To run the equipment, the plant will employ 45 process engineers, with 5 shifts of 9 operators. Bio-oil can be produced for $1.77 per kilogram. Estimates for bio-oil produced from an unmarried process are hard to find and have not been scaled to a current value; they range from $0.09 to $0.50 per kilogram. The large disparity between my price and the literature values is mostly due to the high cost of disposing of water contaminated with bio-oil.

    I recommend proceeding with research on ways to lower the operating costs. Specifically, if water contaminated with bio-oil can be disposed of with primary treatment rather than secondary treatment, the cost of production is lowered by $1.47 per kilogram, to $0.30 per kilogram. Lowering waste treatment costs would make the plant highly competitive with unmarried designs. Further research should also be conducted into other process parameters, such as the effectiveness of NaOH stripping, a novel technique I used to lower treatment costs.
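    The headline economics above are simple enough to sanity-check. The Python sketch below recomputes the annual dry-biomass throughput from the stated capacity and operating factor, and the waste-treatment penalty from the two quoted production costs. The short-ton conversion is my assumption, and since the summary does not state the bio-oil yield per ton of biomass, no annual-savings figure is derived.

```python
# Sanity check of the plant figures quoted above. All inputs come from
# the summary; the ton-to-kg conversion assumes US short tons.

DRY_TONS_PER_DAY = 550      # dry-basis biomass feed (of 1100 wet tons/day)
OPERATING_FACTOR = 0.9      # assumed fraction of the year on-stream
KG_PER_TON = 907.185        # US short ton (assumption)

COST_SECONDARY = 1.77       # $/kg bio-oil with secondary waste treatment
COST_PRIMARY = 0.30         # $/kg bio-oil if primary treatment suffices

annual_dry_kg = DRY_TONS_PER_DAY * KG_PER_TON * 365 * OPERATING_FACTOR
treatment_penalty = COST_SECONDARY - COST_PRIMARY   # $1.47/kg, as stated

print(f"annual dry feed: {annual_dry_kg:.2e} kg/yr")
print(f"waste-treatment penalty: ${treatment_penalty:.2f} per kg bio-oil")
```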

    Efficient Uncertainty Quantification Applied to the Aeroelastic Analysis of a Transonic Wing

    The application of a Point-Collocation Non-Intrusive Polynomial Chaos method to the uncertainty quantification of a stochastic transonic aeroelastic wing problem has been demonstrated. The variation in the transient response of the first aeroelastic mode of a three-dimensional wing in transonic flow due to the uncertainty in free-stream Mach number and angle of attack was studied. A curve-fitting procedure was used to obtain a time-independent parameterization of the transient aeroelastic responses. Among the uncertain parameters that characterize the time-dependent transients, the damping factor was chosen for uncertainty quantification, since this parameter can be thought of as an indicator for flutter. Along with the mean and the standard deviation of the damping factor, the probability of flutter for the given uncertainty in the Mach number and the angle of attack was also calculated. Besides the Point-Collocation Non-Intrusive Polynomial Chaos method, 1000 Latin Hypercube Monte Carlo simulations were also performed to quantify the uncertainty in the damping factor. The results obtained for various statistics of the damping factor, including the flutter probability, showed that an 8th-degree Point-Collocation Non-Intrusive Polynomial Chaos expansion is capable of estimating the statistics at the accuracy level of 1000 Latin Hypercube Monte Carlo simulations with a significantly lower computational cost. In addition to the uncertainty quantification, response surface approximation, sensitivity analysis, and reconstruction of the transient response via Non-Intrusive Polynomial Chaos were also demonstrated.
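    For readers unfamiliar with the method, a minimal sketch of a Point-Collocation Non-Intrusive Polynomial Chaos workflow for two uniform uncertain inputs follows. The `model` function is a hypothetical stand-in for the aeroelastic solver's damping-factor output (its coefficients are invented); the Legendre basis and least-squares collocation are the standard ingredients of the technique, not the paper's code.

```python
import numpy as np
from numpy.polynomial.legendre import legval
from itertools import product

# Hypothetical stand-in for the expensive aeroelastic solver: maps the two
# standardized uncertain inputs (Mach, alpha, each scaled to [-1, 1]) to a
# scalar "damping factor". Purely illustrative.
def model(xi1, xi2):
    return 0.05 - 0.12 * xi1 + 0.03 * xi2 + 0.08 * xi1 * xi2 - 0.06 * xi1**2

def legendre_1d(k, x):
    c = np.zeros(k + 1); c[k] = 1.0
    return legval(x, c)          # evaluates the Legendre polynomial P_k(x)

DEGREE = 4                       # total polynomial degree p
# Total-degree multi-indices for n = 2 random variables
idx = [(i, j) for i, j in product(range(DEGREE + 1), repeat=2)
       if i + j <= DEGREE]
n_terms = len(idx)               # (n + p)! / (n! p!) expansion terms

# Collocation points, oversampled 2x relative to the number of unknowns
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(2 * n_terms, 2))
y = model(pts[:, 0], pts[:, 1])

# Least-squares solve for the chaos coefficients
Psi = np.column_stack([legendre_1d(i, pts[:, 0]) * legendre_1d(j, pts[:, 1])
                       for i, j in idx])
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Statistics follow from orthogonality of the basis: for a uniform density
# on [-1, 1], E[P_k^2] = 1 / (2k + 1), and products separate by dimension.
norms = np.array([1.0 / ((2 * i + 1) * (2 * j + 1)) for i, j in idx])
mean = coef[0]
variance = np.sum(coef[1:] ** 2 * norms[1:])
print(f"mean = {mean:.4f}, std = {np.sqrt(variance):.4f}")
```

    Using twice as many collocation points as unknown coefficients makes the system overdetermined; the following abstract examines exactly this oversampling choice.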

    Efficient Sampling for Non-Intrusive Polynomial Chaos Applications with Multiple Uncertain Input Variables

    The accuracy and the computational efficiency of a Point-Collocation Non-Intrusive Polynomial Chaos (NIPC) method applied to stochastic problems with multiple uncertain input variables have been investigated. Two stochastic model problems with multiple uniform random variables were studied to determine the effect of different sampling methods (Random, Latin Hypercube, and Hammersley) for the selection of the collocation points. The effect of the number of collocation points on the accuracy of the polynomial chaos expansions was also investigated. The results of the stochastic model problems show that all three sampling methods exhibit similar performance in terms of the accuracy and the computational efficiency of the chaos expansions. It has been observed that using twice the minimum number of collocation points required gives a better approximation to the statistics at each polynomial degree. This improvement can be attributed to the increase in the accuracy of the polynomial coefficients due to the use of more information in their calculation. The results of the stochastic model problems also indicate that for problems with multiple random variables, improving the accuracy of polynomial chaos coefficients in NIPC approaches may reduce the computational expense by achieving the same accuracy level with a lower-order polynomial expansion. To demonstrate the application of Point-Collocation NIPC to an aerospace problem with multiple uncertain input variables, a stochastic computational aerodynamics problem which includes the numerical simulation of steady, inviscid, transonic flow over a three-dimensional wing with an uncertain free-stream Mach number and angle of attack has been studied. For this study, a 5th-degree Point-Collocation NIPC expansion obtained with Hammersley sampling was capable of estimating the statistics at the accuracy level of 1000 Latin Hypercube Monte Carlo simulations with a significantly lower computational cost.
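    The computational-cost tradeoff discussed above follows from the size of a total-degree chaos expansion, which is (n + p)!/(n! p!) terms for n random variables and degree p. A small sketch of the resulting model-evaluation budgets, assuming nothing beyond that standard combinatorial formula:

```python
from math import comb

def chaos_terms(n_vars: int, degree: int) -> int:
    # Number of terms in a total-degree polynomial chaos expansion:
    # (n + p)! / (n! p!)
    return comb(n_vars + degree, degree)

# Minimum vs. 2x-oversampled collocation points, per the recommendation above
for n in (2, 3, 5):
    for p in (3, 5, 8):
        m = chaos_terms(n, p)
        print(f"n={n} vars, degree {p}: {m} terms -> "
              f"{m} minimum / {2 * m} oversampled model runs")
```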

    Coverage Probability Fails to Ensure Reliable Inference

    Satellite conjunction analysis is the assessment of collision risk during a close encounter between a satellite and another object in orbit. A counterintuitive phenomenon has emerged in the conjunction analysis literature, namely probability dilution, in which lower quality data paradoxically appear to reduce the risk of collision. We show that probability dilution is a symptom of a fundamental deficiency in probabilistic representations of statistical inference, in which there are propositions that will consistently be assigned a high degree of belief, regardless of whether or not they are true. We call this deficiency false confidence. In satellite conjunction analysis, it results in a severe and persistent underestimate of collision risk exposure. We introduce the Martin–Liu validity criterion as a benchmark by which to identify statistical methods that are free from false confidence. Such inferences will necessarily be non-probabilistic. In satellite conjunction analysis, we show that uncertainty ellipsoids satisfy the validity criterion. Performing collision avoidance maneuvers based on ellipsoid overlap will ensure that collision risk is capped at the user-specified level. Further, this investigation into satellite conjunction analysis provides a template for recognizing and resolving false confidence issues as they occur in other problems of statistical inference.
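    Probability dilution is straightforward to reproduce numerically. The Monte Carlo sketch below uses an invented 2D encounter geometry (the miss distance, hard-body radius, and circular Gaussian position uncertainty are illustrative assumptions, not values from the paper): as data quality degrades (sigma grows), the computed collision probability first rises and then falls toward zero, which is the paradox described above.

```python
import numpy as np

rng = np.random.default_rng(1)

MISS = 10.0      # estimated miss distance (km), illustrative
R = 1.0          # combined hard-body radius (km), illustrative
N = 200_000      # Monte Carlo samples per noise level

# Collision probability under circular Gaussian position uncertainty,
# estimated by Monte Carlo as the data quality degrades (sigma grows).
for sigma in (0.5, 2.0, 5.0, 20.0, 100.0):
    pts = rng.normal(loc=(MISS, 0.0), scale=sigma, size=(N, 2))
    p_collide = np.mean(np.hypot(pts[:, 0], pts[:, 1]) < R)
    print(f"sigma = {sigma:6.1f} km -> P(collision) ~ {p_collide:.5f}")
```

    Past a certain noise level, inflating sigma spreads the probability mass so thin that almost none of it lands inside the hard-body disk, so the nominal collision probability shrinks even though the data have become less informative.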

    Forced to Conform? Using Common Processes and Standards to Create Effective eLearning

    Working on multiple large-scale eLearning projects forces teams to try to standardise processes and procedures. Tools such as XML allow us to manipulate and exploit content in ways previously impossible. However, no academic from any discipline likes to imagine that their content is standard, and terms such as 'reuse' and 'repurposing' make academics even less comfortable. And perhaps they are right. This article describes a formalised development methodology created by one eLearning development team based at the University of Oxford, designed as a generic system flexible enough to cope with a wide range of subjects and audiences. This paper also sets this development process in the broader world of academic eLearning development across the disciplines, looking especially at the role of standards, to consider future directions and the applicability of any development methodology to wider learning development contexts.

    Prognostic Value of [18F]-Fluoro-Deoxy-Glucose PET/CT, S100 or MIA for Assessment of Cancer-Associated Mortality in Patients with High Risk Melanoma

    PURPOSE: To assess the prognostic value of FDG PET/CT compared to the tumor markers S100B and melanoma inhibitory activity (MIA) in patients with high-risk melanoma. METHODS: Retrospective study of 125 consecutive patients with high-risk melanoma who underwent FDG PET/CT for re-staging. Diagnostic accuracy and prognostic value were determined for FDG PET/CT as well as for S100B and MIA. Cytological, histological, PET/CT, or MRI follow-up findings, as well as clinical follow-up, were used as the standard of reference. RESULTS: FDG PET/CT was positive in 62 of 125 patients. 37 (29.6%) patients had elevated S100B (>100 pg/ml) and 24 (20.2%) had elevated MIA (>10 pg/ml) values. Overall specificities for FDG PET/CT, S100B, and MIA were 96.8% (95% CI, 89.1% to 99.1%), 85.7% (75.0% to 92.3%), and 95.2% (86.9% to 98.4%); corresponding sensitivities were 96.8% (89.0% to 99.1%), 45.2% (33.4% to 55.5%), and 36.1% (25.2% to 48.6%), respectively. The negative predictive values (NPV) for PET/CT, S100B, and MIA were 96.8% (89.1% to 99.1%), 61.4% (50.9% to 70.9%), and 60.6% (50.8% to 69.7%); the positive predictive values (PPV) were 96.7% (89.0% to 99.1%), 75.7% (59.9% to 86.6%), and 88.0% (70.0% to 95.8%). Patients with elevated S100B or MIA values or positive PET/CT findings showed a significantly (p<0.001 each, univariate Cox regression models) higher risk of melanoma-associated death, increased 4.2-, 6.5-, or 17.2-fold, respectively. CONCLUSION: PET/CT has a higher prognostic power in the assessment of cancer-associated mortality in melanoma patients than S100B or MIA.
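    As a reminder of how the four accuracy measures above are defined, here is a minimal Python sketch. The 2x2 counts passed in at the bottom are hypothetical, chosen only so the outputs land near the reported PET/CT percentages; the abstract reports percentages rather than the raw tables.

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard diagnostic-accuracy measures from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for illustration only (not taken from the study)
print(diagnostic_metrics(tp=60, fp=2, tn=61, fn=2))
```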

    Functional Amyloid Formation within Mammalian Tissue

    Amyloid is a generally insoluble, fibrous cross-β sheet protein aggregate. The process of amyloidogenesis is associated with a variety of neurodegenerative diseases including Alzheimer, Parkinson, and Huntington disease. We report the discovery of an unprecedented functional mammalian amyloid structure generated by the protein Pmel17. This discovery demonstrates that amyloid is a fundamental nonpathological protein fold utilized by organisms from bacteria to humans. We have found that Pmel17 amyloid templates and accelerates the covalent polymerization of reactive small molecules into melanin, a critically important biopolymer that protects against a broad range of cytotoxic insults including UV and oxidative damage. Pmel17 amyloid also appears to play a role in mitigating the toxicity associated with melanin formation by sequestering highly reactive, toxic melanin precursors and minimizing their diffusion out of the melanosome. Intracellular Pmel17 amyloidogenesis is carefully orchestrated by the secretory pathway, utilizing membrane sequestration and proteolytic steps to protect the cell from amyloid and amyloidogenic intermediates that can be toxic. While functional and pathological amyloid share similar structural features, critical differences in packaging and kinetics of assembly enable the use of Pmel17 amyloid for normal function. The discovery of native Pmel17 amyloid in mammals provides key insight into the molecular basis of both melanin formation and amyloid pathology, and demonstrates that native amyloid (amyloidin) may be an ancient, evolutionarily conserved protein quaternary structure underpinning diverse pathways contributing to normal cell and tissue physiology.