
    CFD modeling of multiphase fluidized bed

    CFD predicts, quantitatively, what will happen when fluids flow, often with the added complications of simultaneous heat flow, mass transfer (e.g. perspiration, dissolution), phase change (e.g. melting, freezing, boiling), chemical reaction (e.g. combustion, rusting), mechanical movement (e.g. of pistons, fans, rudders), and stresses in and displacement of immersed or surrounding solids. Knowing how fluids will flow, and what their quantitative effects on the solids they contact will be, helps chemical engineers maximize the yields from their reactors and processing equipment at the least cost and risk. CFD uses a computer to solve the relevant science-based mathematical equations, using information about the circumstances in question. Its components are therefore: the human being who states the problem; scientific knowledge expressed mathematically; the computer code (i.e. software) which embodies this knowledge and expresses the stated problem in scientific terms; and the computer hardware which performs the calculations dictated by the software. Our project assesses the validity of predictions made by CFD software (FLUENT) for three-phase fluidization in a cylindrical bed by comparing them with experimental results obtained in the laboratory. In this way we were able to predict the relationship of pressure drop and bed height to superficial velocity for different bed materials and liquids of different viscosities. For chemical processes where mass transfer is the rate-limiting step, it is important to be able to estimate the gas holdup, as this relates directly to the mass transfer. Although gas holdup in three-phase fluidized beds has received significant attention, most previous work has used air, water and small beads as the gas, liquid and solid respectively. The gas holdup in such systems is often considerably lower than in pilot-plant or industrial-scale units. In our project we have used glycerol solutions of different concentrations to maximize the usefulness of the results.
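    As a rough illustration of the pressure-drop behaviour described above (and not of the authors' FLUENT model), the sketch below uses the textbook Ergun equation for the packed-bed regime and the buoyed-weight plateau above minimum fluidization; all parameter values (bed height, voidage, particle and liquid properties) are illustrative assumptions only.

# Minimal sketch, not the authors' three-phase FLUENT model: the standard
# relation between pressure drop and superficial velocity for a liquid-fluidized
# bed. Below minimum fluidization the Ergun equation applies; above it the
# pressure drop plateaus at the buoyed weight of the bed per unit area.
# All parameter values below are assumed for illustration.

g = 9.81  # m/s^2

def ergun_dp(u, H, eps, dp, rho_f, mu):
    """Ergun pressure drop (Pa) across a packed bed of height H (m) at
    superficial velocity u (m/s), voidage eps, particle diameter dp (m)."""
    viscous = 150.0 * mu * (1.0 - eps) ** 2 * u / (eps ** 3 * dp ** 2)
    inertial = 1.75 * rho_f * (1.0 - eps) * u ** 2 / (eps ** 3 * dp)
    return (viscous + inertial) * H

def bed_weight_dp(H, eps, rho_s, rho_f):
    """Pressure drop (Pa) once fluidized: buoyed weight of the solids per unit area."""
    return (1.0 - eps) * (rho_s - rho_f) * g * H

def pressure_drop(u, H=0.3, eps=0.45, dp=2e-3, rho_s=2500.0, rho_f=1100.0, mu=5e-3):
    """Pressure drop vs superficial velocity: Ergun below minimum fluidization,
    roughly constant (bed weight) above it."""
    return min(ergun_dp(u, H, eps, dp, rho_f, mu), bed_weight_dp(H, eps, rho_s, rho_f))

if __name__ == "__main__":
    for u in [0.001, 0.005, 0.01, 0.02, 0.05, 0.1]:
        print(f"u = {u:.3f} m/s  ->  dP = {pressure_drop(u):8.1f} Pa")

    In this simple picture, raising the liquid viscosity (mu) steepens the Ergun curve, so it meets the bed-weight plateau at a lower superficial velocity, i.e. the minimum fluidization velocity decreases.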

    Process control and data handling in clinical chemistry by a laboratory computer

    The thesis describes the development and assessment of a clinical chemistry computer system based on the Elliott 903C computer obtained for the on-line monitoring of automated equipment and the subsequent processing of the data derived. The special hardware required for interfacing the automated equipment with the computer was designed and constructed by Elliott Medical Automation Limited. All the software required for the operation of the system was written by the manufacturer's programming staff, and my part was to be closely involved with the general systems analysis. A detailed account is given of the evaluation of all the parameters required for the on-line monitoring of AutoAnalyzers and the provision of information required for calculation routines, checking quality control results, defining ranges for the automatic flagging of abnormal results, etc. The development work, including the testing, proving and, where necessary, the modification of programs, was carried out in the Royal Infirmary, Edinburgh, with the assistance of the technical staff of the laboratory. In the initial stages of development the computer system was run in parallel with the existing laboratory equipment to enable a full assessment of the system to be carried out. This included assessing the performance of the process control functions and the chemical acceptability of the system. At a later stage an assessment was made of the routine operation of the computer system, when interest was focused on the time taken to perform individual tasks and the reliability of the hardware components. With the exception of one aspect of peak detection, the data acquisition programs were found to operate in a satisfactory manner, and the accuracy and precision of the computer system were at least as good as those of the routine laboratory methods; the latter involved manual reading and interpretation of recorder charts. The individual data processing programs were validated, but when the programs were integrated to form a total software system, considerable delays in processing were encountered. Despite several attempts to reduce the time taken to perform processing routines, it was found impracticable to carry out the data handling activities of the laboratory within an acceptable time scale using the existing hardware configuration. The computer system is currently in use on a seven-day-week basis for monitoring analytical equipment and performing the following functions: (1) acquisition of raw data from as many as 19 different determinations on up to 12 AutoAnalyzer channels at one time; (2) peak detection and validation; (3) calculation of results after correction for instrumental drift; (4) output of results identified by cup number; (5) calculation of the mean and standard deviation of patient specimens. The present mode of operation removes the need for manual reading of AutoAnalyzer charts, and hence reading errors, but it involves the transcription of results from the computer print-out to manually prepared work sheets, and the further transcription of results from work sheets to patient reports. The benefits derived from the Elliott 903 computer in its present form of operation can be summarised as follows: (1) it has been possible to increase the laboratory throughput without a substantial increase in staff, in spite of an increase in the number of technical staff attending classes of further education during working hours; this has resulted in an increase in productivity and a decrease in the average cost per determination; (2) there is a decrease in the number of human errors through the elimination of the reading of recorder charts; (3) quality control statistics are available while they are still relevant to the current situation. The extension or modification of the hardware configuration and the additional software required to meet the needs of this laboratory have been investigated. Consideration has been given to the possibility of completely replacing the present computer system and to the feasibility of linking the laboratory system to a remotely situated data processing computer system.
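    As a present-day illustration of two of the data-handling functions described above (quality statistics and automatic flagging of abnormal results), the following Python sketch is hypothetical: the linear drift-correction model, reference range and sample values are assumptions, not the thesis's actual algorithms or data.

# Illustrative sketch only: a modern rendition of two of the functions described
# for the Elliott 903C system -- quality statistics (mean and standard deviation
# of patient specimens) and automatic flagging of results outside a defined
# reference range. The drift model and the numbers are assumptions.

from statistics import mean, stdev

def drift_corrected(raw, drift_per_cup):
    """Apply an assumed linear drift correction across the run:
    result_i = raw_i - i * drift_per_cup."""
    return [r - i * drift_per_cup for i, r in enumerate(raw)]

def flag_abnormal(results, low, high):
    """Return (cup number, value, flag) tuples, flagging values outside [low, high]."""
    report = []
    for cup, value in enumerate(results, start=1):
        flag = "" if low <= value <= high else "ABNORMAL"
        report.append((cup, value, flag))
    return report

if __name__ == "__main__":
    # Hypothetical glucose run (mmol/L) from one AutoAnalyzer channel.
    raw = [5.1, 4.8, 9.7, 5.3, 3.1, 5.0]
    results = drift_corrected(raw, drift_per_cup=0.02)
    for cup, value, flag in flag_abnormal(results, low=3.9, high=6.1):
        print(f"cup {cup:2d}: {value:5.2f} {flag}")
    print(f"mean = {mean(results):.2f}, sd = {stdev(results):.2f}")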

    DeSyRe: on-Demand System Reliability

    The DeSyRe project builds on-demand adaptive and reliable Systems-on-Chips (SoCs). As fabrication technology scales down, chips are becoming less reliable, thereby incurring increased power and performance costs for fault tolerance. To make matters worse, power density is becoming a significant limiting factor in SoC design in general. In the face of such changes in the technological landscape, current solutions for fault tolerance are expected to introduce excessive overheads in future systems. Moreover, attempting to design and manufacture a totally defect- and fault-free system would heavily, even prohibitively, impact the design, manufacturing, and testing costs, as well as the system's performance and power consumption. In this context, DeSyRe delivers a new generation of systems that are reliable by design at well-balanced power, performance, and design costs. To reduce the overheads of fault tolerance, only a small fraction of the chip is built to be fault-free. This fault-free part is then employed to manage the remaining fault-prone resources of the SoC. The DeSyRe framework is applied to two medical systems with high safety requirements (measured using the IEC 61508 functional safety standard) and tight power and performance constraints.
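    The following sketch is only a conceptual illustration of the idea that a small fault-free part manages the remaining fault-prone resources; the class and method names are hypothetical and do not correspond to the DeSyRe framework's actual interfaces.

# Conceptual sketch only (not DeSyRe's real API): a small "fault-free" manager
# that tracks which fault-prone processing elements have failed and remaps work
# onto the healthy ones, mirroring the idea that a small reliable fraction of
# the SoC manages the remaining unreliable resources.

class ResourceManager:
    def __init__(self, num_elements):
        self.healthy = set(range(num_elements))  # fault-prone elements still usable

    def report_fault(self, element):
        """Called when on-line testing detects a permanent fault in an element."""
        self.healthy.discard(element)

    def assign(self, tasks):
        """Round-robin the tasks over the currently healthy elements;
        raise if none remain (system-level failure)."""
        if not self.healthy:
            raise RuntimeError("no healthy processing elements left")
        pool = sorted(self.healthy)
        return {task: pool[i % len(pool)] for i, task in enumerate(tasks)}

if __name__ == "__main__":
    mgr = ResourceManager(num_elements=4)
    print(mgr.assign(["filter", "fft", "decode"]))  # uses elements 0, 1, 2
    mgr.report_fault(1)                             # element 1 found faulty
    print(mgr.assign(["filter", "fft", "decode"]))  # remapped onto 0, 2, 3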

    Editorial Special Issue on Enhancement Algorithms, Methodologies and Technology for Spectral Sensing

    This paper is an editorial for the special issue on enhancement algorithms, methodologies and technology for spectral sensing, and serves as a useful reference for researchers and technologists interested in the evolving state of the art and the emerging science and technology base associated with spectral-based sensing and monitoring problems. The issue is particularly relevant to those seeking new and improved solutions for detecting chemical, biological, radiological and explosive threats on land, at sea, and in the air.

    The Algorithmic Origins of Life

    Although it has been notoriously difficult to pin down precisely what it is that makes life so distinctive and remarkable, there is general agreement that its informational aspect is one key property, perhaps the key property. The unique informational narrative of living systems suggests that life may be characterized by context-dependent causal influences, and in particular, that top-down (or downward) causation -- where higher levels influence and constrain the dynamics of lower levels in organizational hierarchies -- may be a major contributor to the hierarchical structure of living systems. Here we propose that the origin of life may correspond to a physical transition associated with a shift in causal structure, where information gains direct and context-dependent causal efficacy over the matter in which it is instantiated. Such a transition may be akin to more traditional physical transitions (e.g. thermodynamic phase transitions), with the crucial distinction that determining which phase (non-life or life) a given system is in requires dynamical information and therefore can only be inferred by identifying causal architecture. We discuss some potential novel research directions based on this hypothesis, including potential measures of such a transition that may be amenable to laboratory study, and how the proposed mechanism corresponds to the onset of the unique mode of (algorithmic) information processing characteristic of living systems.