
    MARKETING MECHANISMS TO FACILITATE CO-EXISTENCE OF GM AND NON-GM CROPS

    Development of genetically modified (GM) and specialty crops has had a great impact on the grain handling industry in recent years. Added costs associated with handling these crops have become an important issue for grain handlers. For this study, data were collected from a survey of elevators in the Upper Midwest. The information focused on segregation practices, time requirements, and costs. This study shows the different costs (grading and handling) associated with segregation practices at the grain-handler level. The results revealed that the cost of modifying systems to handle GM crops is of major importance. A stochastic simulation model of an engineering cost function is developed to analyze costs for segregation and testing using results from the survey. Assuming no modification is required, the total cost of segregation is about 10 cents per bushel. The volume of grain tested also impacts the total segregation cost per bushel. Finally, the gross elevator margin and the premium for quality seem to be large enough to offset the increase in handling costs due to these new segregation practices.
    Keywords: genetically modified crops, identity preservation, segregation, Crop Production/Industries
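    To make the stochastic cost model concrete, the sketch below simulates a per-bushel segregation cost as the sum of grading, handling, and testing components, with the per-test cost diluted over the tested lot volume. It is a minimal illustration of the approach; all distributions and parameter values are invented for the example and are not figures from the study.

```python
# Minimal sketch of a stochastic per-bushel segregation cost simulation,
# in the spirit of the engineering cost function described above.
# All parameter values and distributions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # simulated elevator scenarios

# Assumed cost components (cents per bushel), drawn from plausible ranges.
grading = rng.triangular(1.0, 2.0, 4.0, n)    # sampling/grading labor
handling = rng.triangular(3.0, 5.0, 8.0, n)   # extra binning, cleanout, idle time
test_cost = rng.triangular(0.5, 1.5, 3.0, n)  # per-test cost, spread over a lot

# Testing cost per bushel falls as the tested lot volume grows.
lot_bushels = rng.uniform(5_000, 50_000, n)
testing = test_cost * 10_000 / lot_bushels

total = grading + handling + testing
print(f"mean {total.mean():.1f} cents/bu, "
      f"90% interval [{np.percentile(total, 5):.1f}, {np.percentile(total, 95):.1f}]")
```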

    Suspended microchannel resonators for ultralow volume universal detection

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2008. Includes bibliographical references (leaves 32-33). Universal detectors that maintain high sensitivity as the detection volume is reduced to the sub-nanoliter scale can enhance the utility of miniaturized total analysis systems (µ-TAS). Here the unique scaling properties of the suspended microchannel resonator (SMR) are exploited to show universal detection in a 10 pL analysis volume with a density detection limit of ~1 µg/cm³ (10 Hz bandwidth) and a linear dynamic range of six decades. Analytes with low UV extinction coefficients such as polyethylene glycol (PEG) 8 kDa, glucose, and glycine are measured with molar detection limits of 0.66 µM, 13.5 µM, and 31.6 µM, respectively. To demonstrate the potential for real-time monitoring, gel filtration chromatography was used to separate different molecular weights of PEG as the SMR acquired a chromatogram by measuring the eluate density. This work suggests that the SMR could offer a simple and sensitive universal detector for various separation systems, from liquid chromatography to capillary electrophoresis. Moreover, since the SMR is itself a microfluidic channel, it can be directly integrated into µ-TAS without compromising overall performance. By Sungmin Son. S.M.
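    The density measurement underlying this detection scheme follows from the resonator physics: with effective stiffness k, dry effective mass m*, and channel volume V, the resonance frequency is f = (1/2π)·sqrt(k/(m* + ρV)), so the fluid density ρ is linear in 1/f². The hedged sketch below converts frequency to density via a two-point calibration; the calibration frequencies and fluids are illustrative assumptions, not values from the thesis.

```python
# Sketch: converting an SMR resonance frequency to fluid density.
# Since density is linear in 1/f^2, two calibration fluids fix the line.
import numpy as np

def density_from_frequency(f_hz, cal_freqs, cal_densities):
    """Linear fit of density vs 1/f^2 from two (or more) calibration fluids."""
    x = 1.0 / np.asarray(cal_freqs, float) ** 2
    slope, intercept = np.polyfit(x, cal_densities, 1)
    return slope / f_hz**2 + intercept

# Illustrative calibration: resonance with air- vs water-filled channel.
cal_f = [220_150.0, 209_800.0]   # Hz (hypothetical resonance frequencies)
cal_rho = [0.0012, 0.9982]       # g/cm^3 at room temperature

print(density_from_frequency(213_000.0, cal_f, cal_rho))  # density of an unknown
```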

    Hearing the Hidden Agenda: The Ethnographic Investigation of Procedure

    Laser Doppler flowmetry (LDF) is virtually the only non-invasive technique, except for other laser speckle based techniques, that enables estimation of the microcirculatory blood flow. The technique was introduced into the field of biomedical engineering in the 1970s, and rapid development followed during the 1980s with fiber based systems and improved signal analysis. The first imaging systems were presented in the beginning of the 1990s. Conventional LDF, although unique in many aspects and elegant as a method, is accompanied by a number of limitations that may have reduced the clinical impact of the technique. The analysis model published by Bonner and Nossal in 1981, which is the basis for conventional LDF, suffers from measurements given in arbitrary and relative units, an unknown and non-constant measurement volume, non-linearities at increased blood tissue fractions, and a relative average velocity estimate.

    In this thesis a new LDF analysis method, quantitative LDF, is presented. The method is based on recent models for light-tissue interaction, comprising the current knowledge of tissue structure and optical properties, making it fundamentally different from the Bonner and Nossal model. Furthermore, and most importantly, the method eliminates or greatly reduces the limitations mentioned above. Central to quantitative LDF are Monte Carlo (MC) simulations of light transport in tissue models, including multiple Doppler shifts by red blood cells (RBCs). MC was used in a first proof-of-concept study where the principles of quantitative LDF were tested using plastic flow phantoms. An optically and physiologically relevant skin model suitable for MC was then developed. MC simulations of that model, as well as of homogeneous tissue-relevant models, were used to evaluate the measurement depth and volume of conventional LDF systems. Moreover, a variance reduction technique was presented that reduces simulation times by orders of magnitude for imaging-based MC setups.

    The principle of the quantitative LDF method is to solve the inverse problem of matching measured and calculated Doppler power spectra at two different source-detector separations. The forward problem of calculating the Doppler power spectra from a model is solved by mixing optical Doppler spectra, based on the scattering phase functions and the velocity distribution of the RBCs, from various layers in the model and for various numbers of Doppler shifts. The Doppler shift distribution is calculated from the scattering coefficient of the RBCs and the path length distribution of the photons in the model, where the latter is given by a few basal MC simulations. When a proper spectral match is found, via iterative updates of the model parameters, the absolute measurement data are given directly from the model. The concentration is given in g RBC/100 g tissue, velocities in mm/s, and perfusion in g RBC/100 g tissue × mm/s. The RBC perfusion is separated into three velocity regions: below 1 mm/s, between 1 and 10 mm/s, and above 10 mm/s. Furthermore, the measures are given for a constant output volume of a 3 mm³ half-sphere, i.e. within 1.13 mm from the light emitting fiber of the measurement probe.

    The quantitative LDF method was used in a study on microcirculatory changes in type 2 diabetes. It was concluded that the perfusion response to a local increase in skin temperature, a response that is reduced in diabetes, is a process involving only intermediate and high flow velocities and thus relatively large vessels in the microcirculation. The increased flow at higher velocities was expected, but could not previously be demonstrated with conventional LDF. The lack of increase in low-velocity flow indicates a normal metabolic demand during heating. Furthermore, a correlation between the perfusion at low and intermediate flow velocities and diabetes duration was found. Interestingly, these correlations were opposite in sign (negative for the low-velocity region and positive for the intermediate-velocity region). This finding is well in line with the increased shunt flow and reduced nutritive capillary flow that have previously been observed in diabetes.
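    The forward step described above, mixing spectra over the distribution of Doppler shift counts, can be sketched compactly. In the toy version below the single-shift spectrum is a Gaussian stand-in and the shift counts are assumed Poisson distributed, with the n-fold shifted spectrum given by the n-fold self-convolution of the single-shift spectrum. This is a simplified illustration of the principle, not the thesis' implementation.

```python
# Toy forward model: compose a Doppler power spectrum from n-fold shifted
# spectra weighted by an assumed Poisson distribution of shift counts.
import numpy as np
from scipy.stats import poisson

f = np.linspace(-20e3, 20e3, 2001)   # symmetric Doppler frequency axis [Hz]
df = f[1] - f[0]

def single_shift_psd(sigma=2e3):
    """Toy single-shift optical Doppler spectrum (Gaussian stand-in)."""
    return np.exp(-0.5 * (f / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def forward_spectrum(mean_shifts, n_max=10):
    """P(f) = sum_n Poisson(n; m) * (n-fold self-convolution of the
    single-shift spectrum); n = 0 is the unshifted (delta) line."""
    single = single_shift_psd()
    current = np.zeros_like(f)
    current[len(f) // 2] = 1.0 / df          # delta at f = 0
    total = np.zeros_like(f)
    for n in range(n_max + 1):
        total += poisson.pmf(n, mean_shifts) * current
        current = np.convolve(current, single, mode="same") * df
    return total

spectrum = forward_spectrum(mean_shifts=1.5)
print(spectrum.sum() * df)                   # ~1: total power is conserved
```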

    Regenerative fuel cell energy storage system for a low earth orbit space station

    A study was conducted to define the characteristics of a Regenerative Fuel Cell System (RFCS) for low earth orbit Space Station missions. The RFCSs were defined and characterized based on both an alkaline electrolyte fuel cell integrated with an alkaline electrolyte water electrolyzer and an alkaline electrolyte fuel cell integrated with an acid solid polymer electrolyte (SPE) water electrolyzer. The study defined the operating characteristics of the systems, including system weight, volume, and efficiency. A maintenance philosophy was defined, and the implications of system reliability requirements and modularization were determined. Finally, an Engineering Model System (EMS) was defined, and a program to develop and demonstrate the EMS, along with pacing technology items that should be developed in parallel with the EMS, was identified. The specific weight of an optimized RFCS operating at 140 °F was defined as a function of system efficiency for a range of module sizes. An EMS operating at a nominal temperature of 180 °F and capable of delivering 10 kW at an overall efficiency of 55.4 percent is described. A program to develop the EMS is described, including a technology development effort for pacing technology items.
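    The delivered power and round-trip efficiency quoted above are enough for a back-of-envelope energy balance over one orbit. The sketch below sizes the electrolyzer input power needed during the sunlit portion; the orbit and eclipse durations are typical LEO assumptions, not values from the study.

```python
# Back-of-envelope RFCS sizing for one low earth orbit, using the
# 10 kW / 55.4% figures from the abstract. Orbit timing is an assumption.
orbit_min, eclipse_min = 94.0, 36.0   # assumed LEO orbit and eclipse durations
sunlit_min = orbit_min - eclipse_min

p_load_kw = 10.0           # fuel-cell power delivered during eclipse
eta_round_trip = 0.554     # overall RFCS efficiency from the abstract

energy_out_kwh = p_load_kw * eclipse_min / 60.0
energy_in_kwh = energy_out_kwh / eta_round_trip   # electrolyzer input in sunlight
p_electrolyzer_kw = energy_in_kwh / (sunlit_min / 60.0)

print(f"eclipse energy: {energy_out_kwh:.2f} kWh, "
      f"electrolyzer input power: {p_electrolyzer_kw:.1f} kW")
```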

    CLOSED-LOOP CONVEYOR SYSTEMS WITH MULTIPLE POISSON INPUT AND MULTIPLE SERVERS.

    Dept. of Industrial and Manufacturing Systems Engineering. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis1976 .S38. Source: Dissertation Abstracts International, Volume: 37-10, Section: B, page: 5279. Thesis (Ph.D.)--University of Windsor (Canada), 1976

    Development of a method for reliable power input measurements in conventional and single-use stirred bioreactors at laboratory scale

    Power input is an important engineering and scale-up/down criterion in stirred bioreactors. However, reliably measuring power input in laboratory-scale systems is still challenging. Even though torque measurements have proven suitable in pilot-scale systems, sensor accuracy, resolution, and errors from relatively high levels of friction inside bearings can become limiting factors at smaller scales. An experimental setup for power input measurements was developed in this study, focusing on stainless steel and single-use bioreactors in the single-digit volume range. The friction losses inside the air bearings were effectively reduced to less than 0.5% of the measurement range of the torque meter. A comparison of dimensionless power numbers determined for a reference Rushton turbine stirrer (N_P = 4.17 ± 0.14 for fully turbulent conditions) revealed good agreement with literature data. Hence, the power numbers of several reusable and single-use bioreactors could be determined over a wide range of Reynolds numbers, from 100 to above 10⁴. Power numbers between 0.3 and 4.5 (for Re = 10⁴) were determined for the different systems. The rigid plastic vessels showed similar power characteristics to their reusable counterparts. Thus, it was demonstrated that the torque-based technique can be used to reliably measure power input in stirred reusable and single-use bioreactors at the laboratory scale.
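    The torque-based evaluation reduces to two standard definitions: P = 2πnM for the power drawn from the measured (friction-corrected) torque M, and N_P = P/(ρn³d⁵) for the dimensionless power number. A minimal sketch, with assumed variable names and an assumed friction-correction step:

```python
# Standard torque-based power-number calculation. The friction correction
# and all example values are assumptions about a typical workflow.
import math

def power_number(torque_nm, friction_torque_nm, n_rps, density_kg_m3, d_m):
    """N_P = P / (rho * n^3 * d^5), with P = 2*pi*n*(M - M_friction)."""
    p_watt = 2.0 * math.pi * n_rps * (torque_nm - friction_torque_nm)
    return p_watt / (density_kg_m3 * n_rps**3 * d_m**5)

def reynolds(n_rps, d_m, density_kg_m3, viscosity_pa_s):
    """Stirrer Reynolds number Re = rho * n * d^2 / mu."""
    return density_kg_m3 * n_rps * d_m**2 / viscosity_pa_s

# Illustrative values: small Rushton turbine, d = 46 mm, water, 600 rpm.
n = 600 / 60.0
print(power_number(0.012, 0.0001, n, 998.0, 0.046),
      reynolds(n, 0.046, 998.0, 0.001))
```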

    CORE Hinge Testing Phase results of the Delft Deployable Space Telescope

    The Deployable Space Telescope (DST), being developed at the Delft University of Technology, aims to drastically reduce volume and mass by using innovative deployable optics. The DST overall systems design is driven by a strict bottom-up versus top-down systems engineering approach. One of the critical subsystems is the secondary mirror (M2) support: its position must be accurate to 10 µm and stable to sub-micron levels. To support this critical budget, the first key DST hardware to be developed and tested is a machined COmpliant Rolling Element (CORE) hinge design. The main benefit of the hinge is its very low hysteresis, which translates to good deployment repeatability and eliminates micro-dynamic instabilities. The hysteresis was measured experimentally using Digital Image Correlation (DIC), a technique proven to resolve displacements down to 100 nm. Different strip configurations were tested to empirically optimise the hinge. The maximum hysteresis found was 0.3 µm with load cycles up to 400 N.
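    One plausible way to reduce DIC-tracked load-displacement cycles to a single hysteresis figure is to interpolate the loading and unloading branches onto a common load axis and take the largest displacement gap. The sketch below does exactly that on synthetic data; it is an assumed post-processing approach, not the authors' procedure.

```python
# Hysteresis from one load-displacement cycle: largest displacement gap
# between loading and unloading branches at equal load. Data are synthetic.
import numpy as np

def hysteresis_um(load_n, disp_um):
    """Max displacement gap between loading and unloading at equal load."""
    i_peak = int(np.argmax(load_n))
    up_l, up_d = load_n[:i_peak + 1], disp_um[:i_peak + 1]
    dn_l, dn_d = load_n[i_peak + 1:][::-1], disp_um[i_peak + 1:][::-1]
    common = np.linspace(max(up_l.min(), dn_l.min()),
                         min(up_l.max(), dn_l.max()), 200)
    gap = np.interp(common, dn_l, dn_d) - np.interp(common, up_l, up_d)
    return float(np.abs(gap).max())

# Synthetic 0 -> 400 N -> 0 cycle with a 0.25 µm offset on unloading.
F = np.concatenate([np.linspace(0, 400, 50), np.linspace(400, 0, 50)])
x = F / 50.0 + np.where(np.arange(100) >= 50, 0.25, 0.0)   # µm
print(f"hysteresis ≈ {hysteresis_um(F, x):.2f} µm")
```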

    Geothermal systems simulation: A case study

    Geothermal reservoir simulation is a key step in developing sustainable and efficient strategies for the exploitation of geothermal resources. It is applied in the assessment of several areas of reservoir engineering, such as reservoir performance and re-injection programs, pressure decline during depletion, phase transition conditions, and the natural evolution of hydrothermal convection systems. Fluid flow and heat transfer in rock masses, fluid-rock chemical interaction, and rock mass deformation are some of the processes addressed in reservoir modelling. The case study of the Las Tres Virgenes (LTV) geothermal field (10 MWe), Baja California Sur, Mexico, is presented. Three-dimensional (3D) natural-state simulations were carried out from the emplacement and cooling of two spherical magma chambers using a conductive approach. A conceptual model of the volcanic system was developed on a lithostratigraphic and geochronological basis. Magma chamber volumes were established from eruptive volume estimations. The thermophysical properties of the medium were initially assumed to correspond to the dominant rock in each lithological unit, and further calibration was made considering histograms of experimentally obtained thermophysical properties of rocks. As the boundaries of the model lie far from the thermal anomaly, specified-temperature boundaries were assumed. A Finite Volume (FV) numerical scheme was implemented in a Fortran 90 code to solve the heat equation. Static formation temperatures from well logs were used for validation of the numerical results. Good agreement was observed in those geothermal wells dominated by conductive heat transfer. For other wells, however, conduction alone cannot explain the observed behaviour; three-dimensional convective models are being implemented for future multiphysics simulations.
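    As a toy version of the conductive natural-state computation, the sketch below solves the 1D transient heat equation with a finite-volume discretization and fixed-temperature boundaries, marching toward the steady conductive profile. The study's solver is a 3D Fortran 90 code; this Python fragment, with invented grid and rock properties, only illustrates the FV scheme.

```python
# Minimal 1D finite-volume conduction sketch with specified-temperature
# boundaries. Grid, rock properties, and temperatures are illustrative.
import numpy as np

nz, depth = 100, 5_000.0          # cells, domain depth [m]
dz = depth / nz
k, rho, cp = 2.5, 2700.0, 1000.0  # conductivity, density, heat capacity
alpha = k / (rho * cp)            # thermal diffusivity [m^2/s]

T = np.full(nz, 25.0)             # initial temperature [°C]
T_surface, T_bottom = 25.0, 600.0 # fixed-temperature boundaries
dt = 0.25 * dz**2 / alpha         # explicit stability limit (boundary cells)

for _ in range(50_000):           # march toward the steady conductive state
    Tp = np.empty_like(T)
    # interior cells: flux balance across the two faces of each volume
    Tp[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2*T[1:-1] + T[:-2])
    # boundary cells: half-cell distance to the fixed-temperature face
    Tp[0] = T[0] + alpha * dt / dz**2 * (T[1] - 3*T[0] + 2*T_surface)
    Tp[-1] = T[-1] + alpha * dt / dz**2 * (2*T_bottom - 3*T[-1] + T[-2])
    T = Tp

print(T[::10].round(1))           # approaches the linear conductive profile
```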

    The medical science DMZ: a network design pattern for data-intensive medical science

    Objective: We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations.
    Materials and Methods: High-end networking, packet-filter firewalls, network intrusion-detection systems.
    Results: We describe a “Medical Science DMZ” concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs.
    Discussion: The exponentially increasing amounts of “omics” data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research “Big Data.” The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows.
    Conclusion: By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements.
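    The packet-filter element of the design can be illustrated with a toy stateless rule check: fixed allow-rules for known research data flows, applied per packet with no connection state, and default-deny otherwise. The networks, ports, and rule structure below are hypothetical examples, not a recommended production configuration.

```python
# Purely illustrative stateless packet filter in the spirit of a Science
# DMZ router ACL. All networks and ports are hypothetical examples.
import ipaddress

ALLOW = [  # (src network, dst network, protocol, dst port)
    ("192.0.2.0/24", "198.51.100.0/24", "tcp", 2811),   # e.g. a transfer-control flow
    ("192.0.2.0/24", "198.51.100.0/24", "tcp", 50000),  # e.g. a bulk-data channel
]

def permitted(src_ip, dst_ip, proto, dport):
    """Per-packet check against fixed rules; no connection state is kept."""
    for src_net, dst_net, p, port in ALLOW:
        if (proto == p and dport == port
                and ipaddress.ip_address(src_ip) in ipaddress.ip_network(src_net)
                and ipaddress.ip_address(dst_ip) in ipaddress.ip_network(dst_net)):
            return True
    return False  # default-deny, as in a router ACL

print(permitted("192.0.2.10", "198.51.100.7", "tcp", 2811))  # True
```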

    Mathematical Programming Model for Procurement Selection in Water Irrigation Systems. A Case Study

    Tools that optimize processes and help management secure margins are widely used in industrial manufacturing. Water network management is no stranger to this need. Optimization of water resources is currently performed in large basins, but it is not general practice in irrigation networks that operate as water distribution companies supplying farmers' demand. At present this management is not optimized and costs are not minimized. This research introduces a mathematical programming model to optimize the replenishment process in a local irrigation network, deciding what volume is procured (source, quantity, and timetable) as well as what volume is stored, while minimizing the total costs involved. The final objective is to improve the sustainability of water systems. Use of this tool reduces water costs by 52.2% and enables the necessary source and the electrical schedule to be defined over the year. This optimizes the operation of the water system and reduces the water price from 0.23 €/m³ (current water management) to 0.11 €/m³ (proposed model). Pérez-Sánchez, M.; Díaz-Madroñero Boluda, F. M.; López Jiménez, P. A.; Mula, J. (2017). Mathematical Programming Model for Procurement Selection in Water Irrigation Systems. A Case Study. Journal of Engineering Science and Technology Review (Online), 10(6):146-153. doi:10.25103/jestr.106.17
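    The procurement model lends itself to a compact linear-programming sketch: per-period purchase variables for each source plus a storage variable, a water-balance constraint per period, and capacity bounds. The version below, with invented costs, demands, and capacities, is a minimal illustration of the formulation rather than the paper's model.

```python
# Toy multi-period water procurement LP: buy from two sources per period,
# optionally carry stock, minimize total cost. All data are illustrative.
from scipy.optimize import linprog
import numpy as np

T = 4                                        # planning periods
demand = np.array([80., 120., 150., 90.])    # 10^3 m^3 per period
cost = {"well": 0.20, "canal": 0.09}         # €/m^3 (canal cheaper but capped)
cap = {"well": 200., "canal": 70.}           # per-period source capacity
store_cap, store_cost = 60., 0.01            # reservoir size and holding cost

# Variables per period: [well_t, canal_t, stock_t] -> 3*T variables.
c = np.tile([cost["well"], cost["canal"], store_cost], T)
A_eq, b_eq = [], []
for t in range(T):   # balance: buys_t + stock_{t-1} - stock_t = demand_t
    row = np.zeros(3 * T)
    row[3*t], row[3*t + 1], row[3*t + 2] = 1., 1., -1.
    if t > 0:
        row[3*(t - 1) + 2] = 1.
    A_eq.append(row); b_eq.append(demand[t])
bounds = [(0, cap["well"]), (0, cap["canal"]), (0, store_cap)] * T

res = linprog(c, A_eq=np.array(A_eq), b_eq=b_eq, bounds=bounds)
print(res.x.reshape(T, 3).round(1), f"total cost: {res.fun:.1f} k€")
```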