
    Modeling of grating assisted standing wave microresonators for filter applications in integrated optics

    A wide, multimode segment of a dielectric optical waveguide, enclosed by Bragg reflectors and evanescently coupled to adjacent port waveguides, can constitute the cavity in an integrated optical microresonator. It turns out that the device can be described adequately by an approximate coupled mode theory model that involves only a few guided modes as basis fields. Reasoning from the coupled mode model, we motivate a simple design strategy for the resonator device. Rigorous two-dimensional mode expansion simulations are applied to verify the predictions of the approximate model. The results exemplify the specific spectral response of the standing wave resonators. As refinements, we discuss the single resonance of a device with nonsymmetrically detuned Bragg reflectors, and the cascading of two Fabry-Perot cavities, where the coupling across an intermediate shorter grating region establishes a power transfer characteristic that is suitable for an add-drop filter.
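    The abstract does not reproduce its model, but the basic spectral behavior of a single grating-bounded Fabry-Perot cavity can be sketched with the textbook Airy transmission function of an idealized lossless resonator. This is a minimal stand-in for the paper's rigorous mode-expansion simulations; the function name and parameter values are illustrative assumptions.

    ```python
    import math

    def fabry_perot_transmission(R, delta):
        """Airy transmission of an ideal lossless Fabry-Perot cavity.

        R     -- mirror (Bragg reflector) power reflectance, 0 <= R < 1
        delta -- round-trip phase accumulated in the cavity (radians)
        """
        F = 4.0 * R / (1.0 - R) ** 2          # coefficient of finesse
        return 1.0 / (1.0 + F * math.sin(delta / 2.0) ** 2)

    # On resonance (delta = 2*pi*m) the ideal cavity transmits fully; between
    # resonances transmission is strongly suppressed for reflective gratings.
    print(fabry_perot_transmission(0.9, 0.0))      # 1.0
    print(fabry_perot_transmission(0.9, math.pi))  # strongly suppressed
    ```

    Sharper resonances (higher finesse) follow directly from more strongly reflecting Bragg gratings, which is the qualitative trade-off the design strategy in the paper has to navigate.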

    Skylab mission planning support through the use of a hybrid simulation

    The manner in which a hybrid simulation was used in support of Skylab operations in the area of dynamics and control is described. Simulation results were used in the development of acceptable vehicle maneuvers and in the verification of acceptability when the maneuvers were integrated into daily flight plans. The criterion of acceptability was based on vehicle controllability and the minimization of thruster system propellant usage. A simulation of a representative daily flight plan containing three experimental maneuvers is included, along with thruster attitude control system propellant usage tables that show predicted and actual usage for each mission. The inherent characteristics of quick turnaround and flexibility afforded by the hybrid computer proved invaluable in the operations support required throughout the Skylab mission.

    Chandra X-ray Observations of Galaxies in an Off-Center Region of the Coma Cluster

    We have performed a pilot Chandra survey of an off-center region of the Coma cluster to explore the X-ray properties and Luminosity Function of normal galaxies. We present results on 13 Chandra-detected galaxies with optical photometric matches, including four spectroscopically confirmed Coma-member galaxies. All seven spectroscopically confirmed giant Coma galaxies in this field have detections or limits consistent with low X-ray to optical flux ratios (fX/fR < 10^-3). We do not have sufficient numbers of X-ray detected galaxies to directly measure the galaxy X-ray Luminosity Function (XLF). However, since we have a well-measured optical LF, we use this low X-ray to optical flux ratio for the seven spectroscopically confirmed galaxies to translate the optical LF to an XLF. We find good agreement with Finoguenov et al. (2004), indicating that the X-ray emission per unit optical flux per galaxy is suppressed in clusters of galaxies, and we extend this work to a specific off-center environment in the Coma cluster. Finally, we report the discovery of a region of diffuse X-ray flux which might correspond to a small group interacting with the Coma Intra-Cluster Medium (ICM). Comment: Accepted for publication in the Astrophysical Journal.

    Computer-assisted ex vivo, normothermic small bowel perfusion

    Background: In the present study, a technique for computer-assisted, normothermic, oxygenated, ex vivo, recirculating small bowel perfusion was established as a tool to investigate organ pretreatment protocols and ischemia/reperfusion phenomena. A prerequisite for the desired setup was an organ chamber for ex vivo perfusion and the use of syngeneic whole blood as perfusate. Methods: The entire small bowel was harvested from Lewis rats and perfused in an organ chamber ex vivo for at least 2 h. The temperature was kept at 37 degrees C in a water bath. Three experimental groups were explored, characterized by different perfusion solutions. The basic perfusate consisted of syngeneic whole blood diluted with either NaCl, Krebs' solution, or Krebs' solution and norepinephrine to a hematocrit of 30%. In addition, in each group l-glutamine was administered intraluminally. The desired perfusion pressure was 100 mm Hg, which was kept constant by computer-assisted data acquisition software that measured on-line pressure, oxygenation, flow, temperature and pH and adjusted the pressure by changing the flow via a peristaltic pump. The viability of the preparation was tested by measuring oxygen consumption and maltose absorption, which requires intact enzymes of the mucosal brush border to break down maltose into glucose. Results: Organ perfusion in group 1 (dilution with NaCl) revealed problems such as hypersecretion into the bowel lumen, low vascular resistance and no maltose uptake. In contrast, a viable organ could be demonstrated using Krebs' solution as the dilution solution. The addition of norepinephrine led to an improved perfusion over the entire perfusion period. Maltose absorption was comparable to tests conducted with native small bowel. Oxygen consumption was stable during the 2-hour perfusion period. Conclusions: The ex vivo perfusion system established enables small bowel perfusion for at least 2 h. The viability of the graft could be demonstrated. The perfusion time achieved is sufficient to study leukocyte/lymphocyte interaction with the endothelium of the graft vessels. In addition, a viable small bowel, after 2 h of ex vivo perfusion, facilitates testing of pretreatment protocols for the reduction of the immunogenicity of small bowel allografts. Copyright (C) 2000 S. Karger AG, Basel.
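    The abstract describes software that holds perfusion pressure at a 100 mm Hg setpoint by adjusting pump flow. The actual control law is not given; the sketch below shows one plausible shape for such a loop, a single proportional-control step, with an entirely hypothetical gain and function name.

    ```python
    def adjust_flow(flow, pressure, setpoint=100.0, gain=0.05):
        """One proportional control step for a perfusion circuit: reduce pump
        flow when measured pressure exceeds the setpoint, raise it when the
        pressure falls short.

        flow     -- current peristaltic pump flow (mL/min)
        pressure -- measured line pressure (mm Hg)
        gain     -- proportional gain (mL/min per mm Hg); hypothetical value
        """
        error = setpoint - pressure
        return max(0.0, flow + gain * error)  # flow cannot go negative

    # A reading of 110 mm Hg lowers the flow; 90 mm Hg raises it.
    print(adjust_flow(10.0, 110.0))  # 9.5
    print(adjust_flow(10.0, 90.0))   # 10.5
    ```

    A real controller would also need integral action to remove steady-state error and filtering of the pressure signal, but the proportional step captures the feedback idea the abstract relies on.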

    Ascent control studies of the 049 and ATP parallel burn solid rocket motor shuttle configurations

    Selection of a control authority approach is a major problem for the parallel burn solid rocket motor shuttle configuration because of the many system impacts that result regardless of the approach chosen. The major trade studies and their results, which led to the recommendation of an SRB TVC control authority approach, are presented.

    Classifying seismic waveforms from scratch: a case study in the alpine environment

    Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step for successfully analyzing those data is the correct detection of various event types. However, the visual scanning process is a time-consuming task. Applying standard detection techniques such as the STA/LTA trigger still requires manual review for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows classification to start from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action makes it possible to learn classifier properties from a single waveform example and some hours of background recording. Besides reducing the required workload, this also enables the detection of very rare events. Especially the latter feature is a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system using a data set from the Swiss Seismological Survey, achieving very high recognition rates. In detail, we document all refinements of the classifier, providing a step-by-step guide for the fast setup of a well-working classification system.
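    The core operation behind this kind of classifier is scoring an observation sequence under each class's hidden Markov model and picking the class with the highest likelihood. A minimal NumPy sketch of the forward algorithm (with per-step normalization for numerical stability) is shown below; the toy two-state parameters are invented for illustration and have nothing to do with the paper's trained seismic models.

    ```python
    import numpy as np

    def forward_loglik(obs, pi, A, B):
        """Log-likelihood of a discrete observation sequence under an HMM,
        computed with the forward algorithm.

        obs -- sequence of observation symbol indices
        pi  -- initial state distribution, shape (S,)
        A   -- state transition matrix, shape (S, S)
        B   -- emission probability matrix, shape (S, V)
        """
        alpha = pi * B[:, obs[0]]          # joint prob of state and first symbol
        loglik = np.log(alpha.sum())
        alpha /= alpha.sum()               # normalize to avoid underflow
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]  # propagate, then weight by emission
            loglik += np.log(alpha.sum())
            alpha /= alpha.sum()
        return loglik

    # Toy two-state model. In practice one HMM is trained per event class and a
    # discretized waveform is assigned to the class with the highest likelihood.
    pi = np.array([0.6, 0.4])
    A  = np.array([[0.7, 0.3],
                   [0.4, 0.6]])
    B  = np.array([[0.9, 0.1],
                   [0.2, 0.8]])
    print(forward_loglik([0, 1, 0], pi, A, B))
    ```

    Real implementations (e.g. continuous-emission HMMs over spectral features) replace the discrete emission matrix with Gaussian mixtures, but the scoring logic is the same.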

    Beyond deficit-based models of learners' cognition: Interpreting engineering students' difficulties with sense-making in terms of fine-grained epistemological and conceptual dynamics

    Researchers have argued against deficit-based explanations of students' troubles with mathematical sense-making, pointing instead to factors such as epistemology: students' beliefs about knowledge and learning can hinder them from activating and integrating productive knowledge they have. In this case study of an engineering major solving problems (about content from his introductory physics course) during a clinical interview, we show that "Jim" has all the mathematical and conceptual knowledge he would need to solve a hydrostatic pressure problem that we posed to him. But he reaches and sticks with an incorrect answer that violates common sense. We argue that his lack of mathematical sense-making (specifically, translating and reconciling between mathematical and everyday/common-sense reasoning) stems in part from his epistemological views, i.e., his views about the nature of knowledge and learning. He regards mathematical equations as much more trustworthy than everyday reasoning, and he does not view mathematical equations as expressing meaning that tractably connects to common sense. For these reasons, he does not view reconciling between common sense and mathematical formalism as either necessary or plausible to accomplish. We, however, avoid a potential "deficit trap" (substituting an epistemological deficit for a concepts/skills deficit) by incorporating multiple, context-dependent epistemological stances into Jim's cognitive dynamics. We argue that Jim's epistemological stance contains productive seeds that instructors could build upon to support Jim's mathematical sense-making: he does see common sense as connected to formalism (though not always tractably so), and in some circumstances this connection is both salient and valued. Comment: Submitted to the Journal of Engineering Education.
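    The abstract does not state the exact interview problem, but the physics content it names, hydrostatic pressure, comes down to the standard relation p = p0 + rho * g * h. The snippet below is purely illustrative of that relation, not a reconstruction of the problem posed to Jim.

    ```python
    def hydrostatic_pressure(depth_m, p0=101325.0, rho=1000.0, g=9.81):
        """Absolute pressure (Pa) at a given depth in a fluid:
        p = p0 + rho * g * h, with surface pressure p0, fluid density rho
        (kg/m^3) and gravitational acceleration g (m/s^2)."""
        return p0 + rho * g * depth_m

    # At 10 m depth in water, pressure roughly doubles relative to 1 atm.
    print(hydrostatic_pressure(10.0))  # ~1.99e5 Pa
    ```

    The common-sense check the paper cares about is exactly this kind of reconciliation: the formula's linear growth with depth should match the everyday experience that pressure on one's ears increases the deeper one dives.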

    Xenogeneic, extracorporeal liver perfusion in primates improves the ratio of branched-chain amino acids to aromatic amino acids (Fischer's ratio)

    In fulminant hepatic failure (FHF), the development of hepatic encephalopathy is associated with grossly abnormal concentrations of plasma amino acids (PAA). Normalization of the ratio of branched-chain amino acids to aromatic amino acids (Fischer's ratio) correlates with clinical improvement. This study evaluated changes in PAA metabolism during 4 h of isolated, normothermic extracorporeal liver perfusion using a newly designed system containing human blood and a rhesus monkey liver. Bile and urea production were within the physiological range. Release of AST, ALT and LDH was minimal. The ratio of branched-chain (valine, leucine, isoleucine) to aromatic (tyrosine, phenylalanine) amino acids increased significantly. These results indicate that a xenogeneic extracorporeal liver perfusion system is capable of significantly increasing Fischer's ratio and may play a role in treating and bridging patients in FHF in the future.
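    Fischer's ratio, as defined in the abstract, is a simple quotient of molar concentrations. The helper below makes that arithmetic explicit; the example concentrations are hypothetical and only illustrate the calculation.

    ```python
    def fischers_ratio(valine, leucine, isoleucine, tyrosine, phenylalanine):
        """Fischer's ratio: molar sum of the branched-chain amino acids
        (valine, leucine, isoleucine) divided by the molar sum of the
        aromatic amino acids (tyrosine, phenylalanine)."""
        return (valine + leucine + isoleucine) / (tyrosine + phenylalanine)

    # Hypothetical plasma concentrations in umol/L:
    print(fischers_ratio(200.0, 120.0, 60.0, 70.0, 55.0))  # 380/125 = 3.04
    ```

    Because encephalopathy is associated with elevated aromatic and depressed branched-chain amino acids, a rising ratio during perfusion is the direction of change the study interprets as beneficial.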

    Kerncraft: A Tool for Analytic Performance Modeling of Loop Kernels

    Achieving optimal program performance requires deep insight into the interaction between hardware and software. For software developers without an in-depth background in computer architecture, understanding and fully utilizing modern architectures is close to impossible. Analytic loop performance modeling is a useful way to understand the relevant bottlenecks of code execution based on simple machine models. The Roofline model and the Execution-Cache-Memory (ECM) model are proven approaches to performance modeling of loop nests. In comparison to the Roofline model, the ECM model can also describe the single-core performance and saturation behavior on a multicore chip. We give an introduction to the Roofline and ECM models, and to stencil performance modeling using layer conditions (LC). We then present Kerncraft, a tool that can automatically construct Roofline and ECM models for loop nests by performing the required code, data transfer, and LC analysis. The layer condition analysis allows prediction of optimal spatial blocking factors for loop nests. Together with the models it enables an ab-initio estimate of the potential benefits of loop blocking optimizations and of useful block sizes. In cases where LC analysis is not easily possible, Kerncraft supports a cache simulator as a fallback option. Using a 25-point long-range stencil we demonstrate the usefulness and predictive power of the Kerncraft tool. Comment: 22 pages, 5 figures.
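    The Roofline model the abstract builds on reduces to one line of arithmetic: predicted performance is the minimum of peak compute throughput and memory bandwidth times arithmetic intensity. A sketch, with hypothetical machine numbers (not Kerncraft's actual output):

    ```python
    def roofline_gflops(peak_gflops, mem_bw_gbs, intensity_flop_per_byte):
        """Roofline performance bound: the lesser of peak floating-point
        throughput and the memory-bandwidth-limited rate
        (bandwidth * arithmetic intensity)."""
        return min(peak_gflops, mem_bw_gbs * intensity_flop_per_byte)

    # Hypothetical machine: 50 GFLOP/s peak, 20 GB/s memory bandwidth.
    print(roofline_gflops(50.0, 20.0, 0.5))  # 10.0 -> memory bound
    print(roofline_gflops(50.0, 20.0, 4.0))  # 50.0 -> compute bound
    ```

    The ECM model refines this picture by accounting for data transfers through each cache level separately, which is why it can also predict single-core performance and multicore saturation; the Roofline bound above only captures the two extreme limits.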