    Detection of HC11N in the Cold Dust Cloud TMC-1

    Two consecutive rotational transitions of the long cyanopolyyne HC11N, J=39-38 and J=38-37, have been detected in the cold dust cloud TMC-1 at the frequencies expected from recent laboratory measurements by Travers et al. (1996), and at approximately the expected intensities. The astronomical lines have a mean radial velocity of 5.8(1) km/s, in good agreement with the shorter cyanopolyynes HC7N and HC9N observed in this very sharp-lined source [5.82(5) and 5.83(5) km/s, respectively]. The column density of HC11N is calculated to be 2.8x10^11 cm^-2. The abundance of the cyanopolyynes decreases smoothly with chain length up to HC11N, the decrement from one to the next being about 6 for the longer carbon chains.

    Metal-semiconductor-metal photodetectors on a GeSn-on-insulator platform for 2 µm applications

    In this work, metal-semiconductor-metal (MSM) photodetectors were demonstrated on the Ge0.91Sn0.09-on-insulator (GeSnOI) platform. The responsivity was 0.24 A/W at a wavelength of 1,600 nm and 0.06 A/W at 2,003 nm. A systematic study reveals that the photodetectors can potentially detect wavelengths beyond 2,200 nm. The dark current density was measured to be 4.6 A/cm2 for GeSnOI waveguide-shaped photodetectors. The 3 dB bandwidth was 1.26 GHz at a wavelength of 1,550 nm and 0.81 GHz at 2,000 nm. This work opens up an opportunity for low-cost 2 µm wavelength photodetection on the GeSn/Ge interface-free GeSnOI platform.
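    The reported responsivities can be translated into external quantum efficiency with the standard relation EQE = R·h·c/(q·λ). A minimal sketch follows; the responsivity values are those quoted in the abstract, while the derived efficiencies are computed here for illustration and are not figures from the paper.

```python
# Convert responsivity (A/W) to external quantum efficiency (EQE),
# i.e. the fraction of incident photons converted to collected electrons.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
Q = 1.602176634e-19  # elementary charge, C

def external_quantum_efficiency(responsivity_a_per_w, wavelength_m):
    """EQE = R * (photon energy) / (elementary charge)."""
    photon_energy = H * C / wavelength_m          # J per photon
    return responsivity_a_per_w * photon_energy / Q

for r, wl in [(0.24, 1600e-9), (0.06, 2003e-9)]:
    print(f"{wl*1e9:.0f} nm: EQE = {external_quantum_efficiency(r, wl):.1%}")
# -> 1600 nm: EQE = 18.6%
# -> 2003 nm: EQE = 3.7%
```

    The drop from roughly 19% to under 4% across the band reflects the falling absorption of the GeSn layer as the photon energy approaches the bandgap.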

    Assessing the relevance of digital elevation models to evaluate glacier mass balance: application to Austre Lovénbreen (Spitsbergen, 79° N)

    The volume variation of a glacier is a key indicator of the long-term and short-term evolution of glacier behaviour. In order to assess the volume evolution of the Austre Lovénbreen (79° N) over the last 47 years, we used multiple historical datasets, complemented with our high-density GPS tracks acquired in 2007 and 2010. The improved altitude resolution of recent measurement techniques, including phase-corrected GPS and LiDAR, reduces the time interval required between datasets used for volume subtraction when computing the mass balance. We estimate the sub-metre elevation accuracy of the most recent measurement techniques to be sufficient to record ice-thickness evolution occurring over a 3-year duration at polar latitudes. The systematic discrepancy between ablation-stake measurements and DEM analysis, widely reported in the literature as well as in the current study, raises new questions concerning the similarity and relationship between these two measurement methods. The use of Digital Elevation Models (DEMs) has been an attractive alternative to the classical in situ measurement techniques based on ablation stakes for estimating glacier area and volume evolution over time. With the availability of historical datasets, whether from ground-based maps, aerial photography or satellite data acquisition, such a glacier volume estimation strategy allows the analysis to be extended beyond the duration of current research programmes. Furthermore, these methods provide a continuous spatial coverage defined by the cell size, whereas interpolations based on a limited number of stakes display large spatial uncertainties. In this document, we focus on estimating the altitude accuracy of various datasets acquired between 1962 and 2010, using techniques ranging from topographic maps to dual-frequency skidoo-tracked GPS receivers and classical aerial and satellite photogrammetry.
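    The DEM-differencing strategy described above reduces to subtracting two co-registered elevation grids and scaling by cell area. A minimal sketch, with an illustrative toy grid rather than the paper's data:

```python
import numpy as np

def volume_change(dem_old, dem_new, cell_size_m):
    """Volume change (m^3) between two co-registered DEMs of equal shape.

    NaN cells (missing in either DEM) are ignored by nansum, since a NaN in
    either input propagates through the subtraction.
    """
    diff = dem_new - dem_old                     # per-cell elevation change, m
    return np.nansum(diff) * cell_size_m ** 2

# Toy example: a uniform 2 m thinning over a 3x3 grid of 10 m cells.
dem_1962 = np.full((3, 3), 100.0)
dem_2010 = np.full((3, 3), 98.0)
dv = volume_change(dem_1962, dem_2010, cell_size_m=10.0)
print(dv)   # -> -1800.0 (m^3 of ice lost)
```

    Converting such an ice-volume change to a water-equivalent mass balance additionally requires an assumed ice density, conventionally around 900 kg/m^3; that conversion is not shown here.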

    Trio: enabling sustainable and scalable outdoor wireless sensor network deployments

    Characterizing the Initial Phase of Epidemic Growth on some Empirical Networks

    A key parameter in models for the spread of infectious diseases is the basic reproduction number R_0, which is the expected number of secondary cases a typical infected primary case infects during its infectious period in a large, mostly susceptible population. In order for this quantity to be meaningful, the initial expected growth of the number of infectious individuals in the large-population limit should be exponential. We investigate to what extent this assumption is valid by performing repeated simulations of epidemics on selected empirical networks, viewing each epidemic as a random process in discrete time. The initial phase of each epidemic is analyzed by fitting the number of infected people at each time step to a generalised growth model, allowing the shape of the growth to be estimated. For reference, similar investigations are performed on some elementary graphs, such as integer lattices in different dimensions and configuration model graphs, for which the early epidemic behaviour is known. We find that for the empirical networks tested in this paper, exponential growth characterizes the early stages of the epidemic, except when the network is restricted by a strong low-dimensional spatial constraint, as is the case for the two-dimensional square lattice. However, on finite integer lattices of sufficiently high dimension, the early development of epidemics shows exponential growth.
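    The growth-shape fitting described above can be sketched with the generalised growth model dC/dt = r·C(t)^p, where p = 1 indicates exponential early growth and p < 1 sub-exponential growth. Taking logs of the per-step increments turns the fit into a linear regression. The synthetic case counts below are an illustrative assumption, not output from the paper's simulations:

```python
import numpy as np

def fit_growth_shape(cases):
    """Estimate (r, p) of dC/dt = r * C**p from cumulative case counts.

    log(dC) ~ log(r) + p * log(C), so a degree-1 polyfit of log increments
    against log counts recovers p as the slope and log(r) as the intercept.
    """
    cases = np.asarray(cases, dtype=float)
    dC = np.diff(cases)               # new infections per time step
    C = cases[:-1]
    keep = (dC > 0) & (C > 0)         # logarithms need positive values
    p, log_r = np.polyfit(np.log(C[keep]), np.log(dC[keep]), 1)
    return np.exp(log_r), p

# Exponential epidemic: C(t) = 5 * 1.2**t, so dC = 0.2 * C exactly.
t = np.arange(30)
r_hat, p_hat = fit_growth_shape(5.0 * 1.2 ** t)
print(round(r_hat, 3), round(p_hat, 3))   # -> 0.2 1.0 (exponential growth)
```

    On a two-dimensional lattice, by contrast, the infection front grows like a circle's perimeter, so dC scales sub-linearly in C and the fitted p falls well below 1.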

    A Comparison of Methods to Communicate Treatment Preferences in Nursing Facilities: Traditional Practices versus the Physician Orders for Life-Sustaining Treatment (POLST) Program

    Background: Traditional methods to communicate life-sustaining treatment preferences are largely ineffective. The Physician Orders for Life-Sustaining Treatment (POLST) Program offers an alternative approach, but comparative data are lacking. Objectives: To evaluate the relationship between communication methods (POLST versus traditional practices) and documentation of life-sustaining treatment orders, symptom assessment and management, and use of life-sustaining treatments. Design: Retrospective observational cohort study conducted between June 2006 and April 2007. Setting: A stratified, random sample of 90 Medicaid-eligible nursing facilities in Oregon, Wisconsin, and West Virginia. Subjects: 1,711 living and deceased nursing facility residents aged 65 and older with a minimum 60-day stay. Measurements: Life-sustaining treatment orders; pain, shortness of breath, and related treatments over a 7-day period; and use of life-sustaining treatments over a 60-day period. Results: POLST users were more likely than non-POLST users to have orders about life-sustaining treatment preferences beyond CPR (98.0% vs. 16.1%, P<.001). There were no differences between POLST users and non-users in symptom assessment or management. POLST users with orders for Comfort Measures Only were less likely to receive medical interventions (e.g., hospitalization) than residents with POLST Full Treatment orders (P=.004), residents with Traditional DNR orders (P<.001), or residents with Traditional Full Code orders (P<.001). Conclusion: POLST users were more likely than non-POLST users to have treatment preferences documented as medical orders, but there were no differences in symptom management or assessment. POLST orders restricting medical interventions were associated with lower use of life-sustaining treatments. Findings suggest the POLST program offers significant advantages over traditional methods of communicating preferences about life-sustaining treatments.

    Integration of Expert Systems and Neural Networks for the Control of Fermentation Processes

    Expert systems and neural networks are new tools for the control of fermentation processes. With expert systems, the fermentation plant and the process itself are modelled via a generalized, qualitative system description based on the experience of human experts. Neural networks and interpolating associative memories, on the other hand, can learn the process behaviour directly from process observation. This paper reports how both control techniques can be combined for purposes such as process supervision, modelling, and optimization of biological plants.

    A Practical Algorithm for General Large Scale Nonlinear Optimization Problems

    We provide an effective and efficient implementation of a sequential quadratic programming (SQP) algorithm for the general large-scale nonlinear programming problem. In this algorithm, the quadratic programming subproblems are solved by an interior point method that can be prematurely halted by a trust region constraint. Numerous computational enhancements to improve the numerical performance are presented, including a dynamic procedure for adjusting the merit function parameter and procedures for adjusting the trust region radius. Numerical results and comparisons are presented.
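    The core building block of any SQP method is the quadratic subproblem solved at each iterate. A bare-bones sketch for the equality-constrained case (min f(x) s.t. c(x) = 0) follows, solving the subproblem's KKT system directly; the paper's algorithm instead solves the subproblem with an interior point method under a trust-region constraint and adds merit-function safeguards, none of which are reproduced here. The test problem is an illustrative assumption, not one of the paper's benchmarks.

```python
import numpy as np

def sqp_equality(x, grad_f, hess_L, c, jac_c, iters=10):
    """Newton-KKT iteration: the SQP step for equality constraints c(x) = 0.

    Each step solves  [H  A^T] [dx ]   [-g]
                      [A   0 ] [lam] = [-c]
    where H is the Lagrangian Hessian, A the constraint Jacobian, g = grad f.
    """
    lam = np.zeros(jac_c(x).shape[0])
    for _ in range(iters):
        H, A, g = hess_L(x, lam), jac_c(x), grad_f(x)
        n, m = len(g), len(lam)
        KKT = np.block([[H, A.T], [A, np.zeros((m, m))]])
        step = np.linalg.solve(KKT, np.concatenate([-g, -c(x)]))
        x, lam = x + step[:n], step[n:]
    return x, lam

# Toy problem: min x1^2 + x2^2  s.t.  x1 + x2 = 1; the solution is (0.5, 0.5).
x_star, lam_star = sqp_equality(
    x=np.zeros(2),
    grad_f=lambda x: 2.0 * x,
    hess_L=lambda x, lam: 2.0 * np.eye(2),
    c=lambda x: np.array([x[0] + x[1] - 1.0]),
    jac_c=lambda x: np.array([[1.0, 1.0]]),
)
print(x_star)   # -> [0.5 0.5]
```

    Because the toy objective is quadratic and the constraint linear, one Newton-KKT step lands exactly on the optimum; for general nonlinear problems the trust region and merit function described in the abstract are what make the iteration globally reliable.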