634 research outputs found

    Design of Bridge Approach Slabs


    Collective oscillations of a trapped Fermi gas near the unitary limit

    We calculate the oscillation frequencies of a trapped Fermi condensate, with particular emphasis on the equation of state of the interacting Fermi system. We confirm Stringari's finding that the frequencies are independent of the interaction in the unitary limit, and we extend the theory away from that limit, where the interaction affects only the frequencies of the compressional modes. Comment: 4 pages; corrected a couple of trivial mistakes in Table II and the related text, and added a reference

    Psychological sources of response effects in self-administered and telephone surveys

    The impact of mode of data collection (self-administered questionnaire vs. telephone interview) on the emergence of response effects and the accuracy of recall from memory was explored in a cross-cultural experiment conducted in the U.S. and the Federal Republic of Germany. As predicted on the basis of psychological considerations, question order effects were obtained under telephone interview conditions but not under self-administered conditions, where the impact of question order is eliminated by the opportunity to browse back and forth through the questionnaire. On the other hand, the impact of the content of related questions was more pronounced under self-administered than under telephone interview conditions, independent of the order in which they were presented. This reflects respondents' differential opportunity to elaborate on related questions under the two administration modes, as well as the necessity of relying on the content of presumably related questions to determine the meaning of ambiguous questions under self-administered conditions. Finally, respondents' recall of the dates of public events was more accurate under self-administered than under telephone interview conditions, reflecting the beneficial effect of having sufficient time to work on the recall task.

    The Effect of Charge Display on Cost of Care and Physician Practice Behaviors: A Systematic Review

    BACKGROUND: While studies examining the effect of charge display on physician decision-making have been published over the last 30 years, no analysis or synthesis of these studies has been conducted. OBJECTIVE: We aimed to determine the type and quality of the charge display studies that have been published, and to synthesize this information in the form of a literature review. METHODS: English-language articles published between 1982 and 2013 were identified using MEDLINE, Web of Knowledge, ABI-Inform, and Academic Search Premier. Article titles, abstracts, and text were reviewed for relevance by two authors. Data were then extracted, synthesized, and analyzed. RESULTS: Seventeen articles were identified, falling into two topic categories: the effect of charge display on radiology and laboratory test ordering, and its effect on medication choice. Seven articles were randomized controlled trials, eight were pre-intervention vs. post-intervention studies, and two had concurrent control and intervention groups but were not randomized. Twelve studies were conducted in a clinical environment, whereas five were survey studies. Of the nine clinically based interventions that examined test ordering, seven showed statistically significant reductions in cost and/or the number of tests ordered. Two of the three clinical studies examining medication expenditures found significant reductions in cost. In the survey studies, physicians consistently chose fewer tests or lower-cost options in the theoretical scenarios presented. CONCLUSIONS: In the majority of studies, charge information changed ordering and prescribing behavior.

    Swain Committee Report

    Letter addressed to the Secretary of the Navy, the Honorable Josephus H. Daniels, from the committee appointed by the President of the Society for the Promotion of Engineering Education to visit the U.S. Naval Academy in Annapolis, MD. The purpose was to evaluate the work of the Post Graduate School. The committee recommended enlarging the enrollment of the Post Graduate School and providing appropriate funding for buildings, equipment, and curricula.

    Book Reviews


    Probabilistic classification of acute myocardial infarction from multiple cardiac markers

    Logistic regression and Gaussian mixture model (GMM) classifiers have been trained to estimate the probability of acute myocardial infarction (AMI) in patients based upon the concentrations of a panel of cardiac markers. The panel consists of two new markers, fatty acid binding protein (FABP) and glycogen phosphorylase BB (GPBB), in addition to the traditional cardiac troponin I (cTnI), creatine kinase MB (CKMB), and myoglobin. The effect of preprocessing the marker concentrations with principal component analysis (PCA) and Fisher discriminant analysis (FDA) was also investigated. The need for classifiers to give an accurate estimate of the probability of AMI is argued, and three categories of performance measure are described, namely discriminatory ability, sharpness, and reliability. Numerical performance measures for each category are given and applied. The optimum classifier, based solely upon the samples taken on admission, was the logistic regression classifier with FDA preprocessing. This gave an accuracy of 0.85 (95% confidence interval: 0.78–0.91) and a normalised Brier score of 0.89. When samples from both admission and a later time, 1–6 h after admission, were included, performance increased significantly, showing that logistic regression classifiers can indeed use the information from the five cardiac markers to accurately and reliably estimate the probability of AMI.
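    The kind of classifier the abstract describes can be sketched in a few lines: a minimal logistic regression fitted by gradient descent, scored with the Brier measure mentioned above. The two-marker toy data below are hypothetical stand-ins for the study's five-marker panel, not the authors' data or code.

```python
import math

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit logistic regression by gradient descent on the log-loss."""
    n_feat = len(xs[0])
    w, b = [0.0] * n_feat, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of the log-loss with respect to z
            for i in range(n_feat):
                w[i] -= lr * g * x[i]
            b -= lr * g
    return w, b

def predict_prob(w, b, x):
    """Predicted probability of the positive class (AMI)."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

def brier_score(probs, ys):
    """Mean squared gap between predicted probability and outcome (0 = perfect)."""
    return sum((p - y) ** 2 for p, y in zip(probs, ys)) / len(ys)

# Hypothetical 'marker concentrations' for non-AMI (0) vs AMI (1) patients.
xs = [[0.1, 0.2], [0.2, 0.1], [0.3, 0.3], [1.5, 1.2], [1.8, 1.6], [2.0, 1.1]]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
probs = [predict_prob(w, b, x) for x in xs]
print([round(p, 2) for p in probs], round(brier_score(probs, ys), 3))
```

    A reliability measure such as the Brier score is what distinguishes "accurate probabilities" from mere classification accuracy, which is the point the abstract argues.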

    Core-Shell Hydrogel Particles Harvest, Concentrate and Preserve Labile Low Abundance Biomarkers

    Background: The blood proteome is thought to represent a rich source of biomarkers for early stage disease detection. Nevertheless, three major challenges have hindered biomarker discovery: a) candidate biomarkers exist at extremely low concentrations in blood; b) high-abundance resident proteins such as albumin mask the rare biomarkers; c) biomarkers are rapidly degraded by endogenous and exogenous proteinases. Methodology and Principal Findings: Hydrogel nanoparticles created with an N-isopropylacrylamide-based core (365 nm) and shell (167 nm) and functionalized with a charge-based bait (acrylic acid) were studied as a technology for addressing all of these biomarker discovery problems, in one step, in solution. These harvesting core-shell nanoparticles are designed to conduct size exclusion and affinity chromatography simultaneously in solution. Platelet derived growth factor (PDGF), a clinically relevant, highly labile, and very low abundance biomarker, was chosen as a model. PDGF spiked into human serum was completely sequestered from its carrier protein albumin, concentrated, and fully preserved by the particles within minutes. Particle-sequestered PDGF was fully protected from exogenously added tryptic degradation. When the nanoparticles were added to a 1 mL dilute solution of PDGF at non-detectable levels (less than 20 picograms per mL), the concentration of the PDGF released from the polymeric matrix of the particles rose into the detection range of ELISA and mass spectrometry. Beyond PDGF, sequestration and protection from degradation were verified for a series of additional very low abundance and very labile cytokines. Conclusions and Significance: We envision the application of harvesting core-shell nanoparticles to whole blood for the concentration and immediate preservation of low-abundance and labile analytes at the time of venipuncture. © 2009 Longo et al.

    Monitoring international migration flows in Europe. Towards a statistical data base combining data from different sources

    The paper reviews techniques developed in demography, geography and statistics that are useful for bridging the gap between available data on international migration flows and the information required for policy making and research. The basic idea of the paper is as follows: to establish a coherent and consistent data base that contains sufficiently detailed, up-to-date and accurate information, data from several sources should be combined. This raises issues of definition and measurement, and of how to properly combine data from different origins. These issues may be tackled more easily if the statistics being compiled are viewed as different outcomes or manifestations of underlying stochastic processes governing migration. The link between the processes and their outcomes is described by models, the parameters of which must be estimated from the available data. That may be done within the context of socio-demographic accounting. The paper discusses the experience of the U.S. Bureau of the Census in combining migration data from several sources. It also summarizes the many efforts in Europe to establish a coherent and consistent data base on international migration. The paper was written at IIASA. It is part of the Migration Estimation Study, a collaborative IIASA-University of Groningen project funded by the Netherlands Organization for Scientific Research (NWO). The project aims at developing techniques to obtain improved estimates of international migration flows by country of origin and country of destination.
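    As a toy illustration of combining measurements of the same flow from different sources, here is a minimal sketch using inverse-variance weighting, the maximum-likelihood combination under an assumed model of independent Gaussian measurement errors. The figures and source descriptions are hypothetical, not taken from the paper, which develops far richer models.

```python
def combine_flow_estimates(estimates):
    """Combine independent (value, variance) estimates of one migration flow
    by inverse-variance weighting; returns the combined value and variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical figures: a sending-country register vs. a receiving-country survey.
flow, var = combine_flow_estimates([(12000.0, 1.0e6), (15000.0, 4.0e6)])
print(round(flow), round(var))
```

    The more precise source dominates the combined estimate, and the combined variance is smaller than either input variance, which is the basic payoff of pooling sources.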

    How Ordinary Elimination Became Gaussian Elimination

    Newton, in notes that he would rather not have seen published, described a process for solving simultaneous equations that later authors applied specifically to linear equations. This method - which Euler did not recommend, which Legendre called "ordinary," and which Gauss called "common" - is now named after Gauss: "Gaussian" elimination. Gauss's name became associated with elimination through the adoption, by professional computers, of a specialized notation that Gauss devised for his own least squares calculations. The notation allowed elimination to be viewed as a sequence of arithmetic operations that were repeatedly optimized for hand computing and eventually were described by matrices. Comment: 56 pages, 21 figures, 1 table
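    As an illustration of elimination viewed as a sequence of arithmetic operations, here is a minimal sketch of the method in Python: forward elimination followed by back substitution. The partial-pivoting step is a modern numerical safeguard, not part of the historical procedure the abstract describes.

```python
def gaussian_elimination(a, b):
    """Solve the linear system a x = b by elimination:
    forward elimination with partial pivoting, then back substitution."""
    n = len(b)
    a = [row[:] for row in a]  # work on copies, leave caller's data intact
    b = b[:]
    for k in range(n):
        # Partial pivoting: bring the row with the largest pivot to row k.
        p = max(range(k, n), key=lambda i: abs(a[i][k]))
        a[k], a[p] = a[p], a[k]
        b[k], b[p] = b[p], b[k]
        # Eliminate the k-th unknown from all rows below row k.
        for i in range(k + 1, n):
            m = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= m * a[k][j]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / a[i][i]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
print(gaussian_elimination([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # → [1.0, 3.0]
```

    Gauss's bracket notation recorded exactly these intermediate multipliers and updated coefficients, which is how the hand procedure was later recognized as matrix factorization.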