    The mass distribution in an assembling super galaxy group at z=0.37

    We present a weak gravitational lensing analysis of the supergroup SG1120-1202, consisting of four distinct X-ray-luminous groups that will merge to form a cluster comparable in mass to Coma at z=0. These groups lie within a projected separation of 1 to 4 Mpc and within Δv = 550 km s^{-1}, and form a unique protocluster in which to study the matter distribution in a coalescing system. Using high-resolution HST/ACS imaging, combined with an extensive spectroscopic and imaging data set, we study the weak gravitational distortion of background galaxy images by the matter distribution in the supergroup. We compare the reconstructed projected density field with the distribution of galaxies and hot X-ray-emitting gas in the system and derive halo parameters for the individual density peaks. We show that the projected mass distribution closely follows the locations of the X-ray peaks and associated brightest group galaxies. One of the groups, which lies at slightly lower redshift (z ≈ 0.35) than the other three (z ≈ 0.37), is X-ray luminous but barely detected in the gravitational lensing signal. The other three groups show a significant detection (up to 5σ in mass), with velocity dispersions between 355^{+55}_{-70} and 530^{+45}_{-55} km s^{-1} and masses between 0.8^{+0.4}_{-0.3} × 10^{14} and 1.6^{+0.5}_{-0.4} × 10^{14} h^{-1} M_⊙, consistent with independent measurements. These groups are associated with peaks in the galaxy and gas density in a relatively straightforward manner. Since the groups show no visible signs of interaction, this supports the picture that we are catching them before they merge into a cluster.
    Comment: 10 pages, 10 figures, accepted for publication by Astronomy & Astrophysics
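
    For orientation, the quoted velocity dispersions and masses are consistent with a simple halo scaling. A minimal Python sketch, assuming a singular isothermal sphere and a 1 Mpc aperture (neither is stated in the abstract, so treat both as illustrative assumptions), shows that the quoted σ and mass are of the same order:

        # Sanity check: SIS enclosed mass from a velocity dispersion.
        # Assumptions: SIS profile and a 1 Mpc aperture; the paper's actual
        # halo model and radius may differ.
        G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
        MPC = 3.086e22  # metres per megaparsec
        MSUN = 1.989e30 # kilograms per solar mass

        def sis_mass(sigma_kms, r_mpc=1.0):
            """Enclosed mass M(<r) = 2 sigma^2 r / G for an SIS, in solar masses."""
            sigma = sigma_kms * 1e3
            return 2 * sigma**2 * (r_mpc * MPC) / G / MSUN

        for sigma in (355, 530):
            print(f"sigma = {sigma} km/s  ->  M(<1 Mpc) ~ {sis_mass(sigma):.1e} M_sun")
        # sigma = 530 km/s gives ~1.3e14 M_sun, the same order as the quoted masses.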

    Reconciling Microeconomic and Macroeconomic Estimates of Price Stickiness

    This paper attempts to reconcile the high degree of price stickiness implied by macroeconomic estimates of a New-Keynesian Phillips Curve (NKPC) with the lower values obtained from surveys of firms' pricing behaviour. This microeconomic evidence also suggests that the frequency with which firms adjust their prices varies across sectors. The paper shows that in the presence of this heterogeneity, estimates of aggregate price stickiness from microeconomic and macroeconomic data should differ. Heterogeneity in firms' pricing decisions, as well as a more realistic production structure, is introduced into an otherwise standard New-Keynesian model. Using a model calibrated with microeconomic pricing survey data for Australia, the paper shows that estimates of the NKPC considerably overstate the true degree of price stickiness and may falsely suggest that some prices are indexed to past inflation. These problems arise because of a type of misspecification and a lack of suitable instruments.
    Keywords: New-Keynesian Phillips Curve; inflation
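
    The aggregation point can be illustrated with a few lines of arithmetic: under heterogeneity, the average price duration across sectors exceeds the duration implied by the average adjustment frequency (Jensen's inequality), so micro and macro summaries of stickiness need not agree. A minimal sketch with illustrative sector frequencies, not the paper's Australian calibration:

        # Illustration of the aggregation bias with heterogeneous price setting.
        # Sector adjustment frequencies are made up for illustration only.
        freqs = [0.9, 0.5, 0.1, 0.05]  # monthly probability of a price change, by sector
        weights = [0.25] * 4           # equal sector weights

        mean_freq = sum(w * f for w, f in zip(weights, freqs))
        duration_from_mean_freq = 1 / mean_freq                     # what one aggregate parameter implies
        mean_duration = sum(w / f for w, f in zip(weights, freqs))  # true average price duration

        print(f"duration implied by mean frequency: {duration_from_mean_freq:.1f} months")
        print(f"true mean duration across sectors:  {mean_duration:.1f} months")
        # Jensen's inequality guarantees mean_duration >= duration_from_mean_freq,
        # so micro and macro measures of stickiness can legitimately disagree.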

    Human Modeling For Ground Processing Human Factors Engineering Analysis

    There have been many advancements and accomplishments over the last few years in the use of human modeling for human factors engineering analysis in the design of spacecraft and launch vehicles. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the different types of human modeling used currently and in the past at Kennedy Space Center (KSC), and to outline plans for human modeling for future spacecraft designs.

    Interactive scheduling of appliance usage in the home

    We address the problem of recommending an appliance usage schedule to the homeowner that balances maximising total savings against maintaining sufficient user convenience. An important challenge within this problem is how to elicit the user's preferences with low intrusiveness, in order to identify new schedules with high cost savings that still lie within the user's comfort zone. To tackle this problem we propose iDR, an interactive system for generating personalised appliance usage scheduling recommendations that maximise savings and convenience with minimal intrusiveness. In particular, our system learns when to stop interacting with the user during the preference elicitation process, in order to keep the bother cost (e.g., the amount of time the user spends, or the cognitive cost of interacting) minimal. We demonstrate through extensive empirical evaluation on real-world data that our approach improves savings by up to 35%, while maintaining a significantly lower bother cost, compared to state-of-the-art benchmarks.
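
    A hedged sketch of the kind of stopping rule described above: keep proposing schedules only while the expected savings gain from one more question exceeds its bother cost. The function names and the constant per-question bother cost are assumptions for illustration, not the published iDR algorithm:

        # Sketch of an elicitation loop that stops when another question is no
        # longer worth asking. Names and the constant bother cost are
        # illustrative assumptions, not the actual iDR system.
        def elicit_schedule(candidates, expected_gain, ask_user, bother_per_question=0.5):
            """candidates: schedules ordered by expected savings (best first).
            expected_gain(s): estimated extra savings if schedule s is accepted.
            ask_user(s): True if the user accepts schedule s."""
            for schedule in candidates:
                # Stop interacting once the expected benefit of one more
                # question drops below its bother cost.
                if expected_gain(schedule) < bother_per_question:
                    return None  # keep the user's current schedule
                if ask_user(schedule):
                    return schedule
            return None

        # Hypothetical usage: the user rejects s1, accepts s2; s3 is never asked.
        best = elicit_schedule(
            candidates=["s1", "s2", "s3"],
            expected_gain={"s1": 3.0, "s2": 1.0, "s3": 0.2}.get,
            ask_user=lambda s: s == "s2",
        )
        print(best)  # -> s2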

    The Path to Efficient Lighting Control in Algae Photobioreactors

    For efficient lighting control in algae photobioreactors, many control parameters have to be tuned to one another. Besides the current light conditions, forecasts of the expected amount of daylight and of energy prices provide the basis for control strategies that remain energy-efficient over the long term in algae cultivation. The key measurand for daylight here is the photosynthetic photon flux density (PPFD). This paper presents three methods for determining the PPFD of daylight spectra with low-cost spectral sensors. The first method estimates the PPFD from the channel sensitivity curves. In the second method, the PPFD is computed from the calculated correlated colour temperature (CCT) and a spectral reconstruction using the CIE daylight model. The third method is based on a regression model for computing the PPFD. It is shown that real daylight spectra deviate too strongly from the CIE daylight model to allow a sufficiently reliable statement about the PPFD. Finally, the robustness of these methods is tested on real measurement data collected with the sensors outdoors under various daylight conditions.
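
    Whichever of the three methods supplies the spectrum, the PPFD itself is the photon flux integrated over the photosynthetically active range of 400 to 700 nm. A minimal sketch of that final step, assuming a spectral irradiance curve on a uniform 1 nm grid; the flat placeholder spectrum stands in for real sensor output:

        # PPFD from a spectral irradiance curve: integrate the photon flux
        # over 400-700 nm. The placeholder spectrum stands in for sensor
        # output or for the CCT-based CIE reconstruction described above.
        import numpy as np

        H = 6.626e-34   # Planck constant, J s
        C = 2.998e8     # speed of light, m/s
        N_A = 6.022e23  # Avogadro constant, 1/mol

        def ppfd(wavelengths_nm, irradiance_w_m2_nm):
            """PPFD in umol m^-2 s^-1 from spectral irradiance (W m^-2 nm^-1),
            assuming a uniform 1 nm wavelength grid."""
            lam_m = wavelengths_nm * 1e-9
            photon_flux = irradiance_w_m2_nm * lam_m / (H * C)  # photons m^-2 s^-1 nm^-1
            par = (wavelengths_nm >= 400) & (wavelengths_nm <= 700)
            return photon_flux[par].sum() / N_A * 1e6           # umol m^-2 s^-1

        lam = np.arange(350.0, 801.0)  # 1 nm grid
        flat = np.ones_like(lam)       # placeholder: flat 1 W m^-2 nm^-1
        print(f"PPFD ~ {ppfd(lam, flat):.0f} umol m^-2 s^-1")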

    Therapeutic turnaround times for common laboratory tests in a tertiary hospital in Kenya

    Access to efficient laboratory services is critical to patient care. Turnaround time (TAT) is one of the most important measures when judging the efficiency of any laboratory and care system. Few studies on TAT exist for inpatient care settings within low- and middle-income countries (LMICs).
    Methods: We evaluated therapeutic TAT for a tertiary hospital in Western Kenya, using a time-motion study focusing specifically on common hematology and biochemistry orders. The aim was to determine significant bottlenecks in diagnostic testing processes at the institution.
    Results: A total of 356 laboratory tests (155 hematology and 201 biochemistry) were fully tracked from the time of ordering to the availability of results to care providers. The total therapeutic TAT for all tests was 21.5 ± 0.249 hours (95% CI). The therapeutic TAT for hematology tests was 20.3 ± 0.331 hours (95% CI), while that for biochemistry tests was 22.2 ± 0.346 hours (95% CI). Printing, sorting and dispatch of the printed results emerged as the most significant bottleneck, accounting for up to 8 hours of delay (hematology: 8.3 ± 1.29 hours (95% CI); biochemistry: 8.5 ± 1.18 hours (95% CI)). The time of test ordering affected TAT, with orders made early in the morning and those made in the afternoon experiencing the longest delays.
    Conclusion: Significant inefficiencies exist at multiple steps in the turnaround times for routine laboratory tests at a large referral hospital within an LMIC setting. Multiple opportunities exist to improve TAT and streamline processes around diagnostic testing in this and other similar settings.
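
    The per-stage figures above are means with 95% confidence half-widths over the tracked tests. A minimal sketch of that computation, with placeholder stage durations rather than the study's data:

        # Per-stage turnaround time with a normal-approximation 95% CI.
        # Stage names mirror the workflow described above; the numbers are
        # placeholders, not the study's measurements.
        import math
        import statistics

        def mean_ci95(hours):
            """Mean and 95% half-width, formatted as 'm +/- hw h'."""
            m = statistics.mean(hours)
            hw = 1.96 * statistics.stdev(hours) / math.sqrt(len(hours))
            return f"{m:.1f} +/- {hw:.2f} h"

        stages = {
            "order -> analysis":   [10.2, 11.5, 9.8, 12.0],
            "analysis -> printed": [2.1, 1.8, 2.4, 2.0],
            "printed -> ward":     [8.1, 8.6, 7.9, 8.4],  # the reported bottleneck
        }
        for name, hours in stages.items():
            print(f"{name:20s} {mean_ci95(hours)}")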