
    THOR 2.0: Major Improvements to the Open-Source General Circulation Model

    THOR is the first open-source general circulation model (GCM) developed from scratch to study the atmospheres and climates of exoplanets, free from Earth- or Solar-System-centric tunings. It solves the general non-hydrostatic Euler equations (instead of the primitive equations) on a sphere using an icosahedral grid. In the current study, we report major upgrades to THOR, building upon the work of Mendonça et al. (2016). First, while the Horizontally Explicit Vertically Implicit (HEVI) integration scheme is the same as that described in Mendonça et al. (2016), we provide a clearer description of the scheme and have improved its implementation in the code. The differences in implementation between the hydrostatic shallow (HSS), quasi-hydrostatic deep (QHD) and non-hydrostatic deep (NHD) treatments are fully detailed. Second, standard physics modules are added: two-stream, double-gray radiative transfer and dry convective adjustment. Third, THOR is tested on additional benchmarks: tidally locked Earth, deep hot Jupiter, acoustic wave, and gravity wave. Fourth, we report that differences between the hydrostatic and non-hydrostatic simulations are negligible in the Earth case but pronounced in the hot Jupiter case. Finally, the effects of the so-called "sponge layer", a form of drag implemented in most GCMs to provide numerical stability, are examined. Overall, these upgrades have improved the flexibility, user-friendliness, and stability of THOR. Comment: 57 pages, 31 figures, revised, accepted for publication in ApJ
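    The dry convective adjustment module mentioned above lends itself to a compact illustration. The following is a minimal sketch of a generic, enthalpy-conserving dry convective adjustment of a single column, assuming a simple pressure grid, a diatomic ideal gas, and hypothetical function and variable names; it is illustrative only and is not THOR's actual implementation.

    import numpy as np

    # Minimal dry convective adjustment sketch (illustrative only, not THOR's code).
    # Layers run from the bottom (index 0) to the top of the column. Wherever
    # potential temperature decreases with height, neighbouring layers are mixed
    # to a common potential temperature while conserving column enthalpy.

    KAPPA = 2.0 / 7.0   # R/cp for a diatomic ideal gas (assumed)
    P_REF = 1.0e5       # reference pressure in Pa (assumed)

    def dry_convective_adjustment(T, p, dp, max_iter=100):
        """Return the temperature profile after dry convective adjustment.

        T  : layer temperatures [K], bottom to top
        p  : layer mid-level pressures [Pa]
        dp : layer pressure thicknesses [Pa] (proportional to layer mass)
        """
        T = T.copy()
        ex = (p / P_REF) ** KAPPA          # Exner-like factor: T = theta * ex
        for _ in range(max_iter):
            theta = T / ex
            unstable = np.where(theta[1:] < theta[:-1])[0]
            if unstable.size == 0:
                break                      # column is statically stable
            i = unstable[0]                # adjust the lowest unstable pair first
            # Common theta that conserves the pair's enthalpy, sum(cp * T * dp).
            theta_mix = (T[i] * dp[i] + T[i + 1] * dp[i + 1]) / (
                ex[i] * dp[i] + ex[i + 1] * dp[i + 1]
            )
            T[i] = theta_mix * ex[i]
            T[i + 1] = theta_mix * ex[i + 1]
        return T

    # Example: a two-layer superadiabatic column is neutralised.
    p = np.array([9.0e4, 7.0e4])
    dp = np.array([2.0e4, 2.0e4])
    T = np.array([300.0, 260.0])           # strongly unstable near the surface
    print(dry_convective_adjustment(T, p, dp))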

    The Role of GIS to Enable Public-Sector Decision Making Under Conditions of Uncertainty

    Uncertainty is inherent in environmental planning and decision making. For example, water managers in arid regions are attuned to the uncertainty of water supply due to prolonged periods of drought. To contend with multiple sources and forms of uncertainty, resource managers implement strategies and tools to aid in the exploration and interpretation of data and scenarios. Various GIS capabilities, such as statistical analysis, modeling, and visualization, are available to decision makers who face the challenge of making decisions under conditions of deep uncertainty. While significant research has led to the inclusion and representation of uncertainty in GIS, existing GIS literature does not address how decision makers implement and utilize GIS as an assistive technology to contend with deep uncertainty. We address this gap through a case study of water managers in the Phoenix Metropolitan Area, examining how they engage with GIS in making decisions and coping with uncertainty. Findings of a qualitative analysis of water managers reveal the need to distinguish between implicit and explicit uncertainty. Implicit uncertainty is linked to the decision-making process, and while understood, it is not displayed or revealed separately from the data. In contrast, explicit uncertainty is conceived as separate from the process and is something that can be described or displayed. Developed from twelve interviews with Phoenix-area water managers in 2005, these distinctions of uncertainty clarify the use of GIS in decision making. Findings show that managers use the products of GIS (e.g., cartographic products) for exploring uncertainty. Uncertainty visualization emerged as a current practice, but definitions of what constitutes such visualizations were not consistent across decision makers. Additionally, uncertainty was a common and sometimes even helpful element of decision making; rather than being a hindrance, it is seen as an essential component of the process. These findings contradict prior research on uncertainty visualization, in which decision makers often expressed discomfort with the presence of uncertainty.

    Exercising in the Fasted State Reduced 24-Hour Energy Intake in Active Male Adults

    The effect of fasting prior to morning exercise on 24-hour energy intake was examined using a randomized, counterbalanced design. Participants (12 active, white males, 20.8±3.0 years old, VO2max: 59.1±5.7 mL/kg/min) fasted (NoBK) or received breakfast (BK) and then ran for 60 minutes at 60% VO2max. All food was weighed and measured for 24 hours. Measures of blood glucose and hunger were collected at 5 time points. Respiratory quotient (RQ) was measured during exercise. Generalized linear mixed models and paired sample t-tests examined differences between the conditions. Total 24-hour (BK: 19172±4542 kJ versus NoBK: 15312±4513 kJ; p<0.001) and evening (BK: 12265±4278 kJ versus NoBK: 10833±4065 kJ; p=0.039) energy intake and RQ (BK: 0.90±0.03 versus NoBK: 0.86±0.03; p<0.001) were significantly higher in BK than NoBK. Blood glucose was significantly higher in BK than NoBK before exercise (5.2±0.7 versus 4.5±0.6 mmol/L; p=0.025). Hunger was significantly lower for BK than NoBK before exercise, after exercise, and before lunch. Blood glucose and hunger were not associated with energy intake. Fasting before morning exercise decreased 24-hour energy intake and increased fat oxidation during exercise. Completing exercise in the morning in the fasted state may have implications for weight management.
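    As a small, hedged illustration of the paired-sample comparison described above, the sketch below runs a paired t-test on two arrays of 24-hour energy intakes; the numbers and sample size are placeholders, not the study's data.

    import numpy as np
    from scipy import stats

    # Hypothetical 24-hour energy intakes (kJ) for the same participants under
    # the breakfast (BK) and fasted (NoBK) conditions -- placeholder values only.
    bk   = np.array([18500, 20100, 17800, 21000, 19300, 18900])
    nobk = np.array([15200, 16800, 14900, 17500, 15600, 15100])

    # Paired-sample t-test: each participant serves as their own control.
    t_stat, p_value = stats.ttest_rel(bk, nobk)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")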

    The THOR+HELIOS general circulation model: multi-wavelength radiative transfer with accurate scattering by clouds/hazes

    General circulation models (GCMs) provide context for interpreting multi-wavelength, multi-phase data of the atmospheres of tidally locked exoplanets. In the current study, the non-hydrostatic THOR GCM is coupled with the HELIOS radiative transfer solver for the first time, supported by an equilibrium chemistry solver (FastChem), an opacity calculator (HELIOS-K) and a Mie scattering code (LX-MIE). To accurately treat the scattering of radiation by medium-sized to large aerosols/condensates, improved two-stream radiative transfer is implemented within a GCM for the first time. Multiple scattering is implemented using a Thomas algorithm formulation of the two-stream flux solutions, which decreases the computational time by about 2 orders of magnitude compared to the iterative method used in past versions of HELIOS. As a case study, we present four GCMs of the hot Jupiter WASP-43b, where we compare the temperature, velocity, entropy, and streamfunction, as well as the synthetic spectra and phase curves, of runs using regular versus improved two-stream radiative transfer and isothermal versus non-isothermal layers. While the global climate is qualitatively robust, the synthetic spectra and phase curves are sensitive to these details. A THOR+HELIOS WASP-43b GCM (horizontal resolution of about 4 degrees on the sphere and with 40 radial points) with multi-wavelength radiative transfer (30 k-table bins) running for 3000 Earth days (864,000 time steps) takes about 19-26 days to complete depending on the type of GPU. Comment: 31 pages, 24 figures, accepted for publication at MNRAS
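    The Thomas algorithm mentioned above is a standard direct solver for tridiagonal linear systems, which is the structure the coupled two-stream flux equations take when each layer exchanges radiation only with its immediate neighbours. The sketch below is a generic implementation under that assumption; the function and variable names are illustrative and it is not the HELIOS code.

    import numpy as np

    def thomas_solve(a, b, c, d):
        """Solve a tridiagonal system A x = d with the Thomas algorithm.

        a : sub-diagonal   (length n, a[0] unused)
        b : main diagonal  (length n)
        c : super-diagonal (length n, c[-1] unused)
        d : right-hand side (length n)
        One forward-elimination sweep plus one back-substitution sweep, i.e.
        O(n) work, which is why a direct solve is so much cheaper than an
        iterative two-stream solution.
        """
        n = len(d)
        cp = np.empty(n)
        dp = np.empty(n)
        cp[0] = c[0] / b[0]
        dp[0] = d[0] / b[0]
        for i in range(1, n):
            denom = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / denom if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
        x = np.empty(n)
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Quick check against a dense solver on a small diagonally dominant system.
    n = 6
    a = np.r_[0.0, np.random.rand(n - 1)]
    b = 2.0 + np.random.rand(n)
    c = np.r_[np.random.rand(n - 1), 0.0]
    d = np.random.rand(n)
    A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
    assert np.allclose(thomas_solve(a, b, c, d), np.linalg.solve(A, d))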

    A Hard X-Ray Compton Source at CBETA

    Inverse Compton scattering (ICS) holds the potential for future high-flux, narrow-bandwidth x-ray sources driven by high-quality, high-repetition-rate electron beams. CBETA, the Cornell-BNL Energy Recovery Linac (ERL) Test Accelerator, is the world’s first superconducting radiofrequency multi-turn ERL, with a maximum energy of 150 MeV, capable of ICS production of x-rays above 400 keV. We present an update on the bypass design and anticipated parameters of a compact ICS source at CBETA. X-ray parameters from the CBETA ICS are compared to those of leading synchrotron radiation facilities, demonstrating that, above a few hundred keV, photon beams produced by ICS outperform those produced by undulators in terms of flux and brilliance.
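    The quoted photon energy follows from standard inverse Compton kinematics: in the Thomson limit, a head-on collision boosts the laser photon energy by roughly a factor of 4γ². The arithmetic below is a back-of-the-envelope check assuming the 150 MeV beam from the abstract and a 1064 nm drive laser; the laser wavelength is an assumption, not a parameter quoted here.

    # Back-of-the-envelope inverse Compton energy check (assumptions noted above).
    E_beam_MeV = 150.0             # CBETA maximum beam energy (from the abstract)
    m_e_MeV    = 0.511             # electron rest mass energy
    E_laser_eV = 1239.84 / 1064.0  # photon energy of an assumed 1064 nm laser [eV]

    gamma = E_beam_MeV / m_e_MeV                    # electron Lorentz factor
    E_x_max_keV = 4 * gamma**2 * E_laser_eV / 1e3   # Thomson-limit maximum [keV]

    print(f"gamma ~ {gamma:.0f}, E_x,max ~ {E_x_max_keV:.0f} keV")
    # -> roughly 400 keV, consistent with the "x-rays above 400 keV" quoted above.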

    Perspectives of staff nurses of the reasons for and the nature of patient-initiated call lights: an exploratory survey study in four USA hospitals

    Background: Little research has been done on patient call light use and staff response time, which have been found to be associated with inpatient falls and satisfaction. Nurses' perspectives may moderate or mediate these relationships. This exploratory study was intended to understand staff perspectives on call lights, staff responsiveness, and the reasons for and the nature of call light use. It also explored differences among hospitals and identified significant predictors of the nature of call light use. Methods: This cross-sectional, multihospital survey study was conducted from September 2008 to January 2009 in four hospitals located in the Midwestern region of the United States. A brief survey was used. All 2309 licensed and unlicensed nursing staff members who provide direct patient care in 27 adult care units were invited to participate. A total of 808 completed surveys were retrieved, for an overall response rate of 35%. SPSS 16.0 for Windows was used. Descriptive and binary logistic regression analyses were conducted. Results: The primary reasons for patient-initiated calls were toileting assistance, pain medication, and intravenous problems, with toileting assistance the leading reason. Each staff member responded to an estimated 6 to 7 calls per hour, and a call was answered within an estimated 4 minutes. 49% of staff perceived that patient-initiated calls mattered to patient safety, 77% agreed that these calls were meaningful, 52% thought that these calls required the attention of nursing staff, and 53% thought that answering calls prevented them from doing the critical aspects of their role. Staff perceptions about the nature of calls varied across hospitals. Junior staff tended to overlook the importance of answering calls. Nurse participants tended to perceive calls as more likely to require nursing staff's attention than nurse aide participants did. Conclusions: If answering calls were a high priority among nursing tasks, staff would perceive calls as being important, requiring nursing staff's attention, and being meaningful, and answering calls would not be perceived as preventing staff from doing the critical aspects of their role. Additional efforts are necessary to reach the ideal, or even a reasonable, level of patient-safety-first practice in current hospital environments.
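    As a hedged illustration of the binary logistic regression analysis mentioned in the Methods, the sketch below fits a logistic model predicting a binary perception (e.g., that calls require nursing staff's attention) from staff role; the data are synthetic placeholders, not the survey responses.

    import numpy as np
    import statsmodels.api as sm

    # Synthetic placeholder data: 1 = nurse, 0 = nurse aide; the outcome is whether
    # the respondent perceived calls as requiring nursing staff's attention.
    rng = np.random.default_rng(0)
    is_nurse = rng.integers(0, 2, size=200)
    outcome = (rng.random(200) < (0.45 + 0.2 * is_nurse)).astype(int)

    X = sm.add_constant(is_nurse)          # intercept + role indicator
    model = sm.Logit(outcome, X).fit(disp=False)
    print(model.params)                    # log-odds: intercept, role effect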

    The Spectral Energy Distribution of Fermi bright blazars

    (Abridged) We have conducted a detailed investigation of the broad-band spectral properties of the gamma-ray selected blazars of the Fermi LAT Bright AGN Sample (LBAS). By combining our accurately estimated Fermi gamma-ray spectra with Swift, radio, infra-red, optical and other hard X-ray/gamma-ray data, collected within three months of the LBAS data-taking period, we were able to assemble high-quality and quasi-simultaneous Spectral Energy Distributions (SEDs) for 48 LBAS blazars. The SEDs of these gamma-ray sources are similar to those of blazars discovered at other wavelengths, clearly showing, in the usual log ν - log νF_ν representation, the typical broad-band spectral signatures normally attributed to a combination of low-energy synchrotron radiation followed by inverse Compton emission of one or more components. We have used these SEDs to characterize the peak intensity of both the low- and the high-energy components. The results have been used to derive empirical relationships that estimate the position of the two peaks from the broad-band colors (i.e. the radio-to-optical and optical-to-X-ray spectral slopes) and from the gamma-ray spectral index. Our data show that the synchrotron peak frequency ν_p^S is positioned between 10^12.5 and 10^14.5 Hz in broad-lined FSRQs and between 10^13 and 10^17 Hz in featureless BL Lacertae objects. We find that the gamma-ray spectral slope is strongly correlated with the synchrotron peak energy and with the X-ray spectral index, as expected at first order in synchrotron - inverse Compton scenarios. However, simple homogeneous, one-zone, Synchrotron Self Compton (SSC) models cannot explain most of our SEDs, especially in the case of FSRQs and low-energy-peaked (LBL) BL Lacs. (...) Comment: 85 pages, 38 figures, submitted to ApJ
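    As a hedged illustration of how an SED peak can be characterized, the sketch below fits a parabola to log νF_ν versus log ν around the maximum of a synthetic SED and reads off the peak frequency from the vertex; this is a common approach for sparsely sampled SEDs, not necessarily the exact procedure used in the paper, and the data points are placeholders.

    import numpy as np

    # Locate an SED peak by fitting a parabola in log nu - log nuFnu space.
    # The "observed" points below are synthetic placeholders, not real data.
    log_nu = np.array([12.0, 12.5, 13.0, 13.5, 14.0, 14.5, 15.0])
    log_nuFnu = (-0.4 * (log_nu - 13.6) ** 2 - 10.5
                 + np.random.normal(0.0, 0.02, log_nu.size))

    # Quadratic fit: log nuFnu ~ a*(log nu)^2 + b*(log nu) + c
    a, b, c = np.polyfit(log_nu, log_nuFnu, 2)
    log_nu_peak = -b / (2.0 * a)            # vertex of the fitted parabola
    print(f"estimated synchrotron peak: 10^{log_nu_peak:.2f} Hz")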

    Tangible Distributed Computer Music for Youth

    Computer music research realizes a vision of performance by means of computational expression, linking body and space to sound and imagery through eclectic forms of sensing and interaction. This vision could dramatically impact computer science education, simultaneously modernizing the field and drawing in diverse new participants. In this article, we describe our work creating an interactive computer music toolkit for kids called BlockyTalky. This toolkit enables users to create networks of sensing devices and synthesizers, and to program the musical and interactive behaviors of these devices. We also describe our work with two middle school teachers to co-design and deploy a curriculum for 11- to 13-year-old students. We draw on work with these students to show how computer music can support learning about computer science concepts and change students’ perceptions of computing. We conclude by outlining some remaining questions around how computer music and computer science may best be linked to provide transformative educational experiences.
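    To give a flavour of the kind of distributed musical behaviour such a toolkit supports, the sketch below sends a single note message from a "sensor" device to a "synthesizer" device over UDP; the message schema, address, and function names are hypothetical and are not BlockyTalky's actual API.

    import json
    import socket

    # Hypothetical sender: a "sensor" device emitting a note event over UDP.
    SYNTH_ADDR = ("192.168.1.42", 9000)    # placeholder address of the synth device

    def send_note(pitch: int, velocity: int, duration_ms: int) -> None:
        """Send one note event as a small JSON datagram."""
        msg = {"type": "note", "pitch": pitch, "velocity": velocity,
               "duration_ms": duration_ms}
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(json.dumps(msg).encode("utf-8"), SYNTH_ADDR)

    # Example: map a sensor reading to a MIDI-style pitch and send it.
    send_note(pitch=60, velocity=100, duration_ms=250)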