5,756 research outputs found

    Walker Community Library: Through Interaction Comes the Spread of Knowledge

    This thesis study explores how the redevelopment of a branch library in a vibrant area of a city can not only strengthen the urban fabric but also breed interaction and facilitate the spread of knowledge, both through discussion and through the collection of information already held within the library. The Walker Community Library, located in Uptown, Minneapolis, MN, will be redeveloped from its underground state to embrace the three-dimensional nature of the city, housing functions both above and below street level. The proposed library will grow from its original size of approximately 20,000 s.f. to almost 30,000 s.f.

    Mirror, Mirror on the Wall: An International Update on the Comparative Performance of American Health Care

    Using data from patient and physician surveys, this report finds that the U.S. healthcare system ranks last or next-to-last when its performance is compared with that of five other nations: Australia, Canada, Germany, New Zealand, and the United Kingdom.

    Barriers to Hospital Electronic Public Health Reporting and Implications for the COVID-19 Pandemic

    We sought to identify barriers to hospital reporting of electronic surveillance data to local, state, and federal public health agencies, and the impact on areas projected to be overwhelmed by the COVID-19 pandemic. Using 2018 American Hospital Association data, we identified barriers to surveillance data reporting and combined this with data on the projected impact of the COVID-19 pandemic on hospital capacity at the hospital referral region level. The most common barrier, reported by 41.2% of all hospitals, was that public health agencies lacked the capacity to electronically receive data. We also identified 31 hospital referral regions in the top quartile of projected bed capacity needed for COVID-19 patients in which over half of the hospitals in the area reported that the relevant public health agency was unable to receive electronic data. Public health agencies' inability to receive electronic data is the most prominent hospital-reported barrier to effective syndromic surveillance. This reflects a policy commitment to investing in information technology for hospitals without a concomitant investment in IT infrastructure for state and local public health agencies.
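
The region-level flagging described above can be illustrated with a small sketch: compute, per hospital referral region (HRR), the share of hospitals reporting that their agency cannot receive electronic data, then flag regions in the top quartile of projected bed need where that share exceeds one half. All records, field names, and values below are invented for illustration; they are not the study's data.

```python
from statistics import quantiles

# Toy stand-ins for the 2018 AHA survey merged with projected COVID-19
# bed need per hospital referral region (illustrative values only).
hospitals = [
    {"hrr": "HRR-1", "agency_cannot_receive": True},
    {"hrr": "HRR-1", "agency_cannot_receive": True},
    {"hrr": "HRR-1", "agency_cannot_receive": False},
    {"hrr": "HRR-2", "agency_cannot_receive": False},
    {"hrr": "HRR-2", "agency_cannot_receive": False},
    {"hrr": "HRR-3", "agency_cannot_receive": True},
    {"hrr": "HRR-3", "agency_cannot_receive": True},
    {"hrr": "HRR-4", "agency_cannot_receive": True},
]
projected_beds = {"HRR-1": 950, "HRR-2": 310, "HRR-3": 620, "HRR-4": 120}

def flagged_regions(hospitals, projected_beds):
    # Share of hospitals per HRR whose public health agency cannot
    # receive electronic data.
    share = {}
    for hrr in projected_beds:
        hs = [h for h in hospitals if h["hrr"] == hrr]
        share[hrr] = sum(h["agency_cannot_receive"] for h in hs) / len(hs)
    # Third quartile of projected bed need marks the "top quartile" regions.
    q3 = quantiles(projected_beds.values(), n=4)[2]
    return [hrr for hrr in projected_beds
            if projected_beds[hrr] >= q3 and share[hrr] > 0.5]
```

With these toy numbers only HRR-1 sits in the top quartile of projected need while more than half of its hospitals report the barrier.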

    Cutting edges at random in large recursive trees

    We comment on old and new results related to the destruction of a random recursive tree (RRT), in which its edges are cut one after the other in a uniform random order. In particular, we study the number of steps needed to isolate or disconnect certain distinguished vertices when the size of the tree tends to infinity. New probabilistic explanations are given in terms of the so-called cut-tree and the tree of component sizes, which both encode different aspects of the destruction process. Finally, we establish the connection to Bernoulli bond percolation on large RRTs and present recent results on the cluster sizes in the supercritical regime. Comment: 29 pages, 3 figures
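
The destruction process is straightforward to simulate. The sketch below (mine, not the paper's) builds a random recursive tree, then repeatedly cuts a uniformly chosen edge of the component still containing the root and discards the detached subtree, counting the cuts needed to isolate the root; this count is known to grow on the order of n/log n.

```python
import random

def random_recursive_tree(n):
    # Vertex 0 is the root; each later vertex attaches to a uniformly
    # chosen earlier vertex.  Edges are (parent, child) with parent < child,
    # and the list comes out sorted by child index.
    return [(random.randrange(i), i) for i in range(1, n)]

def cuts_to_isolate_root(n):
    # Cut a uniform edge of the root's component, discard the detached
    # part, repeat; return the number of cuts until the root is isolated.
    edges = random_recursive_tree(n)
    cuts = 0
    while edges:
        cuts += 1
        _, child = edges.pop(random.randrange(len(edges)))
        doomed = {child}
        kept = []
        for p, c in edges:      # parents precede children in this ordering,
            if p in doomed:     # so one forward pass collects the whole
                doomed.add(c)   # subtree hanging below the cut edge
            else:
                kept.append((p, c))
        edges = kept
    return cuts
```

Every cut removes at least one edge, so the count always lies between 1 and n-1; averaging over many trees shows the slow n/log n growth.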

    Defects and boundary layers in non-Euclidean plates

    We investigate the behavior of non-Euclidean plates with constant negative Gaussian curvature using the F\"oppl-von K\'arm\'an reduced theory of elasticity. Motivated by recent experimental results, we focus on annuli with a periodic profile. We prove rigorous upper and lower bounds for the elastic energy, which scales like the thickness squared. In particular, we show that there are only two types of global minimizers -- deformations that remain flat and saddle-shaped deformations with isolated regions of stretching near the edge of the annulus. We also show that there exist local minimizers with a periodic profile that have additional boundary layers near their lines of inflection. These boundary layers are a new phenomenon in thin elastic sheets and are necessary to regularize jump discontinuities in the azimuthal curvature across lines of inflection. We rigorously derive scaling laws for the width of these boundary layers as a function of the thickness of the sheet.
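
For orientation, a Föppl-von Kármán energy for a non-Euclidean plate is often written in the following form; the precise functional and normalization used in the paper may differ, so this is only a representative sketch.

```latex
E_t[u,w] \;=\; \int_\Omega
  \Bigl|\tfrac12\bigl(\nabla u + \nabla u^{T} + \nabla w \otimes \nabla w\bigr)
        - \varepsilon_g\Bigr|^2 \,\mathrm{d}x
\;+\; t^2 \int_\Omega \bigl|\nabla^2 w\bigr|^2 \,\mathrm{d}x
```

Here $u$ is the in-plane displacement, $w$ the out-of-plane deflection, $t$ the thickness, and $\varepsilon_g$ a target strain encoding the prescribed constant negative Gaussian curvature. Bounds of the type proved in the paper take the form $c\,t^2 \le \min E_t \le C\,t^2$, i.e. the minimal energy is carried at the bending scale.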

    Development and validation of a method to estimate COPD severity in multiple datasets: a retrospective study

    Introduction Outcomes in chronic obstructive pulmonary disease (COPD) such as symptoms, hospitalisations and mortality rise with increasing disease severity. However, the heterogeneity of electronic medical records presents a significant challenge in measuring severity across geographies. We aimed to develop and validate a method to approximate COPD severity using the Global Initiative for Chronic Obstructive Lung Disease (GOLD) 2011 classification scheme, which categorises patients based on forced expiratory volume in 1 s, hospitalisations and the modified Medical Research Council dyspnoea scale or COPD Assessment Test. Methods This analysis was part of a comprehensive retrospective study, including patients sourced from the IQVIA Medical Research Data [IMRD; incorporating data from The Health Improvement Network (THIN), a Cegedim database] and the Clinical Practice Research Datalink (CPRD) in the UK, the Disease Analyzer in Germany and the Longitudinal Patient Data in Italy, France and Australia. Patients in the CPRD with the complete set of information required to calculate GOLD 2011 groups were used to develop the method. Ordinal logistic models at COPD diagnosis and at index (first episode of triple therapy) were then used to validate the method to estimate COPD severity, and this was applied to the full study population to estimate GOLD 2011 categories. Results Overall, 4579 and 12,539 patients were included in the model at COPD diagnosis and at index, respectively. The models correctly classified 74.4% and 75.9% of patients into severe and non-severe categories at COPD diagnosis and at index, respectively. Age, gender, time between diagnosis and start of triple therapy, healthcare resource use, comorbid conditions and prescriptions were included as covariates.
Conclusion This study developed and validated a method to approximate disease severity based on GOLD 2011 categories that can potentially be used in patients without all the key parameters needed for a direct calculation.
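
As a concrete illustration of the prediction step, here is a minimal proportional-odds (ordinal logistic) calculation in plain Python. The covariates, coefficients, and cut-points below are invented for illustration; the study's fitted model is not reported in the abstract.

```python
import math

def ordinal_logit_probs(x, beta, cutpoints):
    # Proportional-odds model: P(Y <= j | x) = sigmoid(theta_j - x . beta),
    # so each class probability is a difference of adjacent cumulative terms.
    eta = sum(xi * bi for xi, bi in zip(x, beta))
    cum = [1.0 / (1.0 + math.exp(-(theta - eta))) for theta in cutpoints]
    cum.append(1.0)  # P(Y <= last category) = 1
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

# Hypothetical inputs: e.g. age, gender, years from diagnosis to triple
# therapy, scored against made-up coefficients and increasing cut-points.
GOLD = ["A", "B", "C", "D"]
x = [68, 1, 2.5]
beta = [0.03, -0.2, 0.15]
cutpoints = [1.0, 2.2, 3.1]
p = ordinal_logit_probs(x, beta, cutpoints)
predicted = GOLD[max(range(len(p)), key=p.__getitem__)]
```

Because the cut-points are increasing, the cumulative probabilities are increasing and every class probability is positive; the four values sum to one by construction.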

    Trigger, an active release experiment that stimulated auroral particle precipitation and wave emissions

    The experiment design, including a description of the diagnostic and chemical release payload, and the general results are given for an auroral process simulation experiment. A drastic increase of the field-aligned charged particle flux was observed over the approximate energy range 10 eV to more than 300 keV, starting about 150 ms after the release and lasting about one second. There is evidence of a second particle burst, starting one second after the release and lasting for tens of seconds, and evidence for a periodic train of particle bursts occurring with a 7.7-second period from 40 to 130 seconds after the release. A transient electric field pulse of 200 mV/m appeared just before the particle flux increase started. Electrostatic wave emissions around 2 kHz, as well as a delayed perturbation of the E-region below the plasma cloud, were also observed. Some of the particle observations are interpreted in terms of field-aligned electrostatic acceleration a few hundred kilometers above the injected plasma cloud. It is suggested that the accelerating electric field was created by an instability driven by field-aligned currents originating in the plasma cloud.

    Solving Lattice QCD systems of equations using mixed precision solvers on GPUs

    Modern graphics hardware is designed for highly parallel numerical tasks and promises significant cost and performance benefits for many scientific applications. One such application is lattice quantum chromodynamics (lattice QCD), where the main computational challenge is to efficiently solve the discretized Dirac equation in the presence of an SU(3) gauge field. Using NVIDIA's CUDA platform we have implemented a Wilson-Dirac sparse matrix-vector product that performs at up to 40 Gflops, 135 Gflops and 212 Gflops for double, single and half precision, respectively, on NVIDIA's GeForce GTX 280 GPU. We have developed a new mixed precision approach for Krylov solvers using reliable updates, which allows for full double precision accuracy while using only single or half precision arithmetic for the bulk of the computation. The resulting BiCGstab and CG solvers run in excess of 100 Gflops and, in terms of iterations until convergence, perform better than the usual defect-correction approach for mixed precision. Comment: 30 pages, 7 figures