Improving time intervals of blood glucose levels and insulin: the best practice bundle
Nature and scope of the project: Proper diabetes management for patients in a hospital setting is critical due to their high propensity for co-morbidities (Kansas Department of Health and Environment, 2019). Dosing insulin based on real-time blood glucose levels helps prevent hypoglycemic and hyperglycemic events (ADA, 2024). The purpose of this project is to decrease the time interval between the point-of-care blood glucose (POC-BG) check and insulin administration for hospitalized patients by following a nursing bundle. The primary outcome variable was insulin administrations ≤15 minutes from the POC-BG. Synthesis and analysis of supporting literature: The American Diabetes Association recommends that hospitalized patients receive their blood glucose check and insulin administration coordinated with their meal when applicable. A nurse-driven bundle was developed to improve the timing between the POC-BG and insulin administration. Project implementation: To increase compliance with the best practice bundle, the existing bundle was modified and reintroduced in December 2023, and POC-BG to insulin administration time intervals were measured in T1 (January 2024) and T2 (February 2024). Evaluation criteria: Pre-intervention compliance rates were measured, interventions were completed, and POC-BG to insulin administration times were evaluated with a goal of 75% being ≤15 minutes in T1 and T2 on two separate units. Outcomes: There were 446 instances measured from 146 patients on the two units over T1 and T2. The CCU increased its compliance from a pre-intervention level of 72.6% to 82.1% in T1 and 86.2% in T2; the primary objective was achieved on this unit. The HNVU improved its compliance from a pre-intervention level of 34.4% to 52.3% in T1 and 43.1% in T2; although compliance improved from baseline, the unit did not achieve the primary objective of 75%. Recommendations: This pilot project will continue at this site by expanding to other units. In the expansion, the data from this project will help guide implementation of the bundle by identifying target areas that are prone to missing goals. Research should also continue, when possible, in evaluating the number of hypoglycemic and hyperglycemic events in the pre- and post-pilot periods to measure whether the bundle affects glucose variability
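The primary outcome reduces to a simple compliance fraction: the share of insulin administrations given within 15 minutes of the preceding POC-BG check. Below is a minimal sketch of that calculation, assuming hypothetical paired timestamp records; the column names and data are illustrative and not taken from the project.

```python
# Minimal sketch: compliance with the <=15-minute POC-BG-to-insulin goal.
# Assumes hypothetical paired timestamps; column names and values are illustrative.
import pandas as pd

records = pd.DataFrame({
    "poc_bg_time": pd.to_datetime([
        "2024-01-05 07:30", "2024-01-05 11:45", "2024-01-06 17:10"]),
    "insulin_time": pd.to_datetime([
        "2024-01-05 07:41", "2024-01-05 12:20", "2024-01-06 17:22"]),
})

# Interval in minutes between the glucose check and insulin administration.
records["interval_min"] = (
    records["insulin_time"] - records["poc_bg_time"]
).dt.total_seconds() / 60

compliant = records["interval_min"] <= 15
compliance_rate = compliant.mean() * 100
print(f"Compliance: {compliance_rate:.1f}% of {len(records)} administrations")
```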
Stabilization, pointing and command control of a balloon-borne 1-meter telescope
A 1-meter balloon-borne telescope has been constructed and flown to observe far-infrared radiation from celestial sources. The attitude control systems must perform to the diffraction limit of the telescope for stabilization and have positioning capability for source acquisition. These and associated systems are discussed in detail, as is the command control of the payload as a whole
Qualitative approaches to research using identity process theory
Qualitative research in psychology has had an interesting history over the last couple of decades in terms of its development, standing and popularity (see Howitt, 2010) but its story varies across domains of the discipline and across geographical locations. Social, health and counselling psychology in Europe (particularly in the UK) have been notably open to qualitative work, whereas, with some exceptions, qualitative approaches to psychological research have struggled to make a major impression in North American psychology generally. In places where it has become relatively established, the story of qualitative approaches to psychological research has not been marked by a cumulative upward trajectory of popularity. Even in the UK, for example, where qualitative methods became an increasingly standard presence in psychology degree programs in the 1990s, there may have been a flattening in popularity in recent years associated with a changing research culture and the ascendancy of cognitive neuroscience as a powerful domain within psychology. In the time since its original, most complete presentation within British social psychology (Breakwell, 1986), Identity Process Theory (IPT) has been employed in both quantitative and qualitative research. In this chapter, we examine the contributions that qualitative research located within an IPT framework can make to the understanding of identity and of the theory itself, while also noting some of the challenges associated with using qualitative approaches within IPT research. In parallel with Vivian L. Vignoles in his chapter on quantitative approaches to IPT research in this volume, we want to make it clear that our chapter should not be seen as suggesting that qualitative research methods are inherently superior to quantitative approaches for studying identity from an IPT perspective. Mindful of critical questions that have been raised about the role and value of qualitative research in the social sciences (e.g. Hammersley, 2008), we advocate a pragmatic approach to methodology. The question is always which research approach – singly or in combination with others – is most useful for achieving the aims and answering the research question of any given study and for maximizing the value of the research, however “research value” might be defined. We agree that some research aims are best suited to quantitative approaches, such as testing theoretical predictions, and other research aims are best achieved through qualitative approaches, such as developing rich, contextualized understandings of phenomena
Analysis and Implementation of Median Type Filters
Median filters are a special class of ranked order filters used for smoothing signals. These filters have achieved success in speech processing, image processing, and other impulsive noise environments where linear filters have proven inadequate. Although the implementation of a median filter requires only a simple digital operation, its properties are not easily analyzed. Even so, a number of properties have been exhibited in the literature. In this thesis, a new tool, known as threshold decomposition, is introduced for the analysis and implementation of median type filters. This decomposition of multi-level signals into sets of binary signals has led to significant theoretical and practical breakthroughs in the area of median filters. A preliminary discussion on using threshold decomposition as an algorithm for a fast and parallel VLSI circuit implementation of ranked filters is also presented. In addition, the theory is developed both for determining the number of signals which are invariant to arbitrary window width median filters when any number of quantization levels are allowed and for counting or estimating the number of passes required to produce a root, i.e., an invariant signal, for binary signals. Finally, the analog median filter is defined and proposed for analysis of the standard discrete median filter in cases with a large sample size or when the associated statistics would be simpler in the continuum
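Threshold decomposition splits an M-level signal into binary signals, median-filters each binary signal, and sums the results, which equals the direct median filter output. A minimal NumPy sketch of that equivalence is below; the window width, edge handling, and test signal are illustrative assumptions, not details from the thesis.

```python
# Minimal sketch of threshold decomposition for a window-width-3 median filter.
# The test signal and edge handling are illustrative, not taken from the thesis.
import numpy as np

def median_filter(x, width=3):
    """Direct running-median filter (edges handled by replication)."""
    half = width // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + width]) for i in range(len(x))])

def threshold_decomposition_median(x, width=3):
    """Median filter computed via threshold decomposition of a multi-level signal."""
    levels = np.arange(1, int(x.max()) + 1)
    # Decompose into binary signals, filter each, and stack the results back up.
    binary = [(x >= t).astype(int) for t in levels]
    filtered = [median_filter(b, width) for b in binary]
    return np.sum(filtered, axis=0)

x = np.array([3, 1, 4, 4, 0, 2, 2, 5, 1, 3])
assert np.array_equal(median_filter(x), threshold_decomposition_median(x))
```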
The impact of neutrino-nucleus interaction modeling on new physics searches
Accurate neutrino-nucleus interaction modeling is an essential requirement for the success of the accelerator-based neutrino program. As no satisfactory description of cross sections exists, experiments tune neutrino-nucleus interactions to data to mitigate mis-modeling. In this work, we study how the interplay between near detector tuning and cross section mis-modeling affects new physics searches. We perform a realistic simulation of neutrino events and closely follow NOvA's tuning, the first such procedure published by a neutrino experiment. We analyze two illustrative new physics scenarios, sterile neutrinos and light neutrinophilic scalars, presenting the relevant experimental signatures and the sensitivity regions with and without tuning. While the tuning does not wash out sterile neutrino oscillation patterns, cross section mis-modeling can bias the experimental sensitivity. In the case of light neutrinophilic scalars, variations in cross section models completely dominate the sensitivity regardless of any tuning. Our findings reveal the critical need to improve our theoretical understanding of neutrino-nucleus interactions, and to estimate the impact of tuning on new physics searches. We urge neutrino experiments to follow NOvA's example and publish the details of their tuning procedure, and to develop strategies to more robustly account for cross section uncertainties, which will expand the scope of their physics program
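As a deliberately simplified illustration of the interplay described above (not the authors' simulation), the toy sketch below tunes a single cross-section normalization to near-detector data generated under a mis-modeled cross section and propagates it to the far detector, where the tuned prediction separates a hypothetical new-physics deficit from the mis-modeling. All numbers and the one-parameter model are hypothetical.

```python
# Toy illustration (hypothetical numbers): near-detector (ND) tuning of a single
# cross-section normalization, propagated to the far detector (FD), where a
# new-physics deficit would otherwise be confused with cross-section mis-modeling.
import numpy as np

rng = np.random.default_rng(0)

true_xsec_norm = 1.10          # "nature": cross section 10% above the nominal model
nominal_nd_prediction = 1.0e5  # predicted ND event count with the nominal cross section
nd_data = rng.poisson(true_xsec_norm * nominal_nd_prediction)

# ND tuning: fit one normalization so the prediction matches the ND data.
tuned_norm = nd_data / nominal_nd_prediction

# FD: suppose new physics depletes the rate by 5% on top of the true cross section.
nominal_fd_prediction = 1.0e3
fd_data = rng.poisson(0.95 * true_xsec_norm * nominal_fd_prediction)

untuned_ratio = fd_data / nominal_fd_prediction               # mixes mis-modeling and new physics
tuned_ratio = fd_data / (tuned_norm * nominal_fd_prediction)  # mis-modeling largely absorbed by the tune

print(f"FD data / untuned prediction: {untuned_ratio:.3f}")
print(f"FD data / tuned prediction:   {tuned_ratio:.3f}  (closer to the true 0.95 deficit)")
```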
Operation of EMEP ‘supersites’ in the United Kingdom. Annual report for 2008.
As part of its commitment to the UN-ECE Convention on Long-range Transboundary Air Pollution the United Kingdom operates two ‘supersites’ reporting data to the Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (EMEP).
This report provides the annual summary for 2008, the second full calendar year of operation of the first EMEP ‘supersite’ to be established in the United Kingdom. Detailed operational reports have been submitted to Defra every 3 months, with unratified data. This annual report contains a summary of the ratified data for 2008.
The EMEP ‘supersite’ is located in central southern Scotland at Auchencorth (3.2°W, 55.8°N), a remote rural moorland site ~20 km south-west of Edinburgh. Monitoring operations started formally on 1 June 2006.
In addition to measurements made specifically under this contract, the Centre for Ecology & Hydrology also acts as local site operator for measurements made under other UK monitoring networks: the Automated Urban and Rural Network (AURN), the UK Eutrophication and Acidification Network (UKEAP), the UK Hydrocarbons Network, and the UK Heavy Metals Rural Network. Some measurements were also made under the auspices of the ‘Air Pollution Deposition Processes’ contract. All these associated networks are funded by Defra.
This report summarises the measurements made between January and December 2008, and presents summary statistics on average concentrations.
The site is dominated by winds from the south-west, but wind direction data highlight potential sources of airborne pollutants (power stations, conurbations).
The average diurnal patterns of gases and particles are consistent with those expected for a remote rural site.
The frequency distributions are presented for data where there was good data capture throughout the whole period. Some components (e.g. black carbon) show log-normal frequency distributions, while other components (e.g. ozone) have more nearly normal frequency distributions.
A case study is presented for a period in June 2008, showing the influence of regional air pollutants at this remote rural site.
All the data reported under the contract are shown graphically in the Appendix
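To illustrate the distributional contrast noted above (log-normal black carbon versus near-normal ozone), here is a brief sketch that fits both candidate distributions to synthetic concentration data and compares their log-likelihoods. The data, parameters, and component names are illustrative assumptions, not the Auchencorth measurements.

```python
# Minimal sketch (synthetic data, not the Auchencorth measurements): comparing a
# log-normal fit for a skewed pollutant series with a normal fit for a more
# symmetric one, mirroring the contrast noted in the report summary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

black_carbon = rng.lognormal(mean=-0.5, sigma=0.8, size=2000)   # skewed, ug/m3
# Clip at a small positive floor so the log-normal fit is well-defined.
ozone = np.clip(rng.normal(loc=60.0, scale=15.0, size=2000), 1e-3, None)  # near-symmetric, ug/m3

# Fit both candidate distributions to each series and compare log-likelihoods.
for name, series in [("black carbon", black_carbon), ("ozone", ozone)]:
    shape, loc, scale = stats.lognorm.fit(series, floc=0)
    ll_lognorm = stats.lognorm.logpdf(series, shape, loc, scale).sum()
    mu, sigma = stats.norm.fit(series)
    ll_norm = stats.norm.logpdf(series, mu, sigma).sum()
    better = "log-normal" if ll_lognorm > ll_norm else "normal"
    print(f"{name}: log-normal LL={ll_lognorm:.0f}, normal LL={ll_norm:.0f} -> {better}")
```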
FC-GAGA: Fully Connected Gated Graph Architecture for Spatio-Temporal Traffic Forecasting
Forecasting of multivariate time-series is an important problem that has applications in traffic management, cellular network configuration, and quantitative finance. A special case of the problem arises when there is a graph available that captures the relationships between the time-series. In this paper we propose a novel learning architecture that achieves performance competitive with or better than the best existing algorithms, without requiring knowledge of the graph. The key element of our proposed architecture is the learnable fully connected hard graph gating mechanism that enables the use of the state-of-the-art and highly computationally efficient fully connected time-series forecasting architecture in traffic forecasting applications. Experimental results for two public traffic network datasets illustrate the value of our approach, and ablation studies confirm the importance of each element of the architecture. The code is available here: https://github.com/boreshkinai/fc-gaga
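The key component described in the abstract is a learnable, fully connected gating mechanism that weights the information each node receives from every other node without a predefined graph. Below is a heavily simplified PyTorch-style sketch of that idea, not the actual FC-GAGA layer from the linked repository; the tensor shapes, gating form, and hyperparameters are illustrative assumptions.

```python
# Heavily simplified sketch of a learnable node-gating idea (not the actual
# FC-GAGA implementation): each node learns an embedding, a ReLU gate derived
# from embedding affinities scales the history it receives from every other
# node, and a fully connected forecaster maps the gated history to forecasts.
import torch
import torch.nn as nn

class GatedGraphForecaster(nn.Module):
    def __init__(self, num_nodes, history_len, horizon, embed_dim=16, hidden=64):
        super().__init__()
        self.node_embed = nn.Parameter(torch.randn(num_nodes, embed_dim))
        # Fully connected forecaster applied to the gated, flattened history.
        self.mlp = nn.Sequential(
            nn.Linear(num_nodes * history_len, hidden),
            nn.ReLU(),
            nn.Linear(hidden, horizon),
        )

    def forward(self, history):
        # history: (batch, num_nodes, history_len)
        affinity = self.node_embed @ self.node_embed.t()       # (N, N) learned affinities
        gate = torch.relu(affinity)                            # hard gate: weak links go to zero
        gated = torch.einsum("ij,bjt->bijt", gate, history)    # per-target gated node histories
        flat = gated.flatten(start_dim=2)                      # (batch, N, N * history_len)
        return self.mlp(flat)                                  # (batch, N, horizon) forecasts

model = GatedGraphForecaster(num_nodes=5, history_len=12, horizon=3)
out = model(torch.randn(8, 5, 12))
print(out.shape)  # torch.Size([8, 5, 3])
```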
Recent Decisions
Comments on recent decisions by Donald L. Very, James Carroll Booth, William E. Coyle, Edward N. Denn, William C. Rindone, Jr., and Karl Jorda