
    Public Involvement in research within care homes: Benefits and challenges in the APPROACH Study

    Public involvement in research (PIR) can improve research design and recruitment. Less is known about how PIR enhances the experience of participation and enriches the data collection process. In a study to evaluate how UK care homes and primary health care services achieve integrated working to promote older people’s health, PIR was integrated throughout the research processes. Objectives This paper aims to present one way in which PIR has been integrated into the design and delivery of a multi-site research study based in care homes. Design A prospective case study design, with an embedded qualitative evaluation of PIR activity. Setting and Participants Data collection was undertaken in six care homes in three sites in England. Six PIR members participated: all had prior personal or work experience in care homes. Data Collection Qualitative data collection involved discussion groups, site-specific meetings to review experiences of participation, benefits and challenges, and completion of structured fieldwork notes after each care home visit. Results PIR members supported recruitment and resident and staff interviews, and participated in data interpretation. Benefits of PIR work included resident engagement that minimised distress and made best use of limited research resources. Challenges concerned communication and scheduling. Researcher support for PIR involvement was resource intensive. Discussion and Conclusions Clearly defined roles, with identified training and support, facilitated involvement in different aspects of the research. Public Involvement in Research members of the research team: Gail Capstick, Marion Cowie, Derek Hope, Rita Hewitt, Alex Mendoza, John Willmott. Also the involvement of Steven Iliffe and Heather Gag

    Reflective Ghost Imaging through Turbulence

    Recent work has indicated that ghost imaging may have applications in standoff sensing. However, most theoretical work has addressed transmission-based ghost imaging. To be a viable remote-sensing system, the ghost imager needs to image rough-surfaced targets in reflection through long, turbulent optical paths. We develop, within a Gaussian-state framework, expressions for the spatial resolution, image contrast, and signal-to-noise ratio of such a system. We consider rough-surfaced targets that create fully developed speckle in their returns, and Kolmogorov-spectrum turbulence that is uniformly distributed along all propagation paths. We address both classical and nonclassical optical sources, as well as a computational ghost imager.
    Comment: 13 pages, 3 figures
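
    As background to the computational ghost imager mentioned above, the following is a minimal sketch of the basic correlation reconstruction behind computational ghost imaging: the target is illuminated with known patterns, a single-pixel "bucket" detector records the total return, and the image is the covariance between bucket values and patterns. The grid size, patterns, and target here are illustrative only; the paper's Gaussian-state analysis of speckle and turbulence is not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy reflective target (1 = bright, 0 = dark) on a 32x32 grid.
        n = 32
        target = np.zeros((n, n))
        target[8:24, 14:18] = 1.0

        n_patterns = 20000
        corr = np.zeros((n, n))
        mean_pattern = np.zeros((n, n))
        mean_bucket = 0.0

        for _ in range(n_patterns):
            # Computational ghost imaging: the illumination pattern is computed
            # (and hence known), so no physical reference arm is needed.
            pattern = rng.random((n, n))
            bucket = np.sum(pattern * target)   # single-pixel detector reading
            corr += bucket * pattern
            mean_pattern += pattern
            mean_bucket += bucket

        # Ghost image = <B * I(x)> - <B><I(x)>; the target structure emerges
        # from the correlation, not from any spatially resolved detection.
        ghost = corr / n_patterns \
            - (mean_bucket / n_patterns) * (mean_pattern / n_patterns)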

    An efficient minimum-distance decoding algorithm for convolutional error-correcting codes

    Minimum-distance decoding of convolutional codes has generally been considered impractical for all but relatively short constraint-length codes, because of the exponential growth in complexity with increasing constraint length. The minimum-distance decoding algorithm proposed in the paper, however, uses a sequential decoding approach to avoid an exponential growth in complexity with increasing constraint length, and also utilises the distance and structural properties of convolutional codes to considerably reduce the amount of tree searching needed to find the minimum-distance path. In this way the algorithm achieves a complexity that does not grow exponentially with increasing constraint length, and is efficient for both long and short constraint-length codes. The algorithm consists of two main processes. Firstly, a direct-mapping scheme, which automatically finds the minimum-distance path in a single mapping operation, is used to eliminate the need for all short back-up tree searches. Secondly, when a longer back-up search is required, an efficient tree-searching scheme is used to minimise the required search effort. The paper describes the complete algorithm and its theoretical basis, and examples of its operation are given.
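
    To make the notion of a minimum-distance path concrete, here is a minimal Viterbi-style decoder for a toy rate-1/2, constraint-length-3 code (generators 7 and 5 octal). It finds the code path at minimum Hamming distance from the received sequence by exhaustive trellis search; the paper's sequential direct-mapping and back-up search schemes are a different, more efficient route to the same path, and are not reproduced here.

        # Toy rate-1/2, constraint-length-3 convolutional code, generators (7, 5) octal.
        G = [0b111, 0b101]

        def encode_step(state, bit):
            reg = (bit << 2) | state                     # 3-bit shift register
            out = tuple(bin(reg & g).count("1") & 1 for g in G)
            return out, reg >> 1                         # output pair, next state

        def min_distance_decode(received):
            """Return the input bits whose code path has minimum Hamming
            distance from `received` (a list of 2-bit output tuples)."""
            INF = float("inf")
            metric = [0, INF, INF, INF]                  # start in state 0
            paths = [[], None, None, None]
            for r in received:
                new_metric = [INF] * 4
                new_paths = [None] * 4
                for s in range(4):
                    if metric[s] == INF:
                        continue
                    for bit in (0, 1):
                        out, ns = encode_step(s, bit)
                        d = metric[s] + sum(a != b for a, b in zip(out, r))
                        if d < new_metric[ns]:
                            new_metric[ns] = d
                            new_paths[ns] = paths[s] + [bit]
                metric, paths = new_metric, new_paths
            best = min(range(4), key=lambda s: metric[s])
            return paths[best]

        # Example: encode 1011, flip one channel bit, and recover the message.
        msg, state, channel = [1, 0, 1, 1], 0, []
        for b in msg:
            out, state = encode_step(state, b)
            channel.append(out)
        channel[1] = (channel[1][0] ^ 1, channel[1][1])  # one bit error
        assert min_distance_decode(channel) == msg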

    An Extinction Study of the Taurus Dark Cloud Complex

    We present a study of the detailed distribution of extinction in a region of the Taurus dark cloud complex. Our study uses new BVR images of the region, spectral classification data for 95 stars, and IRAS Sky Survey Atlas (ISSA) 60 and 100 micron images. We study the extinction of the region in four different ways, and we present the first inter-comparison of all these methods, which are: 1) using the color excess of background stars for which spectral types are known; 2) using the ISSA 60 and 100 micron images; 3) using star counts; and 4) using an optical (V and R) version of the average color excess method used by Lada et al. (1994). We find that all four methods give generally similar results, with important exceptions. To study the structure in the dust distribution, we compare the ISSA extinction and the extinction measured for individual stars. From the comparison, we conclude that in the relatively low extinction regions studied, with 0.9 < A_V < 3.0 mag (away from filamentary dark clouds and IRAS cores), there are no fluctuations in the dust column density greater than 45% (at the 99.7% confidence level) on scales smaller than 0.2 pc. We also report the discovery of a previously unknown stellar cluster behind the Taurus dark cloud near R.A. 4h19m00s, Dec. 27:30:00 (B1950).
    Comment: 49 pages (which include 6 pages of tables and 6 pages of figures)
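
    For reference, method 1 rests on the standard color-excess relation: a star's observed color, compared with the intrinsic color implied by its spectral type, yields the reddening, and a ratio of total-to-selective extinction converts that to A_V. The value R_V ~ 3.1 below is the common diffuse-ISM assumption, not a number taken from this paper:

        E(B-V) = (B-V)_{\mathrm{obs}} - (B-V)_{\mathrm{intrinsic}}
        A_V = R_V \, E(B-V), \qquad R_V \simeq 3.1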

    Analysis of the computational and storage requirements for the minimum-distance decoding of convolutional codes

    In this paper we present analytical results on the computational requirements of the minimum-distance decoding of convolutional codes. By deriving upper bounds on the number of decoding operations required to advance one code segment, we show that many fewer operations are required than in the case of sequential decoding. This implies a significant reduction in the severity of the buffer-overflow problem. We then propose several modifications which could further reduce the computational effort required at long back-up distances. Finally, we investigate the trade-off between coding-parameter selection and storage requirements as an aid to quantitative decoder design. Examples and future aspects are also presented and discussed.
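
    The buffer-overflow problem mentioned above arises because per-segment decoding effort is highly variable while data arrive at a fixed rate. The toy simulation below is a sketch under assumed numbers (a Pareto effort distribution, unit arrival rate, and arbitrary buffer size and decoder speed; none of these come from the paper): capping the worst-case effort per segment, as an upper bound on decoding operations does, sharply reduces how often a finite buffer overflows.

        import numpy as np

        def overflow_fraction(effort_cap, buffer_size=64.0,
                              n_segments=200_000, speed=4.0):
            rng = np.random.default_rng(1)               # same draws for each cap
            # Heavy-tailed per-segment decoding effort (assumed Pareto shape).
            effort = rng.pareto(1.2, n_segments) + 1.0
            effort = np.minimum(effort, effort_cap)      # cap from an upper bound
            backlog, overflows = 0.0, 0
            for e in effort:
                # One segment arrives per unit time; the decoder performs
                # `speed` operations per unit time, so this segment occupies
                # it for e / speed time units.
                backlog = max(backlog + e / speed - 1.0, 0.0)
                if backlog > buffer_size:
                    overflows += 1
                    backlog = buffer_size                # drop the excess
            return overflows / n_segments

        print(overflow_fraction(effort_cap=float("inf")))  # unbounded effort
        print(overflow_fraction(effort_cap=30.0))          # bounded effort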

    Ion-neutral sympathetic cooling in a hybrid linear rf Paul and magneto-optical trap

    Long range polarization forces between ions and neutral atoms result in large elastic scattering cross sections, e.g., 10^6 a.u. for Na+ on Na or Ca+ on Na at cold and ultracold temperatures. This suggests that a hybrid ion-neutral trap should offer a general means for significant sympathetic cooling of atomic or molecular ions. We present SIMION 7.0 simulation results concerning the advantages and limitations of sympathetic cooling within a hybrid trap apparatus, consisting of a linear rf Paul trap concentric with a Na magneto-optical trap (MOT). This paper explores the impact of various heating mechanisms on the hybrid system and how parameters related to the MOT, Paul trap, number of ions, and ion species affect the efficiency of the sympathetic cooling.
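
    The large cross sections quoted above follow from the textbook charge-induced-dipole (Langevin) picture, a standard result not taken from this paper. In Gaussian units, an ion of charge q polarizes a neutral of polarizability alpha, and any collision with energy E that surmounts the centrifugal barrier spirals inward:

        V(r) = -\frac{\alpha q^2}{2 r^4}
        \sigma_L(E) = \pi \sqrt{\frac{2 \alpha q^2}{E}}, \qquad
        k_L = \sigma_L v = 2\pi q \sqrt{\frac{\alpha}{\mu}}

    Here mu is the ion-neutral reduced mass; the capture rate k_L is independent of collision energy, while the cross section grows as the energy falls, which is why cold and ultracold collisions are so strongly affected.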

    Proton-neutron pairing in the deformed BCS approach

    We examine isovector and isoscalar proton-neutron pairing correlations for the ground state of even-even Ge isotopes with mass number A=64-76 within the deformed BCS approach. For N=Z 64Ge, a BCS solution with only T=0 proton-neutron pairs is found. For the other nuclear systems (N>Z), a coexistence of T=0 and T=1 pairs in the BCS wave function is observed. The problem of fixing the strengths of the isoscalar and isovector pairing interactions is addressed. The dependence of the number of like and unlike pairs in the BCS ground state on the difference between the numbers of neutrons and protons is discussed. We find that for nuclei with N much larger than Z the effect of proton-neutron pairing is small but not negligible.
    Comment: 24 pages, 6 figures
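
    Schematically (a generic form used in proton-neutron BCS treatments, not the paper's specific ansatz), the coexistence of pair types can be pictured in a generalized BCS state in which each level k may carry a proton pair, a neutron pair, or a proton-neutron pair:

        |\mathrm{BCS}\rangle \propto \prod_{k>0} \Big( u_k
            + v_k^{pp}\, p_k^\dagger p_{\bar k}^\dagger
            + v_k^{nn}\, n_k^\dagger n_{\bar k}^\dagger
            + v_k^{pn}\, p_k^\dagger n_{\bar k}^\dagger
            + v_k^{np}\, n_k^\dagger p_{\bar k}^\dagger \Big) |0\rangle

    The T=1 and T=0 proton-neutron pairs correspond to the symmetric and antisymmetric combinations of the v^{pn} and v^{np} amplitudes, which is why both isospin channels can coexist in a single BCS wave function once N differs from Z.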

    On the Relationship between Resolution Enhancement and Multiphoton Absorption Rate in Quantum Lithography

    The proposal of quantum lithography [Boto et al., Phys. Rev. Lett. 85, 2733 (2000)] is studied via a rigorous formalism. It is shown that, contrary to Boto et al.'s heuristic claim, the multiphoton absorption rate of a "NOON" quantum state is actually lower than that of a classical state with otherwise identical parameters. The proof-of-concept experiment of quantum lithography [D'Angelo et al., Phys. Rev. Lett. 87, 013602 (2001)] is also analyzed in terms of the proposed formalism, and the experiment is shown to have a reduced multiphoton absorption rate in order to emulate quantum lithography accurately. Finally, quantum lithography by the use of a jointly Gaussian quantum state of light is investigated, in order to illustrate the trade-off between resolution enhancement and multiphoton absorption rate.
    Comment: 14 pages, 7 figures, submitted, v2: rewritten in response to referees' comments, v3: rewritten and extended, v4: accepted by Physical Review
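
    For context, the resolution enhancement at issue comes from the standard NOON-state argument (as in Boto et al.): an N-photon path-entangled state accumulates phase at N times the single-photon rate, so the N-photon absorption pattern oscillates N times faster than classical interference fringes:

        |\mathrm{NOON}\rangle = \frac{1}{\sqrt{2}} \big( |N,0\rangle + |0,N\rangle \big)
        \quad\Rightarrow\quad
        \text{N-photon absorption} \propto 1 + \cos(N\varphi)

    This shrinks the effective fringe period from lambda/2 to lambda/(2N); the paper's point is that the rate at which such N-photon absorption events occur is nevertheless reduced relative to the classical case.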