3,475 research outputs found

    Long-term use of motion-based video games in care home settings

    Recent research suggests that motion-based video games have the potential to provide both mental and physical stimulation for older adults in residential care. However, little research has explored the practical challenges and opportunities that arise from integrating these games within existing schedules of activities in these contexts. In our work, we report on a qualitative enquiry conducted over a three-month period at two long-term care facilities. Findings suggest that older adults enjoyed playing video games, and that games can be a valuable means of re-introducing challenge in late life, but that age-related changes and impairment can influence people’s ability to engage with games in a group setting. We outline core challenges in designing for the care context and discuss implications of our work regarding the suitability of games as a self-directed leisure activity.

    An analysis of the return on investment of Navy Enterprise Resource Planning as implemented Navy-wide FY04-FY15

    MBA Professional Report. Since 2003, the United States Navy has invested hundreds of millions of dollars in the Enterprise Resource Planning (ERP) Program. ERP evolved from four pilot programs into a single solution. Furthermore, the Navy has invested approximately $2 billion in ERP implementation and developed several programs to streamline its financial reporting practices. This thesis project analyzes the evolution and development of ERP, identifies the Navy's projections for ERP, and calculates the costs and benefits of executing ERP between FY04 and FY15. We compare the return on investment (ROI) of Navy ERP with the ROI from ERP implementations in the private sector, with the objective of understanding how the Navy's ROI compares to that achieved by private-sector adopters.
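
    To make the kind of calculation concrete, a return-on-investment comparison of this sort reduces to discounted net benefits divided by discounted costs. The sketch below uses purely hypothetical cost and benefit streams and placeholder function names (npv, roi); it is not the thesis's data or method, only an illustration of the arithmetic.

    # Hypothetical sketch of an ROI comparison; the cost/benefit figures below are
    # illustrative placeholders, not values from the Navy ERP analysis.

    def npv(cash_flows, rate):
        """Discount a list of yearly cash flows back to present value."""
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

    def roi(benefits, costs, rate=0.05):
        """ROI = (discounted benefits - discounted costs) / discounted costs."""
        b, c = npv(benefits, rate), npv(costs, rate)
        return (b - c) / c

    # Illustrative FY04-FY15 streams (millions of dollars), twelve entries each.
    navy_costs    = [250, 220, 200, 180, 170, 160, 150, 150, 140, 140, 130, 130]
    navy_benefits = [  0,  20,  60, 110, 160, 200, 230, 250, 260, 270, 280, 290]

    print(f"ROI with hypothetical inputs: {roi(navy_benefits, navy_costs):.1%}")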

    Subduction Duration and Slab Dip

    The dip angles of slabs are among the clearest characteristics of subduction zones, but the factors that control them remain obscure. Here, slab dip angles and subduction parameters, including subduction duration, the nature of the overriding plate, slab age, and convergence rate, are determined for 153 transects along subduction zones for the present day. We present a comprehensive tabulation of subduction duration based on isotopic ages of arc initiation and stratigraphic, structural, plate tectonic and seismic indicators of subduction initiation. We present two ages for subduction zones, a long-term age and a reinitiation age. Using cross correlation and multivariate regression, we find that (1) subduction duration is the primary parameter controlling slab dips, with slabs tending to have shallower dips at subduction zones that have been in existence longer; (2) the long-term age of subduction duration better explains variation of shallow dip than reinitiation age; (3) overriding plate nature could influence shallow dip angle, where slabs below continents tend to have shallower dips; (4) slab age contributes to slab dip, with younger slabs having steeper shallow dips; and (5) the relations between slab dip and subduction parameters are depth-dependent, where the ability of subduction duration and overriding plate nature to explain observed variation decreases with depth. The analysis emphasizes the importance of subduction history and the long-term regional state of a subduction zone in determining slab dip and is consistent with mechanical models of subduction.
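
    As an illustration of the kind of multivariate regression described above, the sketch below regresses shallow slab dip on subduction duration, slab age, overriding plate nature and convergence rate. The variable names and synthetic values are assumptions for demonstration only, not the paper's 153-transect dataset.

    # Illustrative multivariate regression of shallow slab dip on subduction
    # parameters, using synthetic data with trends matching the reported findings.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 153  # number of transects in the study

    duration    = rng.uniform(5, 180, n)    # subduction duration (Myr)
    slab_age    = rng.uniform(10, 150, n)   # slab age at trench (Myr)
    continental = rng.integers(0, 2, n)     # 1 = continental overriding plate
    conv_rate   = rng.uniform(20, 100, n)   # convergence rate (mm/yr)

    # Synthetic dips: longer-lived zones and continental upper plates give
    # shallower dips; younger slabs give steeper dips (assumed coefficients).
    dip = 45 - 0.10 * duration - 5 * continental - 0.05 * slab_age + rng.normal(0, 5, n)

    X = np.column_stack([np.ones(n), duration, slab_age, continental, conv_rate])
    coef, *_ = np.linalg.lstsq(X, dip, rcond=None)
    for name, c in zip(["intercept", "duration", "slab_age", "continental", "conv_rate"], coef):
        print(f"{name:12s} {c:8.3f}")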

    Inducer dynamics full-flow, full-admission hydraulic turbine drive Interim report for tasks 1, 2, and 3

    Hydrodynamical and mechanical design layout for a two-speed hydraulic turbine inducer, computer simulation of pumping system and test facility performance, and study of a demonstration unit.

    Tests of Bayesian Model Selection Techniques for Gravitational Wave Astronomy

    The analysis of gravitational wave data involves many model selection problems. The most important example is the detection problem of selecting between the data being consistent with instrument noise alone, or with instrument noise and a gravitational wave signal. The analysis of data from ground-based gravitational wave detectors is mostly conducted using classical statistics, and methods such as the Neyman-Pearson criterion are used for model selection. Future space-based detectors, such as the Laser Interferometer Space Antenna (LISA), are expected to produce rich data streams containing the signals from many millions of sources. Determining the number of sources that are resolvable, and the most appropriate description of each source, poses a challenging model selection problem that may best be addressed in a Bayesian framework. An important class of LISA sources are the millions of low-mass binary systems within our own galaxy, tens of thousands of which will be detectable. Not only is the number of sources unknown, but so is the number of parameters required to model the waveforms. For example, a significant subset of the resolvable galactic binaries will exhibit orbital frequency evolution, while a smaller number will have measurable eccentricity. In the Bayesian approach to model selection one needs to compute the Bayes factor between competing models. Here we explore various methods for computing Bayes factors in the context of determining which galactic binaries have measurable frequency evolution. The methods explored include a Reverse Jump Markov Chain Monte Carlo (RJMCMC) algorithm, Savage-Dickey density ratios, the Schwarz-Bayes Information Criterion (BIC), and the Laplace approximation to the model evidence. We find good agreement between all of the approaches.
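
    As a toy illustration of one of the listed techniques (the Schwarz/BIC approximation to a Bayes factor), the sketch below compares nested models with and without an extra "evolution" term on synthetic data. It is an assumed example with made-up data, not the paper's analysis of galactic binary waveforms.

    # Minimal sketch of a BIC-based Bayes factor for a nested model comparison,
    # analogous to testing whether a frequency-evolution term is supported.
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 200)
    sigma = 0.5
    # Synthetic signal with a small quadratic "evolution"-like term plus noise.
    y = 1.0 + 2.0 * t + 0.8 * t**2 + rng.normal(0, sigma, t.size)

    def fit_bic(design):
        """Least-squares fit; return the BIC assuming known Gaussian noise."""
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        resid = y - design @ beta
        loglike = -0.5 * np.sum(resid**2) / sigma**2
        k = design.shape[1]
        return -2 * loglike + k * np.log(t.size)

    bic0 = fit_bic(np.column_stack([np.ones_like(t), t]))        # no evolution term
    bic1 = fit_bic(np.column_stack([np.ones_like(t), t, t**2]))  # with evolution term

    # ln(Bayes factor) of the richer model over the simpler one, BIC approximation.
    print("approx ln BF =", 0.5 * (bic0 - bic1))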

    Development of European standards for evaluative reporting in forensic science : The gap between intentions and perceptions

    Criminal justice authorities of EU countries currently engage in dialogue and action to build a common area of justice and to help increase mutual trust in judicial systems across Europe. This includes, for example, the strengthening of procedural safeguards for citizens in criminal proceedings by promoting principles such as equality of arms. Improving the smooth functioning of judicial processes is also pursued through the work of expert groups in the field of forensic science, such as the working parties under the auspices of the European Network of Forensic Science Institutes (ENFSI). This network aims to share knowledge, exchange experiences and come to mutual agreements in matters concerning forensic science practice, among them the interpretation of results of forensic examinations. For example, through its Monopoly Programmes (financially supported by the European Commission), ENFSI has funded a series of projects under the general theme ‘Strengthening the Evaluation of Forensic Results across Europe’. Although these initiatives reflect a strong commitment to mutual understanding on general principles of forensic interpretation, the development of standards for evaluation and reporting, including roadmaps for implementation within the ENFSI community, is fraught with conceptual and practical hurdles. In particular, experience from consultations with forensic science practitioners shows that there is a considerable gap between the intention of a harmonised view on principles of forensic interpretation and the way in which efforts towards such a common understanding are perceived in the community. In this paper, we review and discuss several recurrently raised concerns. We acknowledge practical constraints such as limited resources for training and education, but we also argue that addressing topics in forensic interpretation now is of vital importance, because forensic science continues to be challenged by proactive participants in the legal process who tend to become more demanding and less forgiving.

    Geometric View of Measurement Errors

    The slope of the best-fit line from minimizing the sum of the squared oblique errors is the root of a polynomial of degree four. This geometric view of measurement errors is used to give insight into the performance of various slope estimators for the measurement error model, including an adjusted fourth-moment estimator introduced by Gillard and Iles (2005) to remove the jump discontinuity in the estimator of Copas (1972). The polynomial of degree four is associated with a minimum deviation estimator. A simulation study compares these estimators, showing improvement in bias and mean squared error.
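
    For readers unfamiliar with the setting, the sketch below illustrates the measurement error (errors-in-variables) problem the paper addresses: naive least squares attenuates the slope, whereas a standard Deming-type moment estimator (with an assumed error-variance ratio) recovers it. This is a textbook estimator used only for illustration, not the adjusted fourth-moment or minimum-deviation estimators studied in the paper.

    # Errors-in-variables illustration with simulated data: OLS is attenuated,
    # the Deming estimator (known error-variance ratio) recovers the true slope.
    import numpy as np

    rng = np.random.default_rng(2)
    n, true_slope = 2000, 1.5
    xi = rng.normal(0, 1, n)              # latent true predictor
    x = xi + rng.normal(0, 0.5, n)        # observed predictor with measurement error
    y = true_slope * xi + rng.normal(0, 0.5, n)

    sxx, syy = np.var(x), np.var(y)
    sxy = np.cov(x, y, bias=True)[0, 1]
    lam = 1.0                              # assumed ratio of error variances
    deming = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy**2)) / (2 * sxy)

    print("OLS slope   :", sxy / sxx)      # biased toward zero (attenuation)
    print("Deming slope:", deming)         # approximately recovers true_slope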

    Overeducation across British regions

    This paper analyses levels of over-education and wage returns to education for males across eleven regions of the UK using Labour Force Survey data. Significant differences are found in the probability of being over-educated across regions; differences are also found in the return to the ‘correct’ level of education in each region, in each case associated with the flexibility of movement between and into particular regions, which determines the ease of job matching. Furthermore, evidence is found that, after controlling for the level of education acquired, there exists a premium to the ‘correct’ level of education, which varies across UK regions.
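
    For context, studies of over-education commonly estimate an ORU (over-, required-, under-education) wage equation. The sketch below illustrates that standard specification with simulated data and assumed coefficients; it is not the paper's exact model or its Labour Force Survey estimates.

    # ORU wage equation on simulated data: log wage regressed on required years
    # of education plus surplus (over) and deficit (under) years.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 5000
    required = rng.integers(10, 17, n).astype(float)   # years required by the job
    attained = required + rng.integers(-3, 4, n)       # years actually acquired
    over  = np.clip(attained - required, 0, None)
    under = np.clip(required - attained, 0, None)

    # Assumed returns consistent with the usual finding: surplus years pay less
    # than required years, and deficit years carry a penalty.
    log_wage = 1.0 + 0.08 * required + 0.04 * over - 0.03 * under + rng.normal(0, 0.2, n)

    X = np.column_stack([np.ones(n), required, over, under])
    beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
    print("returns to required / over / under years:", np.round(beta[1:], 3))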