
    Power and Sample Size Estimation for Nonparametric Composite Endpoints: Practical Implementation using Data Simulations

    Composite endpoints are a popular outcome in controlled studies. However, the required sample size is not easily obtained because of the assortment of outcomes, the correlations between them, and the way in which the composite is constructed; data simulations are therefore required. A macro is developed that enables sample size and power estimation.
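
    A minimal sketch of the simulation approach described above, assuming a rank-based composite analysed with a Mann-Whitney test; the paper's own macro is not reproduced here, and the effect sizes, correlation, and composite construction below are illustrative assumptions only.

        # Illustrative only: simulation-based power estimation for a two-component,
        # rank-based composite endpoint. Not the paper's macro; all settings are assumed.
        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(2024)

        def simulated_power(n_per_arm, delta=(0.3, 0.2), rho=0.4, n_sims=2000, alpha=0.05):
            """Estimate power for a nonparametric composite via Monte Carlo simulation."""
            cov = np.array([[1.0, rho], [rho, 1.0]])      # correlation between the two outcomes
            rejections = 0
            for _ in range(n_sims):
                control = rng.multivariate_normal([0.0, 0.0], cov, size=n_per_arm)
                treated = rng.multivariate_normal(list(delta), cov, size=n_per_arm)
                pooled = np.vstack([control, treated])
                ranks = pooled.argsort(axis=0).argsort(axis=0)   # rank each outcome across both arms
                composite = ranks.mean(axis=1)                   # composite = mean of component ranks
                _, p = mannwhitneyu(composite[n_per_arm:], composite[:n_per_arm],
                                    alternative="two-sided")
                rejections += p < alpha
            return rejections / n_sims

        for n in (50, 75, 100, 150):                             # crude sample-size search
            print(n, round(simulated_power(n), 3))

    Increasing n_sims tightens the Monte Carlo error on each power estimate at the cost of run time.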

    The combined effects of vertical and horizontal shear instabilities

    Shear instabilities can be the source of significant amounts of turbulent mixing in stellar radiative zones. Past attempts at modeling their effects (either theoretically or using numerical simulations) have focused on idealized geometries where the shear is either purely vertical or purely horizontal. In stars, however, the shear can have arbitrary directions with respect to gravity. In this work, we use direct numerical simulations to investigate the nonlinear saturation of shear instabilities in a stably stratified fluid, where the shear is sinusoidal in the horizontal direction, and either constant or sinusoidal in the vertical direction. We find that, in the parameter regime studied here (non-diffusive, fully turbulent flow), the mean vertical shear does not play any role in controlling the dynamics of the resulting turbulence unless its Richardson number is smaller than one (approximately). As most stellar radiative regions have a Richardson number much greater than one, our result implies that the vertical shear can essentially be ignored in the computation of the vertical mixing coefficient associated with shear instabilities for the purpose of stellar evolution calculations, even when it is much larger than the horizontal shear (as in the solar tachocline, for instance). Comment: 26 pages, 8 figures, resubmitted to Ap
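
    For reference, the Richardson number quoted above is the standard gradient Richardson number comparing the stabilizing stratification against the destabilizing vertical shear; the definition below is textbook background rather than a formula reproduced from the paper.

        % Gradient Richardson number for a mean vertical shear dU/dz in a fluid with
        % buoyancy (Brunt-Vaisala) frequency N (standard definition).
        \[
          \mathrm{Ri} \;=\; \frac{N^{2}}{\left(\partial U/\partial z\right)^{2}},
          \qquad
          N^{2} \;=\; -\,\frac{g}{\rho_{0}}\,\frac{\partial \rho}{\partial z}.
        \]
        % Ri >~ 1: stratification suppresses the vertical-shear instability (the regime
        % of most stellar radiative zones); Ri <~ 1: the mean vertical shear matters.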

    Whole family-based physical activity promotion intervention: the Families Reporting Every Step to Health pilot randomised controlled trial protocol

    Introduction: Family-based physical activity (PA) interventions present a promising avenue to promote children’s activity; however, high-quality experimental research is lacking. This paper describes the protocol for the FRESH (Families Reporting Every Step to Health) pilot trial, a child-led family-based PA intervention delivered online. Methods and analysis: FRESH is a three-armed, parallel-group, randomised controlled pilot trial using a 1:1:1 allocation ratio with follow-up assessments at 8 and 52 weeks post-baseline. Families will be eligible if at least one child in school Years 3-6 (aged 7-11 years) and at least one adult responsible for that child are willing to participate. Family members can take part in the intervention irrespective of their participation in the accompanying evaluation and vice versa. Following baseline assessment, families will be randomly allocated to one of three arms: (1) FRESH, (2) pedometer-only, or (3) no-intervention control. All family members in the pedometer-only and FRESH arms receive pedometers and generic PA promotion information. FRESH families additionally receive access to the intervention website, allowing participants to select step challenges to ‘travel’ to target cities around the world, log steps, and track progress as they virtually globetrot. Control families will receive no treatment. All family members will be eligible to participate in the evaluation with two follow-ups (8 and 52 weeks). Physical (e.g., fitness, blood pressure), psychosocial (e.g., social support), and behavioural (e.g., objectively measured family PA) measures will be collected at each time point. At 8-week follow-up, a mixed-methods process evaluation will be conducted (questionnaires and family focus groups) assessing acceptability of the intervention and evaluation. FRESH families’ website engagement will also be explored. Ethics and dissemination: This study received ethical approval from the Ethics Committee for the School of the Humanities and Social Sciences at the University of Cambridge. Findings will be disseminated via peer-reviewed publications and conferences, and to participating families.

    Symmetry improvement of 3PI effective actions for O(N) scalar field theory

    [Abridged] n-Particle Irreducible Effective Actions (nPIEA) are a powerful tool for extracting non-perturbative and non-equilibrium physics from quantum field theories. Unfortunately, practical truncations of nPIEA can unphysically violate symmetries. Pilaftsis and Teresi (PT) addressed this by introducing a "symmetry improvement" scheme in the context of the 2PIEA for an O(2) scalar theory, ensuring that the Goldstone boson is massless in the broken symmetry phase [A. Pilaftsis and D. Teresi, Nuc. Phys. B 874, 2 (2013), pp. 594--619]. We extend this by introducing a symmetry improved 3PIEA for O(N) theories, for which the basic variables are the 1-, 2- and 3-point correlation functions. This requires the imposition of a Ward identity involving the 3-point function. The method leads to an infinity of physically distinct schemes, though an analogue of d'Alembert's principle is used to single out a unique scheme. The standard equivalence hierarchy of nPIEA no longer holds with symmetry improvement and we investigate the difference between the symmetry improved 3PIEA and 2PIEA. We present renormalized equations of motion and counter-terms for 2 and 3 loop truncations of the effective action, leaving their numerical solution to future work. We solve the Hartree-Fock approximation and find that our method achieves a middle ground between the unimproved 2PIEA and PT methods. The phase transition predicted by our method is weakly first order and the Goldstone theorem is satisfied. We also show that, in contrast to PT, the symmetry improved 3PIEA at 2 loops does not predict the correct Higgs decay rate, but does at 3 loops. These results suggest that symmetry improvement should not be applied to nPIEA truncated to fewer than n loops. We also show that symmetry improvement is compatible with the Coleman-Mermin-Wagner theorem, a check on the consistency of the formalism. Comment: 27 pages, 15 figures, 2 supplemental Mathematica notebooks. REVTeX 4.1 with amsmath. Updated with minor corrections. Accepted for publication in Phys. Rev.
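
    As background for readers less familiar with the formalism, the 2PI effective action in the standard Cornwall-Jackiw-Tomboulis form is sketched below in one common convention; the 3PI action studied in the paper promotes the connected 3-point function to a further variable, and none of the notation here is taken from the paper itself.

        % Standard 2PI (CJT) effective action for a scalar field, generic notation:
        \[
          \Gamma[\phi, G] \;=\; S[\phi]
            \;+\; \tfrac{i}{2}\,\mathrm{Tr}\,\ln G^{-1}
            \;+\; \tfrac{i}{2}\,\mathrm{Tr}\!\left[G_{0}^{-1}(\phi)\,G\right]
            \;+\; \Gamma_{2}[\phi, G] \;+\; \text{const},
        \]
        % where G_0^{-1}(\phi) is the tree-level inverse propagator evaluated at the mean
        % field and \Gamma_2 collects the two-particle-irreducible vacuum diagrams.
        % Symmetry improvement in the Pilaftsis-Teresi sense imposes the broken-phase
        % O(N) Ward identity (schematically, v * G_Goldstone^{-1}(k=0) = 0) as a constraint,
        % which keeps the Goldstone mode massless in any truncation.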

    Investigating the accuracy of parallel analysis in underextraction conditions: A Monte Carlo study

    One of the most important decisions to make when performing an exploratory factor analysis concerns the number of factors to retain. Parallel analysis is considered to be the best course of action in these circumstances, as it consistently outperforms other factor extraction methods (Zwick & Velicer, 1986). Even so, parallel analysis requires further research and refinement to improve its accuracy. Characteristics such as factor loadings, correlations between factors, and number of variables per factor all have been shown to adversely impact the effectiveness of parallel analysis as a means of identifying the number of factors. Critically, even the choice of criteria on which to evaluate factors, such as the eigenvalue at the 50th or 95th percentile, can have deleterious effects on the number of factors extracted. One area of parallel analysis yet to be researched is the magnitude of the difference between the actual eigenvalue and the random data-based eigenvalue. Currently, even if the margin between the actual eigenvalue and the random data-based eigenvalue is nominal, the factor is considered to be meaningful. As such, it may behoove researchers to enforce a higher standard, such as a relative margin between the two eigenvalues rather than a simple absolute difference. Accordingly, the purpose of this study will be to evaluate the efficacy of a 10 percent margin criterion as compared to an absolute margin. These margins will specifically be evaluated in conjunction with the 50th, 90th, 95th, and 99th percentile eigenvalue criteria on a population correlation matrix which engenders underextraction. Previous research (Matsumoto & Brown, 2017) explored the same conditions on a population correlation matrix designed to cause overextraction. They found that the most stringent standard (99th percentile eigenvalue plus 10 percent margin) was the most accurate. For the present study, however, we hypothesize that the most accurate results will be obtained from a standard less stringent than the 99th percentile eigenvalue plus 10 percent margin. This research has important implications for the scientific and practical application of psychometrics.
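
    A minimal Python sketch of the retention rule being studied, assuming uncorrelated normal random comparison data and applying the percentile threshold with the 10 percent relative margin described above; this is illustrative only and not the authors' simulation code.

        # Illustrative parallel analysis with a percentile-plus-relative-margin rule.
        import numpy as np

        rng = np.random.default_rng(0)

        def parallel_analysis(data, n_reps=500, percentile=95, margin=0.10):
            """Retain factor k only if its sample eigenvalue exceeds the chosen percentile
            of random-data eigenvalues by at least `margin` (a proportion, e.g. 0.10)."""
            n, p = data.shape
            sample_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
            random_eigs = np.empty((n_reps, p))
            for r in range(n_reps):
                noise = rng.standard_normal((n, p))          # uncorrelated random data
                random_eigs[r] = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False))[::-1]
            threshold = np.percentile(random_eigs, percentile, axis=0)
            keep = sample_eigs > threshold * (1.0 + margin)  # relative-margin criterion
            return int(np.argmin(keep)) if not keep.all() else p   # stop at first failure

        # Toy data: 300 cases, 9 variables loading 0.7 on 3 correlated factors
        loadings = np.kron(np.eye(3), np.full((3, 1), 0.7))         # 9 x 3 loading matrix
        factor_cov = 0.5 * np.ones((3, 3)) + 0.5 * np.eye(3)        # inter-factor r = .5
        scores = rng.multivariate_normal(np.zeros(3), factor_cov, size=300)
        data = scores @ loadings.T + 0.5 * rng.standard_normal((300, 9))
        print(parallel_analysis(data))

    Varying the percentile argument (50, 90, 95, 99) and the margin (0 versus 0.10) reproduces the grid of criteria compared in the study.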

    The mixed problem in Lipschitz domains with general decompositions of the boundary

    This paper continues the study of the mixed problem for the Laplacian. We consider a bounded Lipschitz domain Ω ⊂ R^n, n ≥ 2, with boundary that is decomposed as ∂Ω = D ∪ N, D and N disjoint. We let Λ denote the boundary of D (relative to ∂Ω) and impose conditions on the dimension and shape of Λ and the sets N and D. Under these geometric criteria, we show that there exists p_0 > 1 depending on the domain Ω such that for p in the interval (1, p_0), the mixed problem with Neumann data in the space L^p(N) and Dirichlet data in the Sobolev space W^{1,p}(D) has a unique solution with the non-tangential maximal function of the gradient of the solution in L^p(∂Ω). We also obtain results for p = 1 when the Dirichlet and Neumann data come from Hardy spaces, and a result when the boundary data come from weighted Sobolev spaces. Comment: 36 pages
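
    For orientation, the mixed (Dirichlet-Neumann) problem referred to above can be stated in the following generic form, with the data spaces taken from the abstract; the labels f_D and f_N for the boundary data are illustrative and not the paper's notation.

        % Mixed boundary-value problem for the Laplacian on a Lipschitz domain \Omega,
        % with Dirichlet data on D and Neumann data on N, where \partial\Omega = D \cup N:
        \[
          \begin{cases}
            \Delta u = 0 & \text{in } \Omega,\\[2pt]
            u = f_{D} \in W^{1,p}(D) & \text{on } D,\\[2pt]
            \partial u / \partial\nu = f_{N} \in L^{p}(N) & \text{on } N,
          \end{cases}
          \qquad (\nabla u)^{*} \in L^{p}(\partial\Omega),
        \]
        % where \nu is the outward unit normal and (\nabla u)^* is the non-tangential
        % maximal function of the gradient, the sense in which the solution is controlled.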

    A Systematic Review - The Effect of Hospice and Palliative Care

    Many older adults nearing death experience unnecessarily invasive and costly healthcare treatments, often causing more harm than good. Hospice and palliative care interventions offer a possible solution to this problem by prioritizing high-quality and cost-effective care with a strong focus on comfort and satisfaction. The authors of this paper seek to answer the following question: Do hospice and palliative care interventions directed toward older adults at the end of life improve quality of life, reduce the cost of care, and increase satisfaction? This paper thoroughly reviews and critically appraises existing research related to the effect of hospice and palliative care directed toward older adults at the end of life. Twenty primary studies published between 2011 and 2016 were identified, reviewed, and critically evaluated in an effort to answer this question. The publications were diverse in objective, scope, and design, but all contributed to the conversation regarding this potential solution to substandard care for older adults at the end of life. Based on the existing evidence, the authors came to the following conclusion: hospice and palliative care interventions are associated with improved quality of life in five out of six measured areas, decreased cost of care, and high satisfaction for care recipients and providers alike. Ten recommendations for clinical practice and five recommendations for future research are discussed.