
    The Reliability of Electromyographic Normalization Methods for Cycling Analyses

    Electromyography (EMG) is normalized in relation to a reference maximum voluntary contraction (MVC) value. Different normalization techniques are available, but the most reliable method for cycling movements is unknown. This study investigated the reliability of different normalization techniques for cycling analyses. Twenty‐five male cyclists (age 24.13 ± 2.79 years, body height 176.22 ± 4.87 cm, body mass 67.23 ± 4.19 kg, BMI = 21.70 ± 2.60 kg∙m-2) performed different normalization procedures on two occasions within the same testing session. The rectus femoris, biceps femoris, gastrocnemius and tibialis anterior muscles were examined. Participants performed isometric normalizations (IMVC) using an isokinetic dynamometer. Five minutes of submaximal cycling (180 W) were also undertaken, allowing the mean (DMA) and peak (PDA) activation from each muscle to serve as reference values. Finally, a 10 s cycling sprint (MxDA) trial was undertaken and the highest activation from each muscle was used as the reference value. Differences between reference EMG amplitudes, as a function of normalization technique and time, were examined using repeated measures ANOVAs. The test‐retest reliability of each technique was also examined using linear regression, intraclass correlations and Cronbach's alpha. The results showed that EMG amplitude differed significantly between normalization techniques for all muscles, with the IMVC and MxDA methods demonstrating the highest amplitudes. The highest levels of reliability were observed for the PDA technique for all muscles; therefore, our results support the use of this method for cycling analyses.
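
    As a rough illustration of the normalization step described above, the sketch below rectifies an EMG trace via a moving-RMS envelope and expresses it as a percentage of a reference amplitude (e.g., the peak dynamic activation, PDA). The sampling rate, window length, and function names are illustrative assumptions, not values taken from the study.

```python
import numpy as np

def moving_rms(signal, fs, window_s=0.1):
    """Moving-RMS envelope of an EMG signal (assumed 100 ms window)."""
    n = max(1, int(window_s * fs))
    kernel = np.ones(n) / n
    return np.sqrt(np.convolve(signal**2, kernel, mode="same"))

def normalize_emg(raw_emg, fs, reference_amplitude):
    """Express an EMG envelope as a percentage of a reference value (e.g., IMVC or PDA peak)."""
    envelope = moving_rms(raw_emg - np.mean(raw_emg), fs)  # remove DC offset; squaring in the RMS rectifies
    return 100.0 * envelope / reference_amplitude

# Illustrative use: normalize a submaximal trial to its peak dynamic activation (PDA)
fs = 1000                                     # assumed sampling rate in Hz
raw_trial = np.random.randn(10 * fs) * 0.05   # placeholder signal; real data would come from the EMG system
pda_reference = np.max(moving_rms(raw_trial - np.mean(raw_trial), fs))
normalized = normalize_emg(raw_trial, fs, pda_reference)
print(f"Mean activation: {normalized.mean():.1f}% of the PDA reference")
```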

    Genetic Normalization of Differentiating Aneuploid Human Embryos

    Early embryogenesis involves a series of dynamic processes, many of which are currently not well described or understood. Aneuploidy and aneuploid mosaicism, a mixture of aneuploid and euploid cells within one embryo, are principal causes of developmental failure in early embryonic development [1,2]. Here we show that human embryos demonstrate a significant rate of genetic correction of aneuploidy, or "genetic normalization", when cultured from the cleavage stage on day 3 (Cleavage) to the blastocyst stage on day 5 (Blastocyst) under routine in vitro fertilization (IVF) laboratory conditions. One hundred and twenty-six human Cleavage stage embryos were evaluated for clinically indicated preimplantation genetic screening (PGS). Sixty-four of these embryos were found to be aneuploid following Cleavage stage embryo biopsy and single nucleotide polymorphism (SNP) 23-chromosome molecular karyotyping (microarray). Of these, 25 survived to the Blastocyst stage of development and repeat microarray evaluation was performed. The inner cell mass (ICM), containing cells destined to form the fetus, and the trophectoderm (TE), containing cells destined to form the placenta, were evaluated. Sixteen of 25 embryos (64%) [95% CI: 44-80%] possessed diploid karyotypes in both the ICM and TE cell populations. An additional three Blastocyst stage embryos showed genetic correction of the TE but not the ICM, and one Blastocyst stage embryo showed the reverse. Mosaicism (exceeding 5%) was not detected in any of the ICM and TE samples analyzed. Recognizing that genetic normalization may occur in developing human embryos has important implications for stem cell biology, preimplantation and developmental genetics, embryology, and reproductive medicine.

1) Hassold, T. et al. A cytogenetic study of 1000 spontaneous abortions. Ann Hum Genet. 44, 151-78 (1980).
2) Menasha, J., Levy, B., Hirschhorn, K. & Kardon, N.B. Incidence and spectrum of chromosome abnormalities in spontaneous abortions: new insights from a 12-year study. Genet Med. 7, 251-63 (2005).
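
    The reported result of 16/25 embryos (64%) with a 95% CI of 44-80% is consistent with a standard binomial-proportion interval. The abstract does not state which interval method was used, so the reproduction below, using the Wilson score interval, is an assumption; it recovers the reported bounds up to rounding.

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half_width = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half_width, centre + half_width

# 16 of 25 aneuploid Cleavage-stage embryos were diploid in both ICM and TE at the Blastocyst stage
low, high = wilson_interval(16, 25)
print(f"64% (95% CI: {low:.1%} to {high:.1%})")  # ~44.5% to ~79.8%, i.e. roughly the reported 44-80%
```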

    Low-Cost Air Quality Monitoring Tools: From Research to Practice (A Workshop Summary).

    In May 2017, a two-day workshop was held in Los Angeles (California, U.S.A.) to gather practitioners who work with low-cost sensors used to make air quality measurements. The community of practice included individuals from academia, industry, non-profit groups, community-based organizations, and regulatory agencies. The group gathered to share knowledge developed from a variety of pilot projects in hopes of advancing the collective knowledge about how best to use low-cost air quality sensors. Panel discussion topics included: (1) best practices for deployment and calibration of low-cost sensor systems, (2) data standardization efforts and database design, (3) advances in sensor calibration, data management, and data analysis and visualization, and (4) lessons learned from research/community partnerships to encourage purposeful use of sensors and create change/action. Panel discussions summarized knowledge advances and project successes while also highlighting the questions, unresolved issues, and technological limitations that remain within the low-cost air quality sensor arena.
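
    A common calibration practice for low-cost sensors, of the kind discussed in topics (1) and (3), is to collocate a sensor with a regulatory-grade reference monitor and fit a correction model. The sketch below shows a minimal version of that idea using ordinary least squares with temperature and humidity as covariates; the data, variable names, and linear form are illustrative assumptions, not a method reported by the workshop.

```python
import numpy as np

# Hypothetical collocation data: low-cost PM2.5 readings alongside a reference monitor,
# with temperature (deg C) and relative humidity (%) as potential confounders.
rng = np.random.default_rng(0)
n = 200
temp = rng.uniform(10, 35, n)
rh = rng.uniform(20, 90, n)
reference_pm25 = rng.uniform(5, 60, n)
sensor_pm25 = 1.3 * reference_pm25 + 0.15 * rh - 0.2 * temp + rng.normal(0, 2, n)

# Fit a linear correction: reference ~ b0 + b1*sensor + b2*temp + b3*rh
X = np.column_stack([np.ones(n), sensor_pm25, temp, rh])
coef, *_ = np.linalg.lstsq(X, reference_pm25, rcond=None)

def corrected(sensor, t, h):
    """Apply the fitted linear correction to new low-cost sensor readings."""
    return coef[0] + coef[1] * sensor + coef[2] * t + coef[3] * h

print("Correction coefficients:", np.round(coef, 3))
```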

    Laplacian Mixture Modeling for Network Analysis and Unsupervised Learning on Graphs

    Laplacian mixture models identify overlapping regions of influence in unlabeled graph and network data in a scalable and computationally efficient way, yielding useful low-dimensional representations. By combining Laplacian eigenspace and finite mixture modeling methods, they provide probabilistic or fuzzy dimensionality reductions or domain decompositions for a variety of input data types, including mixture distributions, feature vectors, and graphs or networks. Provably optimal recovery using the algorithm is shown analytically for a nontrivial class of cluster graphs. Heuristic approximations for scalable high-performance implementations are described and empirically tested. Connections to PageRank and community detection in network analysis demonstrate the wide applicability of this approach. The origins of fuzzy spectral methods, beginning with generalized heat or diffusion equations in physics, are reviewed and summarized. Comparisons to other dimensionality reduction and clustering methods for challenging unsupervised machine learning problems are also discussed. (Comment: 13 figures, 35 references.)
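
    A minimal sketch of the general idea (not the authors' exact algorithm): embed the graph using low-frequency eigenvectors of its Laplacian, then fit a finite mixture model in that eigenspace so the posterior responsibilities give soft (fuzzy) cluster memberships. The libraries, the unnormalized Laplacian, and the Gaussian mixture choice below are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def laplacian_mixture_sketch(adjacency, n_components=2):
    """Soft graph clustering: Laplacian eigenspace embedding followed by a Gaussian mixture."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency           # unnormalized graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(laplacian)       # eigenvectors sorted by eigenvalue
    embedding = eigvecs[:, 1:n_components + 1]         # skip the trivial constant eigenvector
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(embedding)
    return gmm.predict_proba(embedding)                # fuzzy membership of each node in each region

# Tiny example: two 4-node cliques joined by a single edge
A = np.zeros((8, 8))
for block in (range(4), range(4, 8)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1
A[3, 4] = A[4, 3] = 1
memberships = laplacian_mixture_sketch(A, n_components=2)
print(np.round(memberships, 2))
```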

    A novel plasticity rule can explain the development of sensorimotor intelligence

    Grounding autonomous behavior in the nervous system is a fundamental challenge for neuroscience. In particular, self-organized behavioral development raises more questions than answers. Are there special functional units for curiosity, motivation, and creativity? This paper argues that these features can be grounded in synaptic plasticity itself, without requiring any higher-level constructs. We propose differential extrinsic plasticity (DEP) as a new synaptic rule for self-learning systems and apply it to a number of complex robotic systems as a test case. Without any purpose or goal being specified, seemingly purposeful and adaptive behavior develops, displaying a certain level of sensorimotor intelligence. These surprising results require no system-specific modifications of the DEP rule but arise from the underlying mechanism of spontaneous symmetry breaking due to the tight brain-body-environment coupling. The new synaptic rule is biologically plausible and would be an interesting target for neurobiological investigation. We also argue that this neuronal mechanism may have been a catalyst in natural evolution. (Comment: 18 pages, 5 figures, 7 videos.)
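
    The exact form of the DEP rule is given in the paper, not in this abstract. As a loose illustration of the family it belongs to, the sketch below implements a generic differential Hebbian update in a toy closed sensorimotor loop, where a connection changes in proportion to the product of the time derivatives of the signals it links. The toy environment, variable names, time constants, and normalization are all assumptions and should not be read as the authors' rule.

```python
import numpy as np

# Generic differential Hebbian update (illustrative only; not the paper's exact DEP rule):
# dC/dt is proportional to the outer product of the derivatives of motor and sensor signals.
rng = np.random.default_rng(1)
n_sensors, n_motors = 4, 4
C = rng.normal(0, 0.1, (n_motors, n_sensors))   # controller/synaptic matrix
x_prev = np.zeros(n_sensors)
y_prev = np.zeros(n_motors)
eta, dt = 0.01, 0.01

def environment(y):
    """Toy 'body': sensors echo motor commands with mixing and a little noise."""
    return 0.8 * y + 0.05 * rng.normal(size=n_sensors)

x = np.zeros(n_sensors)
for step in range(1000):
    y = np.tanh(C @ x)                  # motor command from current sensor values
    x_new = environment(y)              # closed loop through the toy body
    dx = (x_new - x_prev) / dt          # sensor derivative
    dy = (y - y_prev) / dt              # motor derivative
    C += eta * dt * np.outer(dy, dx)    # differential Hebbian weight change
    C /= max(1.0, np.linalg.norm(C))    # keep weights bounded (a crude normalization)
    x_prev, y_prev, x = x, y, x_new

print("Final connection matrix:\n", np.round(C, 2))
```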

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods: A 30-item instrument, the Technology Adoption Readiness Scale (TARS), for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made, relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of multiple perspectives and the collaborative nature of the work; and (4) an emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.
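
    As a sketch of the kind of pre-test analysis described (relating ratings on normalisation-process items to whether respondents report e-health as 'routine'), the code below scores a 30-item instrument and fits a logistic regression. The simulated data, the mean-score scoring rule, and the model choice are illustrative assumptions, not the analysis actually reported for the TARS.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical pre-test data: 231 respondents rating 30 process items on a 1-5 scale,
# plus a binary report of whether e-health has become 'routine' in their practice.
rng = np.random.default_rng(42)
n_respondents, n_items = 231, 30
item_ratings = rng.integers(1, 6, size=(n_respondents, n_items))
process_score = item_ratings.mean(axis=1)                        # simple mean item score per respondent
routine = (process_score + rng.normal(0, 0.5, n_respondents) > 3).astype(int)

# Is the process score associated with perceived routinisation?
model = LogisticRegression().fit(process_score.reshape(-1, 1), routine)
print("Odds ratio per unit increase in process score:", np.exp(model.coef_[0][0]).round(2))
```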