27 research outputs found

    Predicting Dropout From Organized Football: A Prospective 4-Year Study Among Adolescent and Young Adult Football Players

    Get PDF
    Previous studies have shown that enjoyment is one of the key predictors of dropout from organized sport, including organized football. However, prospective studies, particularly studies focused on long-term dropout, are largely lacking. Drawing on the basic principles of interdependence theory, in the present prospective study among 1,762 adolescent and young adult football players (27.1% women, mean age 17.74 years, SD = 1.35), we tested the predictive value of sport enjoyment, perceived alternatives, and restraining forces on football players' short-term (6 months) and long-term (4 years) dropout from organized football. As anticipated, the results of the logistic regression and follow-up analyses indicate that players' enjoyment was the main predictor of both short-term and long-term dropout. In addition, relative to remainers, dropouts perceived more alternatives in terms of other sports, had fewer family members involved in their football club, and were older when they started playing organized football. We conclude that measures aimed at enhancing sport enjoyment in particular may prevent players from dropping out of organized football in both the short and long term. In addition, dropout rates may be reduced by attracting and engaging youth at a very young age (from 6 years), together with their siblings, parents, and other family members.
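    The logistic-regression approach described in the abstract can be sketched on synthetic data. Everything below is illustrative, not the study's data or model: the enjoyment scale, coefficients, and sample are invented, and only a single predictor is fitted by plain gradient descent.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.05, epochs=2000):
    """Gradient-descent logistic regression: P(dropout) = sigmoid(b0 + b1*x)."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(b0 + b1 * x) - y   # prediction error for one player
            g0 += err
            g1 += err * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

random.seed(0)
# Synthetic cohort: enjoyment on a 1-7 scale; lower enjoyment -> higher dropout odds.
enjoyment = [random.uniform(1, 7) for _ in range(1000)]
dropout = [1 if random.random() < sigmoid(3.0 - 0.8 * e) else 0 for e in enjoyment]

b0, b1 = fit_logistic(enjoyment, dropout)
print(f"intercept={b0:.2f} enjoyment_coef={b1:.2f}")  # negative coefficient: more enjoyment, less dropout
```

    A negative fitted coefficient on enjoyment mirrors the abstract's finding that enjoyment is protective against dropout; the magnitude here is meaningless outside this toy data.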

    Validating a double Gaussian source model for small proton fields in a commercial Monte-Carlo dose calculation engine

    Get PDF
    Purpose: The primary fluence of a proton pencil beam exiting the accelerator is enveloped by a region of secondaries, commonly called “spray”. Although small in magnitude, this spray may affect dose distributions in pencil beam scanning mode, e.g., in the calculation of small-field output, if not modelled properly in a treatment planning system (TPS). The purpose of this study was to dosimetrically benchmark the Monte Carlo (MC) dose engine of the RayStation TPS (v.10A) in small proton fields and to systematically compare single Gaussian (SG) and double Gaussian (DG) modeling of the initial proton fluence, the latter providing a more accurate representation of the nozzle spray. Methods: The initial proton fluence distribution for SG/DG beam modeling was deduced from two-dimensional measurements in air with a scintillation screen with electronic readout. The DG model was based either on direct fits of the two Gaussians to the measured profiles or on an iterative optimization procedure that uses the measured profiles to mimic in-air scan-field factor (SF) measurements. To validate the DG beam models, SFs, i.e., doses relative to a 10 × 10 cm2 field, were measured in water for three initial proton energies (100 MeV, 160 MeV, 226.7 MeV) and square field sizes from 1 × 1 cm2 to 10 × 10 cm2 using a small-field ionization chamber (IBA CC01) and an IBA ProteusPlus system (universal nozzle). Furthermore, the dose to the center of spherical target volumes (diameters: 1 cm to 10 cm) was determined using the same small-volume ionization chamber (IC). A comprehensive uncertainty analysis was performed, including estimates of influence factors typical for small-field dosimetry deduced from a simple two-dimensional analytical model of the relative fluence distribution. Measurements were compared to the predictions of the RayStation TPS. Results: SFs deviated by more than 2% from TPS predictions in all fields <4 × 4 cm2, with a maximum deviation of 5.8% for SG modeling. In contrast, deviations were smaller than 2% for all field sizes and proton energies when using the directly fitted DG model. The optimized DG model performed similarly, except for slightly larger deviations in the 1 × 1 cm2 scan-fields. The uncertainty estimates showed a significant impact of pencil beam size variations (±5%), resulting in up to 5.0% standard uncertainty. The point doses within spherical irradiation volumes deviated from calculations by up to 3.3% for the SG model and 2.0% for the DG model. Conclusion: Properly representing nozzle spray in RayStation's MC-based dose engine using a DG beam model was found to reduce the deviation from measurements in small spherical targets to below 2%. A thorough uncertainty analysis shows a similar magnitude for the combined standard uncertainty of such measurements.
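    The SG-versus-DG effect described above can be illustrated numerically. This is a minimal one-dimensional sketch with invented amplitudes and sigmas, not the study's beam data: a "measured" lateral profile is built as a narrow core plus a wide, low-amplitude halo (the spray), a single Gaussian is fitted to it by crude grid search, and the two models are compared on the fluence inside a small aperture.

```python
import math

def gauss(x, amp, sigma):
    return amp * math.exp(-0.5 * (x / sigma) ** 2)

def double_gauss(x, a1, s1, a2, s2):
    return gauss(x, a1, s1) + gauss(x, a2, s2)

# "Measured" in-air profile: narrow primary core plus a wide, faint halo (spray).
# Positions in cm; all parameters are hypothetical.
xs = [0.1 * i for i in range(-200, 201)]
profile = [double_gauss(x, 1.0, 0.4, 0.02, 3.0) for x in xs]

def sse_single(sigma):
    """Sum of squared residuals of a unit-amplitude single Gaussian."""
    return sum((gauss(x, 1.0, sigma) - p) ** 2 for x, p in zip(xs, profile))

# Crude grid search for the best single-Gaussian sigma.
sg_sse, sg_sigma = min((sse_single(0.1 * k), 0.1 * k) for k in range(2, 60))

def aperture_sum(model, half_width):
    """Fluence inside a +/-half_width aperture (Riemann sum, 0.1 cm step)."""
    return sum(v for x, v in zip(xs, model) if abs(x) <= half_width) * 0.1

meas = aperture_sum(profile, 1.0)
sg = aperture_sum([gauss(x, 1.0, sg_sigma) for x in xs], 1.0)
print(f"SG sigma={sg_sigma:.1f} cm, small-field output ratio SG/meas = {sg / meas:.3f}")
```

    The single Gaussian locks onto the core and ignores the halo, so it underestimates the fluence delivered to a small field by a few percent; this is the qualitative shape of the SG small-field deviations the abstract reports, not a reproduction of them.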

    Underlying molecular mechanisms of DIO2 susceptibility in symptomatic osteoarthritis

    Get PDF
    Objectives: To investigate how the genetic susceptibility gene DIO2 confers risk of osteoarthritis (OA) onset in humans and to explore whether counteracting the deleterious effect could contribute to novel therapeutic approaches. Methods: Epigenetically regulated expression of DIO2 was explored by assessing methylation of positional CpG-dinucleotides and the respective DIO2 expression in OA-affected and macroscopically preserved articular cartilage from end-stage OA patients. In a human in vitro chondrogenesis model, we measured the effects when thyroid signalling during culturing was either enhanced (excess T3 or lentivirally induced DIO2 overexpression) or decreased (iopanoic acid). Results: OA-related changes in methylation at a specific CpG dinucleotide upstream of DIO2 caused significant upregulation of its expression (β=4.96; p=0.0016). This effect was enhanced and appeared to be driven specifically by DIO2 rs225014 risk allele carriers (β=5.58, p=0.0006). During in vitro chondrogenesis, DIO2 overexpression resulted in a significantly reduced capacity of chondrocytes to deposit extracellular matrix (ECM) components, concurrent with significant induction of ECM-degrading enzymes (ADAMTS5, MMP13) and markers of mineralisation (ALPL, COL1A1). Given their concurrent and significant upregulation of expression, this process is likely mediated via HIF-2α/RUNX2 signalling. In contrast, we showed that inhibiting deiodinases during in vitro chondrogenesis contributed to prolonged cartilage homeostasis, as reflected by significantly increased deposition of ECM components and attenuated upregulation of matrix-degrading enzymes. Conclusions: Our findings show how genetic variation at DIO2 could confer risk of OA and raise the possibility that counteracting thyroid signalling may be a novel therapeutic approach.

    A Formal Semantics for P-Code

    No full text
    Decompilation is a widely used tool in reverse engineering and exploit detection in binaries. Ghidra, developed by the National Security Agency, is one of the most popular decompilers. It decompiles binaries to high P-Code, from which the final decompilation output in C code is generated. Ghidra allows users to work with P-Code, so they can analyze the intermediate representation directly. Several projects build on this to provide verification, decompilation, taint analysis, and emulation, to name a few. However, P-Code lacks a formal semantics, its documentation is limited, and its semantics is notoriously subtle, which makes any kind of analysis on P-Code hard. We show that P-Code, as-is, cannot be given an executable semantics. In this paper, we augment P-Code and define a complete, executable, formal semantics for it. This is done by examining the documentation and the decompilation results of binaries with known source code. The development of a formal P-Code semantics uncovered several issues in Ghidra, P-Code, and the documentation, and we show that these issues affect projects that rely on Ghidra and P-Code. We evaluate the executability of our semantics by building a P-Code interpreter that directly uses it, allowing Ghidra users to better leverage P-Code.
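    To give a flavor of what "an executable semantics for an IR" means, here is a heavily simplified, hypothetical P-Code-like interpreter. Real P-Code operates on varnodes with address spaces and sizes, and none of the representation below is Ghidra's actual format; only a few opcode names are borrowed to show the idea of directly executing the IR.

```python
# Hypothetical mini-IR: each op is (mnemonic, output_var, input_list),
# where inputs are either variable names or integer constants.

def execute(ops, env):
    """Interpret a straight-line list of ops, mutating and returning env."""
    for op, out, ins in ops:
        vals = [env[i] if isinstance(i, str) else i for i in ins]
        if op == "COPY":
            env[out] = vals[0]
        elif op == "INT_ADD":
            env[out] = (vals[0] + vals[1]) & 0xFFFFFFFF   # 32-bit wraparound
        elif op == "INT_MULT":
            env[out] = (vals[0] * vals[1]) & 0xFFFFFFFF
        elif op == "INT_EQUAL":
            env[out] = 1 if vals[0] == vals[1] else 0
        else:
            raise ValueError(f"unhandled op {op}")
    return env

prog = [
    ("COPY", "r0", [7]),
    ("INT_ADD", "r1", ["r0", 5]),
    ("INT_MULT", "r2", ["r1", "r1"]),
    ("INT_EQUAL", "flag", ["r2", 144]),
]
state = execute(prog, {})
print(state)  # r2 = (7 + 5)**2 = 144, flag = 1
```

    The paper's point is that pinning down exactly this kind of per-opcode behavior (widths, wraparound, flag semantics) for the full IR is where the subtleties, and the uncovered Ghidra issues, live.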

    Low-Level Reachability Analysis Based on Formal Logic

    No full text
    Reachability is an important problem in program analysis. Being able to automatically show that, and how, a certain state is reachable can be used to detect bugs and vulnerabilities. Various lines of research have focused on formalizing a program logic that connects preconditions to post-conditions in the context of reachability analysis, e.g., must+, Lisbon Triples, and Outcome Logic. Outcome Logic and its variants can be seen as adaptations of Hoare Logic and Incorrectness Logic. In this paper, we study (1) how such a formal reachability logic can be used for automated precondition generation, and (2) how it can be used to reason over low-level assembly code. Automated precondition generation for reachability logic enables us to find inputs that provably trigger an assertion (i.e., a post-condition). The motivation for focusing on low-level code is that it accurately describes actual program behavior, can be targeted when source code is unavailable, and allows reasoning over low-level properties such as return pointer integrity. An implementation has been developed, and the entire system is proven sound and complete (the latter only in the absence of unresolved indirections) in the Isabelle/HOL theorem prover. Initial results are obtained on litmus tests and case studies. The results expose limitations: traversal may not terminate, and better scalability would require a compositional approach. However, they also show that precondition generation based on a low-level reachability logic can expose bugs in low-level code.
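    The task of "finding inputs that provably trigger an assertion" can be conveyed with a toy sketch. The paper does this symbolically, with soundness proved in Isabelle/HOL; the brute-force search below only illustrates the problem shape, and the program, input bounds, and names are all hypothetical.

```python
def program(x, y):
    """Stand-in for a few byte-level instructions (purely illustrative)."""
    r = (x ^ y) & 0xFF        # xor of two byte inputs
    r = (r + 0x20) & 0xFF     # add with 8-bit wraparound
    return r

def find_witness(post, bound=256):
    """Search the input space for an (x, y) that makes the post-condition hold."""
    for x in range(bound):
        for y in range(bound):
            if post(program(x, y)):
                return (x, y)
    return None

# Reachability query: can the result register be zero at the end?
witness = find_witness(lambda r: r == 0)
print("witness:", witness)
```

    A reachability logic replaces this enumeration with a precondition (here, x XOR y = 0xE0) computed backwards from the post-condition, which is what makes the approach scale beyond tiny input spaces.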

    Universal vaccine based on ectodomain of matrix protein 2 of Influenza A: Fc receptors and alveolar macrophages mediate protection

    No full text
    The ectodomain of matrix protein 2 (M2e) of influenza A virus is an attractive target for a universal influenza A vaccine: the M2e sequence is highly conserved across influenza virus subtypes, and induced humoral anti-M2e immunity protects against a lethal influenza virus challenge in animal models. Clinical phase I studies with M2e vaccine candidates have been completed. However, the in vivo mechanism of immune protection induced by M2e-carrier vaccination is unclear. Using passive immunization experiments in wild-type, FcRγ-/-, FcγRI-/-, FcγRIII-/-, and (FcγRI, FcγRIII)-/- mice, we report in this study that Fc receptors are essential for anti-M2e IgG-mediated immune protection. M2e-specific IgG1 isotype Abs are shown to require functional FcγRIII for in vivo immune protection, but other anti-M2e IgG isotypes can rescue FcγRIII-/- mice from a lethal challenge. Using a conditional cell depletion protocol, we also demonstrate that alveolar macrophages (AM) play a crucial role in humoral M2e-specific immune protection. Additionally, we show that adoptive transfer of wild-type AM into (FcγRI, FcγRIII)-/- mice restores protection by passively transferred anti-M2e IgG. We conclude that AM and Fc receptor-dependent elimination of influenza A virus-infected cells are essential for protection by anti-M2e IgG.
