
    An educational path for the magnetic vector potential and its physical implications

    We present an educational path on the magnetic vector potential A addressed to undergraduate students and to pre-service physics teachers. Starting from the generalized Ampère-Laplace law, in the framework of a slowly varying time-dependent field approximation, the magnetic vector potential is written in terms of its empirical referent, i.e. the conduction current. Therefore, once the currents are known, our approach allows a clear and univocal physical determination of A, overcoming the mathematical indeterminacy due to gauge transformations. There is no need to fix a gauge, since for slowly varying time-dependent electric and magnetic fields the natural gauge for A is the Coulomb one. We stress the difference between our approach and those usually presented in the literature. Finally, a physical interpretation of the magnetic vector potential is discussed and some examples of the calculation of A are analysed.
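
    For reference, in the slowly varying (quasi-static) regime invoked above, the vector potential generated by the conduction current density is usually written as the volume integral below, which automatically satisfies the Coulomb-gauge condition; the notation here is the standard textbook one and may differ from that used in the paper.

        \mathbf{A}(\mathbf{r},t) \,=\, \frac{\mu_0}{4\pi}\int \frac{\mathbf{J}(\mathbf{r}',t)}{\lvert \mathbf{r}-\mathbf{r}'\rvert}\, d^{3}r' ,
        \qquad \nabla \cdot \mathbf{A} = 0 .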

    Dual adversarial deconfounding autoencoder for joint batch-effects removal from multi-center and multi-scanner radiomics data

    Medical imaging represents the primary tool for investigating and monitoring several diseases, including cancer. Advances in quantitative image analysis have moved towards the extraction of biomarkers able to support clinical decisions. To produce robust results, multi-center studies are often set up. However, the imaging information must be denoised from confounding factors, known as batch effects, such as scanner-specific and center-specific influences. Moreover, in non-solid cancers, like lymphomas, effective biomarkers require an imaging-based representation of the disease that accounts for its multi-site spreading over the patient’s body. In this work, we address the dual-factor deconfusion problem and propose a deconfusion algorithm to harmonize the imaging information of patients affected by Hodgkin lymphoma in a multi-center setting. We show that the proposed model successfully denoises data from domain-specific variability (p-value < 0.001) while coherently preserving the spatial relationship between imaging descriptions of peer lesions (p-value = 0), which is a strong prognostic biomarker for tumor heterogeneity assessment. This harmonization step significantly improves the performance of prognostic models with respect to state-of-the-art methods, enabling the construction of exhaustive patient representations and delivering more accurate analyses (p-values < 0.001 in training, p-values < 0.05 in testing). This work lays the groundwork for the large-scale and reproducible analyses of multi-center data that are urgently needed to translate imaging-based biomarkers into clinical practice as effective prognostic tools. The code is available on GitHub at https://github.com/LaraCavinato/Dual-ADAE.
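
    As a purely illustrative sketch of the general idea behind an adversarial deconfounding autoencoder (an autoencoder whose latent code is trained to fool two domain classifiers, one per batch factor), the PyTorch-style snippet below may help; layer sizes, names and the training step are hypothetical and are not taken from the Dual-ADAE repository.

    # Illustrative sketch only (hypothetical names and sizes, not the authors' code):
    # an autoencoder whose latent code is trained adversarially against two
    # classifiers that try to recognize the confounding factors (e.g. scanner
    # and center), so that reconstruction keeps the signal while the latent
    # space carries little batch information.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self, n_features, n_latent):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                     nn.Linear(64, n_latent))

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        def __init__(self, n_latent, n_features):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(n_latent, 64), nn.ReLU(),
                                     nn.Linear(64, n_features))

        def forward(self, z):
            return self.net(z)

    class DomainClassifier(nn.Module):
        """Predicts one confounding label (scanner or center) from the latent code."""
        def __init__(self, n_latent, n_domains):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(n_latent, 32), nn.ReLU(),
                                     nn.Linear(32, n_domains))

        def forward(self, z):
            return self.net(z)

    def training_step(x, scanner_id, center_id, enc, dec, clf_scanner, clf_center,
                      opt_ae, opt_clf, lam=1.0):
        ce, mse = nn.CrossEntropyLoss(), nn.MSELoss()
        # 1) update the two adversaries so they recognize the batch factors
        z = enc(x).detach()
        loss_clf = ce(clf_scanner(z), scanner_id) + ce(clf_center(z), center_id)
        opt_clf.zero_grad(); loss_clf.backward(); opt_clf.step()
        # 2) update encoder/decoder: reconstruct well while fooling both adversaries
        z = enc(x)
        loss_ae = mse(dec(z), x) - lam * (ce(clf_scanner(z), scanner_id)
                                          + ce(clf_center(z), center_id))
        opt_ae.zero_grad(); loss_ae.backward(); opt_ae.step()
        return loss_ae.item(), loss_clf.item()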

    Embedding Physics into technology: Infrared thermography and building inspection as a teaching tool - a new participated strategy approach to the physics of heat transfer and energy saving for professional schools

    We describe an inquiry-based path about heat conduction as part of a multidisciplinary project on energy saving in a professional school in a province close to Milan, Italy. The teaching–learning process dealt with heat losses in buildings detected with a thermal camera. Three consecutive activities were implemented: direct detection by the students of heat leakages due to thermal bridges in the school structure; simple standard technology laboratory activities on heat transfer, planned and performed by the students themselves; and finally a series of guided laboratory experiences with a thermal camera, to develop and clarify the previous lab activities on thermal conductivity. Key motivations of the project were creating a link between the study of thermodynamics and its application to the “real” world; increasing students’ motivation by using an Inquiry Based Science Education (IBSE) approach; and studying if and how the “infusion” of a cutting-edge, and therefore science-attracting, technology (thermography) might foster the teaching–learning process, thus becoming a concrete cognitive tool promoting the students’ approach to the scientific methodology.
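
    For context, the lab activities on conduction revolve around the standard steady-state relation for heat flow through a wall layer; the symbols below are the usual textbook ones (thermal conductivity λ, wall area A, thickness d) and are not taken from the paper.

        \dot{Q} \,=\, \frac{\lambda\, A}{d}\,\bigl(T_{\mathrm{in}} - T_{\mathrm{out}}\bigr)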

    Brain-Computer Interface in Chronic Stroke: sensorimotor closed-loop and contingent force feedback make the difference

    Motor rehabilitation after stroke is of topical importance now that neurological diseases have become a pressing medical issue. Brain-computer interfaces have been demonstrated to be helpful in the recovery of motor functions: in particular, a closed loop involving sensorimotor brain rhythms, assistive-robot training and proprioceptive feedback in an operant-learning fashion is thought to be the most effective way to promote neural plasticity of the ipsilesional hemisphere and to restore motor abilities. This study aimed at implementing such a scheme: one stroke patient in the chronic stage was recruited and underwent the experiment using both the affected arm and the healthy one, the latter serving as a control in the subsequent analyses. Kinematic and neurophysiological outcomes confirmed the efficacy of this treatment and showed that contingent force feedback can markedly improve the motor accuracy of the upper limb.

    On the Use of Transfer Entropy to Investigate the Time Horizon of Causal Influences between Signals

    Understanding the details of the correlation between time series is an essential step on the route to assessing the causal relation between systems. Traditional statistical indicators, such as the Pearson correlation coefficient and the mutual information, have significant limitations. More recently, transfer entropy has been proposed as a powerful tool to understand the flow of information between signals. In this paper, the comparative advantages of transfer entropy for determining the time horizon of causal influence are illustrated with the help of synthetic data. The technique has been specifically revised for the analysis of synchronization experiments. The investigation of experimental data from thermonuclear plasma diagnostics shows the potential and the limitations of the developed approach.
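
    A minimal sketch of the kind of lag scan described above, assuming a simple binned (histogram) estimator of transfer entropy; the function, binning choice and synthetic signals are illustrative and are not taken from the paper.

    # Illustrative sketch: binned estimate of transfer entropy TE(X -> Y) as a
    # function of the lag applied to the source X; the lag of maximum TE is read
    # as the time horizon of the causal influence.
    import numpy as np

    def transfer_entropy(x, y, lag, n_bins=8):
        """TE from x to y when predicting y[t+1] from y[t] and x[t - lag]."""
        y_future, y_past, x_past = y[lag + 1:], y[lag:-1], x[:-(lag + 1)]

        def digitize(s):
            edges = np.histogram_bin_edges(s, bins=n_bins)
            return np.clip(np.digitize(s, edges[1:-1]), 0, n_bins - 1)

        yf, yp, xp = map(digitize, (y_future, y_past, x_past))
        p = np.zeros((n_bins, n_bins, n_bins))     # joint p(y_fut, y_past, x_past)
        for a, b, c in zip(yf, yp, xp):
            p[a, b, c] += 1
        p /= p.sum()
        p_ab = p.sum(axis=2)                       # p(y_fut, y_past)
        p_bc = p.sum(axis=0)                       # p(y_past, x_past)
        p_b = p.sum(axis=(0, 2))                   # p(y_past)
        nz = p > 0
        a, b, c = np.nonzero(nz)
        return float(np.sum(p[nz] * np.log2(p[nz] * p_b[b] / (p_ab[a, b] * p_bc[b, c]))))

    # Synthetic check: y copies x with a delay of 5 samples, so the TE curve
    # should peak near lag 4 (y[t+1] is driven by x[t - 4]).
    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = np.roll(x, 5) + 0.5 * rng.normal(size=5000)
    lags = list(range(1, 10))
    te_curve = [transfer_entropy(x, y, lag) for lag in lags]
    print("most informative lag:", lags[int(np.argmax(te_curve))])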

    Maximum likelihood bolometric tomography for the determination of the uncertainties in the radiation emission on JET TOKAMAK

    The total emission of radiation is a crucial quantity for calculating power balances and for understanding the physics of any tokamak. Bolometric systems are the main tool for measuring this important physical quantity, through quite sophisticated tomographic inversion methods. On the Joint European Torus, the coverage of the bolometric diagnostic, due to the availability of essentially only two projection angles, is quite limited, rendering the inversion a very ill-posed mathematical problem. A new approach, based on maximum likelihood, has therefore been developed and implemented to alleviate one of the major weaknesses of traditional tomographic techniques: the difficulty of routinely determining confidence intervals for the results. The method has been validated by numerical simulations with phantoms to assess the quality of the results and to optimise the configuration of the parameters for the main types of emissivity encountered experimentally. The typical levels of statistical errors, which may significantly influence the quality of the reconstructions, have been identified. The systematic tests with phantoms indicate that the errors in the reconstructions are quite limited and that their effect on the total radiated power remains well below 10%. A comparison with other approaches to the inversion and to the regularization has also been performed.
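
    A minimal sketch of a maximum-likelihood inversion of the kind mentioned above, assuming Poisson statistics and the classic ML-EM multiplicative update; the geometry matrix, signal model and names are synthetic stand-ins, not the JET implementation.

    # Illustrative ML-EM sketch for an emission-tomography problem: iteratively
    # rescale a positive emissivity map so that its forward projection matches
    # the measured line integrals (maximum likelihood under Poisson noise).
    import numpy as np

    def mlem(geometry, measurements, n_iter=50):
        """geometry: (n_chords, n_pixels) line-integral matrix;
        measurements: (n_chords,) bolometer signals; returns pixel emissivities."""
        emissivity = np.ones(geometry.shape[1])      # flat, positive initial guess
        sensitivity = geometry.sum(axis=0)           # A^T 1
        for _ in range(n_iter):
            predicted = geometry @ emissivity        # forward projection A f
            ratio = measurements / np.maximum(predicted, 1e-12)
            emissivity *= (geometry.T @ ratio) / np.maximum(sensitivity, 1e-12)
        return emissivity

    # Tiny synthetic test: a 1-D "phantom" profile seen through a random geometry.
    rng = np.random.default_rng(1)
    true_profile = np.array([0.0, 1.0, 3.0, 1.0, 0.0])
    A = np.abs(rng.normal(size=(12, 5)))             # stand-in geometry matrix
    g = rng.poisson(A @ true_profile * 50) / 50.0    # noisy line integrals
    print("reconstruction:", np.round(mlem(A, g), 2))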

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate updated guidelines for monitoring autophagy in different organisms on a regular basis. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways, including apoptosis, not all of them can be used as specific markers for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.