
    Recent Publications


    Segmentation of Loops from Coronal EUV Images

    We present a procedure which extracts bright loop features from solar EUV images. In terms of image intensities, these features are elongated ridge-like intensity maxima. To discriminate the maxima, we need information about the spatial derivatives of the image intensity. Commonly, the derivative estimates are strongly affected by image noise. We therefore use a regularized estimation of the derivative, which is then used to interpolate a discrete vector field of ridge points, ``ridgels'', which are positioned on the ridge center and have the intrinsic orientation of the local ridge direction. A scheme is proposed to connect ridgels into smooth, spline-represented curves which fit the observed loops. Finally, a half-automated user interface allows one to merge or split, eliminate or select loop fits obtained from the above procedure. In this paper we apply our tool to one of the first EUV images observed by the SECCHI instrument onboard the recently launched STEREO spacecraft. We compare the extracted loops with projected field lines computed from almost simultaneously taken magnetograms measured by SOHO/MDI. The field lines were calculated using a linear force-free field model. This comparison allows one to verify faint and spurious loop connections produced by our segmentation tool, and it also helps to assess the quality of the magnetic-field model where well-identified loop structures comply with field-line projections. We also discuss further potential applications of our tool, such as loop oscillations and stereoscopy. Comment: 13 pages, 9 figures, Solar Physics, online first.
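As a rough illustration of the ridge-detection idea in this abstract, the sketch below finds ridge points using regularized (Gaussian scale-space) derivative estimates and the eigenstructure of the Hessian. The scale `sigma`, the threshold, and the function name are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ridge_points(image, sigma=2.0, strength_thresh=0.5):
    """Locate bright, ridge-like intensity maxima via a regularized Hessian."""
    # Regularized second derivatives: Gaussian-derivative filters stand in
    # for the paper's regularized derivative estimation.
    Ixx = gaussian_filter(image, sigma, order=(0, 2))
    Iyy = gaussian_filter(image, sigma, order=(2, 0))
    Ixy = gaussian_filter(image, sigma, order=(1, 1))

    # Eigenvalues of the 2x2 Hessian at every pixel (closed form).
    tr = Ixx + Iyy
    det = Ixx * Iyy - Ixy ** 2
    disc = np.sqrt(np.maximum(tr ** 2 / 4 - det, 0.0))
    lam_min = tr / 2 - disc  # strongly negative across a bright ridge

    # Ridge strength: magnitude of the negative cross-ridge curvature.
    strength = -lam_min
    ys, xs = np.nonzero(strength > strength_thresh * strength.max())

    # Orientation of the eigenvector of the *larger* eigenvalue, i.e. the
    # local ridge direction for bright ridges (the "ridgel" orientation).
    theta = 0.5 * np.arctan2(2 * Ixy[ys, xs], Ixx[ys, xs] - Iyy[ys, xs])
    return np.column_stack([xs, ys]), theta
```

A spline could then be fitted through connected ridgels, as the paper's connection scheme does; that step is omitted here.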

    The application of compressive sampling to radio astronomy I: Deconvolution

    Compressive sampling is a new paradigm for sampling, based on the sparseness of signals or signal representations. It is much less restrictive than Nyquist-Shannon sampling theory and thus explains and systematises the widespread experience that methods such as the H\"ogbom CLEAN can violate the Nyquist-Shannon sampling requirements. In this paper, a CS-based deconvolution method for extended sources is introduced. This method can reconstruct both point sources and extended sources (using the isotropic undecimated wavelet transform as a basis function for the reconstruction step). We compare this CS-based deconvolution method with two CLEAN-based deconvolution methods: the H\"ogbom CLEAN and the multiscale CLEAN. The new method shows the best performance in deconvolving extended sources for both uniform and natural weighting of the sampled visibilities. Both visual and numerical results of the comparison are provided. Comment: Published by A&A. Matlab code can be found at http://code.google.com/p/csra/download
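A minimal 1-D sketch of the sparse-recovery machinery behind such CS-based deconvolution is iterative soft thresholding (ISTA). This toy assumes sparsity in the pixel basis rather than the isotropic undecimated wavelet transform the authors use, and a symmetric PSF so the adjoint equals convolution; all names and parameters are illustrative:

```python
import numpy as np

def ista_deconvolve(dirty, psf, lam=0.05, n_iter=200):
    """Sparse deconvolution of a 'dirty' signal by iterative soft thresholding."""
    conv = lambda x: np.convolve(x, psf, mode="same")
    # Step size from an upper bound on the convolution operator norm.
    L = np.sum(np.abs(psf)) ** 2
    x = np.zeros_like(dirty)
    for _ in range(n_iter):
        # Gradient of 0.5*||H x - dirty||^2; for a symmetric PSF the
        # adjoint H^T is again convolution with the same kernel.
        grad = conv(conv(x) - dirty)
        x = x - grad / L
        # Proximal step: soft thresholding enforces sparsity.
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)
    return x
```

The paper's method additionally handles extended emission through the wavelet dictionary; swapping the pixel basis for a wavelet synthesis operator is the corresponding generalization.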

    Compression coil provides increased lead control in extraction procedures

    Aims: We investigated a new lead extraction tool (compression coil; One-Tie, Cook Medical) in an experimental traction-force study.

    Methods and results: Traction-force testing was performed on 13 pacemaker leads (Setrox JS53, Biotronik) under different configurations. The leads were assigned to three groups: (i) traction-force testing without central locking stylet support (n = 5), (ii) testing with a locking stylet (Liberator, Cook Medical) and a proximal ligation suture (n = 4), and (iii) testing with a locking stylet and a compression coil (n = 4). The following parameters were obtained for all groups: stress-strain curves, maximal forces, elastic modulus, post-testing lead length, and lead elongation. In Groups 2 and 3, retraction of the locking stylet within the lead was measured as the lead tip-locking stylet distance (LTLSD). Maximal forces for the three groups were (i) 28.3 ± 0.3 N, (ii) 30.6 ± 3.0 N, and (iii) 31.6 ± 2.9 N (1 vs. 2, P = 0.13; 1 vs. 3, P = 0.04; 2 vs. 3, P = 0.65). Elastic modulus was (i) 22.8 ± 0.1 MPa, (ii) 2830.8 ± 351.1 MPa, and (iii) 2447.0 ± 510.5 MPa (1 vs. 2, P < 0.01; 1 vs. 3, P < 0.01; 2 vs. 3, P = 0.26). Mean LTLSD was 19.8 ± 3.2 cm in Group 2 and 13.8 ± 1.7 cm in Group 3 (P = 0.02). The ratio LTLSD/post-testing lead length was 0.37 ± 0.03 for Group 2 and 0.24 ± 0.03 for Group 3 (P < 0.01).

    Conclusion: The application of a compression coil increases lead control, expressed as less retraction of the locking stylet within the lead. This enables improved central support of extraction sheaths in challenging extraction procedures.
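The elastic modulus reported above is, in general, the slope of the initial linear region of a stress-strain curve. A minimal sketch of that computation (the fraction of the curve treated as linear is an assumption; the study's exact fitting window is not stated in the abstract):

```python
import numpy as np

def elastic_modulus(strain, stress, linear_frac=0.3):
    """Estimate elastic modulus as the slope of the initial linear region
    of a stress-strain curve via a least-squares line fit."""
    n = max(2, int(len(strain) * linear_frac))
    slope, _intercept = np.polyfit(strain[:n], stress[:n], 1)
    return slope  # units of stress / (dimensionless strain), e.g. MPa
```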

    HumanRF: High-Fidelity Neural Radiance Fields for Humans in Motion

    Representing human performance at high fidelity is an essential building block in diverse applications, such as film production, computer games, or videoconferencing. To close the gap to production-level quality, we introduce HumanRF, a 4D dynamic neural scene representation that captures full-body appearance in motion from multi-view video input and enables playback from novel, unseen viewpoints. Our novel representation acts as a dynamic video encoding that captures fine details at high compression rates by factorizing space-time into a temporal matrix-vector decomposition. This allows us to obtain temporally coherent reconstructions of human actors for long sequences, while representing high-resolution details even in the context of challenging motion. While most research focuses on synthesizing at resolutions of 4MP or lower, we address the challenge of operating at 12MP. To this end, we introduce ActorsHQ, a novel multi-view dataset that provides 12MP footage from 160 cameras for 16 sequences with high-fidelity, per-frame mesh reconstructions. We demonstrate challenges that emerge from using such high-resolution data and show that our newly introduced HumanRF effectively leverages this data, making a significant step towards production-level-quality novel view synthesis.
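The core storage argument behind a matrix-vector space-time factorization can be sketched in a few lines. This toy pairs a spatial matrix factor with a per-frame vector factor; the class, shapes, and rank are illustrative assumptions, not HumanRF's actual decomposition:

```python
import numpy as np

class SpaceTimeFactors:
    """Toy matrix-vector space-time factorization: a spatial factor and a
    temporal factor combined multiplicatively per query."""

    def __init__(self, n_voxels, n_frames, rank, seed=0):
        rng = np.random.default_rng(seed)
        self.spatial = rng.standard_normal((n_voxels, rank))   # matrix factor
        self.temporal = rng.standard_normal((n_frames, rank))  # vector factor

    def feature(self, voxel, frame):
        # Space-time feature = elementwise product of the two factors, so
        # storage grows as O(n_voxels*rank + n_frames*rank) instead of the
        # O(n_voxels * n_frames) cost of storing every frame separately.
        return self.spatial[voxel] * self.temporal[frame]
```

The compression rate mentioned in the abstract follows from this additive rather than multiplicative growth in the number of stored parameters.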

    A Comparative Study of Population-Graph Construction Methods and Graph Neural Networks for Brain Age Regression

    The difference between the chronological and biological brain age of a subject can be an important biomarker for neurodegenerative diseases; brain age estimation can therefore be crucial in clinical settings. One way to incorporate multimodal information into this estimation is through population graphs, which combine various types of imaging data and capture the associations among individuals within a population. In medical imaging, population graphs have demonstrated promising results, mostly for classification tasks. In most cases, the graph structure is pre-defined and remains static during training. However, extracting population graphs is a non-trivial task and can significantly impact the performance of Graph Neural Networks (GNNs), which are sensitive to the graph structure. In this work, we highlight the importance of meaningful graph construction and experiment with different population-graph construction methods and their effect on GNN performance in brain age estimation. We use the homophily metric and graph visualizations to gain valuable quantitative and qualitative insights into the extracted graph structures. For the experimental evaluation, we leverage the UK Biobank dataset, which offers many imaging and non-imaging phenotypes. Our results indicate that architectures highly sensitive to the graph structure, such as the Graph Convolutional Network (GCN) and Graph Attention Network (GAT), struggle with low-homophily graphs, while other architectures, such as GraphSAGE and Chebyshev, are more robust across different homophily ratios. We conclude that static graph construction approaches are potentially insufficient for the task of brain age estimation and make recommendations for alternative research directions. Comment: Accepted at GRAIL, MICCAI 202
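The homophily metric mentioned above has a simple edge-level form: the fraction of edges joining nodes with the same label. For a regression target such as brain age, the labels would first be binned; the sketch below assumes discrete labels and is not the paper's exact metric:

```python
import numpy as np

def edge_homophily(edges, labels):
    """Edge homophily ratio of a graph: fraction of edges whose two
    endpoints share the same (discrete) label."""
    edges = np.asarray(edges)
    labels = np.asarray(labels)
    same = labels[edges[:, 0]] == labels[edges[:, 1]]
    return float(same.mean())
```

A ratio near 1 indicates a graph where neighbours tend to share labels (where GCN- and GAT-style message passing is known to work well), while low values match the regime the abstract reports as difficult.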

    Treatment with higher dosages of heart failure medication is associated with improved outcome following cardiac resynchronization therapy

    Background: Cardiac resynchronization therapy (CRT) is associated with improved morbidity and mortality in patients with chronic heart failure (CHF) on optimal medical therapy. The impact of CHF medication optimization following CRT, however, has never been comprehensively evaluated. In the current study, we therefore investigated the effect of CHF medication dosage on morbidity and mortality in CHF patients after CRT implantation.

    Methods and results: Chronic heart failure medication was assessed in 185 patients after CRT implantation. During an overall mean follow-up of 44.6 months, 83 patients experienced a primary endpoint (death, heart transplantation, assist device implantation, or hospitalization for CHF). Treatment with higher dosages of angiotensin-converting enzyme inhibitors (ACE-I) or angiotensin receptor blockers (ARBs) (P = 0.001) and beta-blockers (P < 0.001), as well as with lower dosages of loop diuretics (P < 0.001), was associated with a reduced risk for the primary combined endpoint as well as for all-cause mortality. Echocardiographic super-responders to CRT were treated with higher average dosages of ACE-I/ARBs (68.1 vs. 52.4%, P < 0.01) and beta-blockers (59 vs. 42.2%, P < 0.01). During follow-up, the average dosage of loop diuretics was decreased by 20% in super-responders but increased by 30% in non-super-responders (P < 0.03).

    Conclusion: The use of higher dosages of neurohormonal blockers and lower dosages of diuretics is associated with reduced morbidity and mortality following CRT implantation. Our data imply a beneficial effect of increasing neurohormonal blockade whenever possible following CRT implantation.

    Duchamp: a 3D source finder for spectral-line data

    This paper describes the Duchamp source finder, a piece of software designed to find and describe sources in three-dimensional, spectral-line data cubes. Duchamp has been developed with HI (neutral hydrogen) observations in mind, but is widely applicable to many types of astronomical images. It features efficient source detection and handling methods, noise suppression via smoothing or multi-resolution wavelet reconstruction, and a range of graphical and text-based outputs to allow the user to understand the detections. This paper details some of the key algorithms used and illustrates the effectiveness of the finder on different data sets. Comment: MNRAS, in press. 17 pages, 8 figures.
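The skeleton of any such 3-D source finder is a noise-relative threshold followed by grouping of contiguous voxels. The sketch below is a minimal stand-in, not Duchamp's algorithm: it omits the smoothing/wavelet noise suppression and the richer merging and rejection criteria the paper describes, and the robust noise estimator and `nsigma` default are assumptions:

```python
import numpy as np
from scipy.ndimage import label, find_objects

def find_sources(cube, nsigma=5.0):
    """Minimal 3-D source finder: sigma-threshold a spectral-line cube and
    group contiguous above-threshold voxels into detections."""
    # Robust noise estimate from the median absolute deviation (MAD),
    # scaled to equal the standard deviation for Gaussian noise.
    sigma = 1.4826 * np.median(np.abs(cube - np.median(cube)))
    mask = cube > nsigma * sigma
    # Default 6-connectivity; pass a `structure` array for 26-connectivity.
    labeled, n_detections = label(mask)
    return n_detections, find_objects(labeled)
```

`find_objects` returns one bounding-box slice triple per detection, from which per-source parameters (flux, extent, spectral width) could be measured.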

    Do maternal environmental conditions during reproductive development induce genotypic selection in Picea abies?

    In forest trees, environmental conditions during reproduction can greatly influence progeny performance. This phenomenon probably results from adaptive phenotypic plasticity, but may also be associated with genotypic selection. To determine whether selective effects during reproduction are environment-specific, single pair-crosses of Norway spruce were studied in two contrasting maternal environments (warm and cold conditions). One family expressed large, and the other small, phenotypic differences between these crossing environments. The inheritance of genetic polymorphisms was analysed at the seed stage. Four parental genetic maps covering 66 to 78% of the genome were constructed using 190 to 200 loci. After correcting for multiple testing, there was no evidence of any locus under strong and repeatable selection. The maternal environment could thus induce only limited genotypic-selection effects during the reproductive steps, and the performance of progenies may mainly be affected by a long-lasting epigenetic memory regulated by the temperature and photoperiod prevailing during seed production.
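Testing loci for selection of this kind typically amounts to testing each marker for distortion from its expected Mendelian segregation ratio, with a correction for the number of loci tested. The sketch below uses a chi-square goodness-of-fit test with a Bonferroni correction as one simple stand-in; the function, the 1:1 default ratio, and the correction choice are illustrative assumptions, not the paper's exact analysis:

```python
import numpy as np
from scipy.stats import chisquare

def segregation_test(counts, alpha=0.05, expected_ratio=(0.5, 0.5)):
    """Flag loci whose observed allele counts deviate from the expected
    Mendelian segregation ratio (1:1 for a test-cross marker), using a
    chi-square test with Bonferroni correction across loci."""
    counts = np.asarray(counts, dtype=float)
    n_loci = len(counts)
    pvals = np.array([
        chisquare(c, f_exp=c.sum() * np.asarray(expected_ratio)).pvalue
        for c in counts
    ])
    # True = significant segregation distortion after correction.
    return pvals < alpha / n_loci
```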