
    Localisation of gamma-ray interaction points in thick monolithic CeBr3 and LaBr3:Ce scintillators

    Localisation of gamma-ray interaction points in monolithic scintillator crystals can simplify the design and improve the performance of a future Compton telescope for gamma-ray astronomy. In this paper we compare the position resolution of three monolithic scintillators: a 28x28x20 mm3 (length x breadth x thickness) LaBr3:Ce crystal, a 25x25x20 mm3 CeBr3 crystal and a 25x25x10 mm3 CeBr3 crystal. Each crystal was encapsulated and coupled to a 4x4 array of silicon photomultipliers through an optical window. The measurements were conducted using 81 keV and 356 keV gamma-rays from a collimated 133Ba source. The 3D position reconstruction of interaction points was performed using artificial neural networks trained with experimental data. Although the position resolution was significantly better for the thinner crystal, the 20 mm thick CeBr3 crystal showed an acceptable resolution of about 5.4 mm FWHM for the x and y coordinates, and 7.8 mm FWHM for the z-coordinate (crystal depth) at 356 keV. These values were obtained from the full position scans of the crystal sides. The position resolution of the LaBr3:Ce crystal was found to be considerably worse, presumably due to the highly diffusive optical interface between the crystal and the optical window of the enclosure. The energy resolution (FWHM) measured for 662 keV gamma-rays was 4.0% for LaBr3:Ce and 5.5% for CeBr3. The same crystals equipped with a PMT (Hamamatsu R6322-100) gave an energy resolution of 3.0% and 4.7%, respectively.
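
    As an illustration of the reconstruction step, a minimal sketch follows that maps the 16 amplitudes of a 4x4 SiPM array to an (x, y, z) interaction point with a small feed-forward network. The network size, preprocessing and placeholder training data are assumptions for illustration only; the abstract does not specify the actual architecture or training procedure.

        # Minimal sketch, assuming 16 SiPM channel amplitudes per event and known
        # (x, y, z) scan positions in mm as training targets.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X_train = rng.random((5000, 16))        # placeholder for measured SiPM signals
        y_train = rng.random((5000, 3)) * 25.0  # placeholder for scan positions (mm)

        scaler = StandardScaler().fit(X_train)
        net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
        net.fit(scaler.transform(X_train), y_train)

        # Predicted interaction points for new events; the per-axis FWHM of
        # (prediction - truth) gives the position resolution quoted above.
        xyz = net.predict(scaler.transform(X_train[:10]))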

    Efficient transfer entropy analysis of non-stationary neural time series

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. In particular, the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate the associated probability density functions. To obtain these observations, available estimators assume stationarity of the processes, which allows observations to be pooled over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues showed theoretically that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. In this work we therefore combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that deals with the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally demanding aspects of the ensemble method. We test the performance and robustness of our implementation on data from simulated stochastic processes and demonstrate the method's applicability to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscientific data, we expect it to be applicable in a variety of fields concerned with the analysis of information transfer in complex biological, social, and artificial systems. Comment: 27 pages, 7 figures, submitted to PLOS ONE.
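
    A minimal sketch of the ensemble idea is given below: transfer entropy from X to Y at a single time point is estimated by pooling observations across experimental trials rather than across time, so no stationarity over time is assumed. The binned plug-in estimator and the one-sample embedding are simplifications for illustration; the paper uses a nearest-neighbour (Kraskov-type) estimator with a GPU implementation.

        import numpy as np

        def ensemble_te(x, y, t, bins=4):
            """Transfer entropy X -> Y (in bits) at time index t.
            x, y: arrays of shape (n_trials, n_samples); the ensemble of trials
            supplies the realizations of (y_{t+1}, y_t, x_t)."""
            edges = lambda v: np.histogram_bin_edges(v, bins)
            yt1 = np.digitize(y[:, t + 1], edges(y[:, t + 1]))
            yt = np.digitize(y[:, t], edges(y[:, t]))
            xt = np.digitize(x[:, t], edges(x[:, t]))
            te = 0.0
            for a in np.unique(yt1):
                for b in np.unique(yt):
                    for c in np.unique(xt):
                        p_abc = np.mean((yt1 == a) & (yt == b) & (xt == c))
                        if p_abc == 0.0:
                            continue
                        p_bc = np.mean((yt == b) & (xt == c))
                        p_ab = np.mean((yt1 == a) & (yt == b))
                        p_b = np.mean(yt == b)
                        # p(a,b,c) * log2[ p(a|b,c) / p(a|b) ]
                        te += p_abc * np.log2(p_abc * p_b / (p_bc * p_ab))
            return te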

    Generative Model with Coordinate Metric Learning for Object Recognition Based on 3D Models

    Given a large amount of real photos for training, convolutional neural networks show excellent performance on object recognition tasks. However, collecting such data is tedious and the available backgrounds are limited, which makes it hard to establish a comprehensive database. In this paper, our generative model, trained with synthetic images rendered from 3D models, reduces the workload of data collection and the limitations of capture conditions. Our structure is composed of two sub-networks: a semantic foreground object reconstruction network based on Bayesian inference, and a classification network based on a multi-triplet cost function. The cost function avoids over-fitting on monotone surfaces and fully exploits the pose information of the rendered images by establishing a sphere-like distribution of descriptors within each category, which helps recognition of regular photos across poses, lighting conditions, backgrounds and categories. First, our conjugate structure, a generative model with metric learning, uses additional foreground object channels generated by Bayesian rendering as the joint between the two sub-networks. The pose-based multi-triplet cost function is used for metric learning, which makes it possible to train a category classifier purely on synthetic data. Second, we design a coordinate training strategy in which adaptive noise acts as corruption on the input images, so that both sub-networks benefit from each other and the inharmonious parameter tuning caused by their different convergence speeds is avoided. Our structure achieves state-of-the-art accuracy of over 50% on the ShapeNet database despite the data migration obstacle from synthetic images to real photos. This pipeline makes it possible to perform recognition on real images based only on 3D models. Comment: 14 pages.
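
    As an illustration of the classification branch, a minimal sketch of a pose-aware multi-triplet cost follows: for an anchor descriptor, positives share the category (and a nearby rendering pose) while negatives come from other categories or distant poses, and each positive is pushed closer to the anchor than every negative by a margin. The sampling scheme and margin value are assumptions; the abstract does not give the exact formulation.

        import numpy as np

        def multi_triplet_loss(anchor, positives, negatives, margin=0.2):
            """anchor: (d,); positives, negatives: (k, d) arrays of descriptors."""
            d_pos = np.linalg.norm(positives - anchor, axis=1)  # anchor-positive distances
            d_neg = np.linalg.norm(negatives - anchor, axis=1)  # anchor-negative distances
            # Hinge over all (positive, negative) pairs: each positive should be
            # closer to the anchor than every negative by at least the margin.
            losses = np.maximum(0.0, d_pos[:, None] - d_neg[None, :] + margin)
            return losses.mean()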

    A non-homogeneous dynamic Bayesian network with sequentially coupled interaction parameters for applications in systems and synthetic biology

    An important and challenging problem in systems biology is the inference of gene regulatory networks from short non-stationary time series of transcriptional profiles. A popular approach that has been widely applied to this end is based on dynamic Bayesian networks (DBNs), although traditional homogeneous DBNs fail to model the non-stationarity and time-varying nature of gene regulatory processes. Various authors have therefore recently proposed combining DBNs with multiple changepoint processes to obtain time-varying dynamic Bayesian networks (TV-DBNs). However, TV-DBNs are not without problems. Gene expression time series are typically short, which leaves the model over-flexible, leading to over-fitting or inflated inference uncertainty. In the present paper, we introduce a Bayesian regularization scheme that addresses this difficulty. Our approach is based on the rationale that changes in gene regulatory processes appear gradually during an organism's life cycle or in response to a changing environment, and we have integrated this notion in the prior distribution of the TV-DBN parameters. We have extensively tested our regularized TV-DBN model on synthetic data, in which we have simulated short non-homogeneous time series produced from a system subject to gradual change. We have then applied our method to real-world gene expression time series, measured during the life cycle of Drosophila melanogaster, under artificially generated constant-light conditions in Arabidopsis thaliana, and from a synthetically designed strain of Saccharomyces cerevisiae exposed to a changing environment.
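
    The coupling idea can be sketched as follows: the interaction (regression) parameters of each changepoint segment receive a Gaussian prior centred on the parameters of the preceding segment, so the inferred network changes gradually rather than abruptly between segments. The coupling strength and the zero-mean prior on the first segment are illustrative assumptions, not the paper's exact hyperparameter choices.

        import numpy as np

        def log_coupled_prior(segment_params, lambda_c=10.0, sigma0=1.0):
            """segment_params: list of weight vectors, one per changepoint segment."""
            logp = 0.0
            for h, w in enumerate(segment_params):
                if h == 0:
                    # First segment: ordinary zero-mean Gaussian prior.
                    logp += -0.5 * np.sum(w ** 2) / sigma0 ** 2
                else:
                    # Later segments: shrink towards the previous segment's parameters.
                    diff = w - segment_params[h - 1]
                    logp += -0.5 * lambda_c * np.sum(diff ** 2)
            return logp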

    Reconstructed Rough Growing Interfaces; Ridgeline Trapping of Domain Walls

    We investigate whether surface reconstruction order exists in stationary growing states, at all length scales or only below a crossover length, $l_{\rm rec}$. The latter would be similar to surface roughness in growing crystal surfaces; below the equilibrium roughening temperature they evolve in a layer-by-layer mode within a crossover length scale $l_{\rm R}$, but are always rough at large length scales. We investigate this issue in the context of KPZ-type dynamics and a checkerboard-type reconstruction, using the restricted solid-on-solid model with negative mono-atomic step energies. This is a topology where surface reconstruction order is compatible with surface roughness and where a so-called reconstructed rough phase exists in equilibrium. We find that during growth, reconstruction order is absent in the thermodynamic limit, but exists below a crossover length $l_{\rm rec} > l_{\rm R}$, and that this local order fluctuates critically. Domain walls become trapped at the ridge lines of the rough surface, and thus the reconstruction order fluctuations are slaved to the KPZ dynamics.
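
    A toy version of the growth rule is sketched below: random deposition on a one-dimensional restricted solid-on-solid surface, where trial depositions that violate the |dh| <= 1 restriction are rejected and the rest are accepted with a Metropolis rule on the change in step energy (a negative step-energy coupling K favours steps). The one-dimensional geometry and parameter values are simplifications; the paper studies a two-dimensional surface with checkerboard reconstruction order.

        import numpy as np

        rng = np.random.default_rng(1)
        L, sweeps, K, T = 128, 100_000, -0.5, 1.0  # K < 0: negative mono-atomic step energy
        h = np.zeros(L, dtype=int)

        def step_energy(h, i):
            return K * (abs(h[i] - h[(i - 1) % L]) + abs(h[i] - h[(i + 1) % L]))

        for _ in range(sweeps):
            i = rng.integers(L)
            old = step_energy(h, i)
            h[i] += 1                                  # trial deposition
            if abs(h[i] - h[(i - 1) % L]) > 1 or abs(h[i] - h[(i + 1) % L]) > 1:
                h[i] -= 1                              # violates the RSOS restriction
                continue
            dE = step_energy(h, i) - old
            if dE > 0 and rng.random() >= np.exp(-dE / T):
                h[i] -= 1                              # Metropolis rejection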

    Warming of the Antarctic ice-sheet surface since the 1957 International Geophysical Year

    Assessments of Antarctic temperature change have emphasized the contrast between strong warming of the Antarctic Peninsula and slight cooling of the Antarctic continental interior in recent decades. This pattern of temperature change has been attributed to the increased strength of the circumpolar westerlies, largely in response to changes in stratospheric ozone. This picture, however, is substantially incomplete owing to the sparseness and short duration of the observations. Here we show that significant warming extends well beyond the Antarctic Peninsula to cover most of West Antarctica, an area of warming much larger than previously reported. West Antarctic warming exceeds 0.1 °C per decade over the past 50 years, and is strongest in winter and spring. Although this is partly offset by autumn cooling in East Antarctica, the continent-wide average near-surface temperature trend is positive. Simulations using a general circulation model reproduce the essential features of the spatial pattern and the long-term trend, and we suggest that neither can be attributed directly to increases in the strength of the westerlies. Instead, regional changes in atmospheric circulation and associated changes in sea surface temperature and sea ice are required to explain the enhanced warming in West Antarctica.
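
    The per-decade figures are of the kind obtained from a least-squares fit of temperature against time; a minimal sketch of such a fit on a placeholder series (not the paper's reconstructed Antarctic data) is shown below.

        import numpy as np

        years = np.arange(1957, 2007)
        # Placeholder annual means with a built-in 0.1 degC/decade trend plus noise.
        temps = -30.0 + 0.01 * (years - 1957) + np.random.default_rng(2).normal(0.0, 0.5, years.size)

        slope, intercept = np.polyfit(years, temps, 1)  # degrees C per year
        print(f"trend: {10 * slope:+.2f} degC per decade")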