    Statistically Stable Estimates of Variance in Radioastronomical Observations as Tools for RFI Mitigation

    A selection of statistically stable (robust) algorithms for calculating data variance has been made, and their properties have been analyzed via computer simulation. These algorithms would be useful in radio astronomy observations in the presence of strong sporadic radio frequency interference (RFI). Several observational results are presented to demonstrate their effectiveness in RFI mitigation.
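
    The abstract does not name the selected estimators; as a hypothetical illustration in Python, the sketch below uses one standard robust scale estimator, the median absolute deviation (MAD), whose value strong sporadic outliers such as RFI bursts barely perturb.

        import numpy as np

        def mad_variance(x, k=1.4826):
            """Robust variance estimate from the median absolute deviation.

            The factor k = 1.4826 makes the MAD consistent for the standard
            deviation of Gaussian data, so a handful of strong outliers
            (e.g. RFI bursts) barely moves the estimate.
            """
            x = np.asarray(x, dtype=float)
            med = np.median(x)
            return (k * np.median(np.abs(x - med))) ** 2

        # Example: Gaussian noise contaminated by sporadic strong spikes.
        rng = np.random.default_rng(0)
        x = rng.normal(0.0, 1.0, 10_000)
        x[::500] += 50.0              # RFI-like outliers
        print(np.var(x))              # badly inflated by the spikes
        print(mad_variance(x))        # stays close to the true value 1.0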

    Reconstructing the primordial power spectrum from the CMB

    We propose a straightforward and model-independent methodology for characterizing the sensitivity of CMB and other experiments to wiggles, irregularities, and features in the primordial power spectrum. Assuming that the primordial cosmological perturbations are adiabatic, we present a function-space generalization of the usual Fisher matrix formalism, applied to a CMB experiment resembling Planck with and without ancillary data. This work is closely related to other work on recovering the inflationary potential and exploring specific models of non-minimal, or perhaps baroque, primordial power spectra. The approach adopted here, however, most directly expresses what the data are really telling us. We explore in detail the structure of the available information and quantify exactly what features can be reconstructed and at what statistical significance.
    Comment: 43 pages RevTeX, 23 figures
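
    As a rough, hypothetical illustration of a Fisher forecast for binned power-spectrum amplitudes (a crude finite-dimensional stand-in for the paper's function-space generalization), the Python sketch below assumes a toy forward model, noise curve, and binning; none of these come from the paper.

        import numpy as np

        def fisher_matrix(deriv, sigma):
            """F_ij = sum_l (dC_l/dp_i)(dC_l/dp_j) / sigma_l^2 for bins p_i."""
            w = deriv / sigma**2        # weight each multipole by its variance
            return w @ deriv.T

        # Toy forward model: bin i contributes a Gaussian bump in C_l (assumed).
        ells = np.arange(2, 2000)
        centers = np.linspace(10.0, 1900.0, 20)
        deriv = np.exp(-0.5 * ((ells - centers[:, None]) / 80.0) ** 2)
        sigma = 1e-2 * (1.0 + ells / 500.0)  # assumed per-multipole uncertainty

        F = fisher_matrix(deriv, sigma)
        errors = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalized 1-sigma errors
        print(errors)  # which power-spectrum bins the toy experiment constrains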

    An intelligent assistant for exploratory data analysis

    In this paper we present an account of the main features of SNOUT, an intelligent assistant for exploratory data analysis (EDA) of social science survey data that incorporates a range of data mining techniques. EDA has much in common with existing data mining techniques: its main objective is to help an investigator reach an understanding of the important relationships in a data set, rather than simply to develop predictive models for selected variables. Brief descriptions of a number of novel techniques developed for use in SNOUT are presented. These include heuristic variable level inference and classification, automatic category formation, the use of similarity trees to identify groups of related variables, interactive decision tree construction, and model selection using a genetic algorithm.
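
    The abstract does not detail how the similarity trees are constructed; one common construction, sketched here on made-up survey data, clusters variables hierarchically using one minus the absolute correlation as the distance between variables.

        import numpy as np
        from scipy.cluster.hierarchy import linkage
        from scipy.spatial.distance import squareform

        # Hypothetical survey responses: rows = respondents, columns = variables.
        rng = np.random.default_rng(1)
        data = rng.integers(1, 6, size=(500, 8)).astype(float)
        data[:, 1] = data[:, 0] + rng.normal(0.0, 0.5, 500)  # two related variables

        # Distance between variables: 1 - |correlation|, so strongly related
        # variables (positively or negatively) sit close together in the tree.
        corr = np.corrcoef(data, rowvar=False)
        dist = 1.0 - np.abs(corr)
        np.fill_diagonal(dist, 0.0)

        # Agglomerative clustering turns the distances into a similarity tree.
        tree = linkage(squareform(dist, checks=False), method="average")
        print(tree)  # each row merges two clusters at the stated distance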

    Further Investigation of the Time Delay, Magnification Ratios, and Variability in the Gravitational Lens 0218+357

    High-precision VLA flux density measurements for the lensed images of 0218+357 yield a time delay of 10.1(+1.5-1.6) days (95% confidence). This is consistent with independent measurements carried out at the same epoch (Biggs et al. 1999), lending confidence in the robustness of the time delay measurement. However, since both measurements make use of the same features in the light curves, it is possible that the effects of unmodelled processes, such as scintillation or microlensing, are biasing both time delay measurements in the same way. Our time delay estimates result in confidence intervals that are somewhat larger than those of Biggs et al., probably because we adopt a more general model of the source variability, allowing for constant and variable components. When considered in relation to the lens mass model of Biggs et al., our best-fit time delay implies a Hubble constant of H_o = 71(+17-23) km/s/Mpc for Omega_o = 1 and lambda_o = 0 (95% confidence; filled beam). This confidence interval for H_o does not reflect systematic error, which may be substantial, due to uncertainty in the position of the lens galaxy. We also measure the flux ratio of the variable components of 0218+357, a measurement of a small region that should more closely represent the true lens magnification ratio. We find ratios of 3.2(+0.3-0.4) (95% confidence; 8 GHz) and 4.3(+0.5-0.8) (15 GHz). Unlike the reported flux ratios on scales of 0.1", these ratios are not significantly different. We investigate the significance of apparent differences in the variability properties of the two images of the background active galactic nucleus. We conclude that the differences are not significant, and that time series much longer than our 100-day series will be required to investigate propagation effects in this way.
    Comment: 33 pages, 9 figures. Accepted for publication in ApJ. Light curve data may be found at http://space.mit.edu/RADIO/papers.htm
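
    The paper's estimator models constant and variable source components; shown below, purely for illustration, is a far simpler textbook alternative that estimates the delay by maximizing the discrete cross-correlation between the two image light curves (evenly sampled toy data assumed).

        import numpy as np

        def ccf_delay(t, a, b, lags):
            """Delay of light curve b relative to a, by maximizing correlation.

            Assumes even sampling; real, irregularly sampled data need
            interpolation or a dispersion-style statistic instead.
            """
            dt = t[1] - t[0]
            corrs = []
            for lag in lags:
                shift = int(round(lag / dt))
                if shift >= 0:
                    x, y = a[:len(a) - shift], b[shift:]
                else:
                    x, y = a[-shift:], b[:len(b) + shift]
                corrs.append(np.corrcoef(x, y)[0, 1])
            return lags[int(np.argmax(corrs))]

        # Toy example: image B trails image A by 10 days.
        rng = np.random.default_rng(2)
        t = np.arange(0.0, 100.0, 1.0)
        s = np.sin(t / 7.0) + 0.3 * np.sin(t / 2.3)
        a = s + 0.05 * rng.normal(size=t.size)
        b = np.roll(s, 10) + 0.05 * rng.normal(size=t.size)
        print(ccf_delay(t, a, b, np.arange(-20.0, 21.0)))  # prints ~10.0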

    Generalized methods and solvers for noise removal from piecewise constant signals. I. Background theory

    Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional, minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
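
    As a concrete instance of one solver named above, here is a minimal sketch of iterated running medians: repeatedly median-filtering a signal converges to a piecewise constant fixed point (a "root signal"). This illustrates the idea only; it is not the paper's generalized-functional implementation.

        import numpy as np
        from scipy.ndimage import median_filter

        def iterated_running_median(x, width=7, max_iter=100):
            """Iterate a running median of the given width to a fixed point.

            Fixed points of the median filter are locally piecewise constant,
            so the iteration returns a PWC estimate of the noisy input.
            """
            y = np.asarray(x, dtype=float)
            for _ in range(max_iter):
                z = median_filter(y, size=width, mode="nearest")
                if np.array_equal(z, y):
                    break
                y = z
            return y

        # Toy PWC signal: three levels plus Gaussian noise.
        rng = np.random.default_rng(0)
        truth = np.repeat([0.0, 2.0, -1.0], 100)
        denoised = iterated_running_median(truth + 0.3 * rng.normal(size=300))
        print(np.round(denoised[::50], 2))  # hovers near 0, 2 and -1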

    Inference with interference between units in an fMRI experiment of motor inhibition

    An experimental unit is an opportunity to randomly apply or withhold a treatment. There is interference between units if the application of the treatment to one unit may also affect other units. In cognitive neuroscience, a common form of experiment presents a sequence of stimuli or requests for cognitive activity at random to each experimental subject and measures biological aspects of brain activity that follow these requests. Each subject is then many experimental units, and interference between units within an experimental subject is likely, in part because the stimuli follow one another quickly and in part because human subjects learn or become experienced or primed or bored as the experiment proceeds. We use a recent fMRI experiment concerned with the inhibition of motor activity to illustrate and further develop recently proposed methodology for inference in the presence of interference. A simulation evaluates the power of competing procedures.
    Comment: Published by the Journal of the American Statistical Association at http://www.tandfonline.com/doi/full/10.1080/01621459.2012.655954 . The R package cin (Causal Inference for Neuroscience) implementing the proposed method is freely available on CRAN at https://CRAN.R-project.org/package=cin
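
    A minimal sketch of plain randomization inference for a single subject's trial sequence follows; the paper's contribution is precisely to extend such tests to handle interference between trials, which this toy version ignores.

        import numpy as np

        def permutation_test(y, z, n_perm=10_000, seed=0):
            """Randomization test of no treatment effect.

            y: responses for one subject's trials; z: 0/1 treatment indicator.
            Re-randomizes z and compares the observed mean difference with
            its permutation distribution; returns a two-sided p-value.
            """
            rng = np.random.default_rng(seed)
            y, z = np.asarray(y, dtype=float), np.asarray(z)
            observed = y[z == 1].mean() - y[z == 0].mean()
            null = np.empty(n_perm)
            for i in range(n_perm):
                zp = rng.permutation(z)
                null[i] = y[zp == 1].mean() - y[zp == 0].mean()
            return (np.abs(null) >= abs(observed)).mean()

        # Toy data: 40 trials, treatment raises the response by 0.8.
        rng = np.random.default_rng(1)
        z = rng.permutation(np.repeat([0, 1], 20))
        y = 0.8 * z + rng.normal(size=40)
        print(permutation_test(y, z))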

    Emotional Sentence Annotation Helps Predict Fiction Genre

    Fiction, a prime form of entertainment, has evolved into multiple genres, which one can broadly attribute to different forms of stories. In this paper, we examine the hypothesis that works of fiction can be characterised by the emotions they portray. To investigate this hypothesis, we use works of fiction from Project Gutenberg and we attribute basic emotional content to each individual sentence using Ekman’s model. A time-smoothed version of the emotional content for each basic emotion is used to train extremely randomized trees. We show through 10-fold cross-validation that the emotional content of each work of fiction can help identify its genre with significantly higher probability than chance. We also show that the most important differentiator between genre novels is fear.
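
    The sketch below shows the shape of such a pipeline with scikit-learn's extremely randomized trees; the per-sentence emotion scores are random stand-ins (reproducing lexicon-based Ekman annotation is beyond a short example), and the labels are constructed from the hypothetical fear channel to echo the paper's finding.

        import numpy as np
        from sklearn.ensemble import ExtraTreesClassifier
        from sklearn.model_selection import cross_val_score

        def smooth(scores, window=50):
            """Moving average over sentences: the time-smoothing step."""
            return np.convolve(scores, np.ones(window) / window, mode="same")

        rng = np.random.default_rng(0)
        n_books, n_sentences = 200, 1000
        emotions = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

        # One summary feature per Ekman emotion: mean of the smoothed track.
        # Random stand-ins; real scores would come from annotated sentences.
        X = np.stack([
            [smooth(rng.random(n_sentences)).mean() for _ in emotions]
            for _ in range(n_books)
        ])

        # Hypothetical binary genre labels driven by the fear channel.
        fear = X[:, emotions.index("fear")]
        y = (fear > np.median(fear)).astype(int)

        clf = ExtraTreesClassifier(n_estimators=300, random_state=0)
        print(cross_val_score(clf, X, y, cv=10).mean())  # 10-fold CV accuracy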

    A geometric approach to visualization of variability in functional data

    We propose a new method for the construction and visualization of boxplot-type displays for functional data. We use a recent functional data analysis framework, based on a representation of functions called square-root slope functions, to decompose observed variation in functional data into three main components: amplitude, phase, and vertical translation. We then construct separate displays for each component, using the geometry and metric of each representation space, based on a novel definition of the median, the two quartiles, and extreme observations. Because the outlyingness of functional data is a very complex concept, we propose to identify outliers based on any of the three main components after decomposition. We provide a variety of visualization tools for the proposed boxplot-type displays, including surface plots. We evaluate the proposed method using extensive simulations and then focus our attention on three real data applications, including exploratory data analysis of sea surface temperature functions, electrocardiogram functions, and growth curves.
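
    A minimal sketch of the square-root slope transform that underlies the framework, with the derivative taken by finite differences; the warping, median, and boxplot construction steps are omitted.

        import numpy as np

        def srsf(f, t):
            """Square-root slope function q(t) = f'(t) / sqrt(|f'(t)|).

            Under this representation, time warping acts by isometry, which
            is what makes separating amplitude from phase variation
            tractable. A small epsilon guards the division at flat regions.
            """
            df = np.gradient(f, t)
            return df / np.sqrt(np.abs(df) + 1e-12)

        # Example: two differently warped copies of one function differ in
        # phase only; their SRSFs differ until the phase is aligned.
        t = np.linspace(0.0, 1.0, 200)
        f1 = np.sin(2 * np.pi * t)
        f2 = np.sin(2 * np.pi * t**1.3)    # warped copy
        q1, q2 = srsf(f1, t), srsf(f2, t)
        print(np.linalg.norm(q1 - q2))     # nonzero before alignment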

    Electromagnetic vertex function of the pion at T > 0

    The matrix element of the electromagnetic current between pion states is calculated in quenched lattice QCD at a temperature of T = 0.93 T_c. The nonperturbatively improved Sheikholeslami-Wohlert action is used together with the corresponding O(a) improved vector current. The electromagnetic vertex function is extracted for pion masses down to 360 MeV and momentum transfers Q^2 ≤ 2.7 GeV^2.
    Comment: 17 pages, 8 figures