
    A scheduling theory framework for GPU tasks efficient execution

    Concurrent execution of tasks in GPUs can reduce the computation time of a workload by overlapping data transfer and execution commands. However, it is difficult to implement an efficient run-time scheduler that minimizes the workload makespan, as many execution orderings must be evaluated. In this paper, we employ scheduling theory to build a model that takes into account the device capabilities, workload characteristics, constraints and objective functions. In our model, GPU task scheduling is reformulated as a flow shop scheduling problem, which allows us to apply and compare well-known methods already developed in the operations research field. In addition, we develop a new heuristic, specifically focused on executing GPU commands, that achieves better scheduling results than previous techniques. Finally, a comprehensive evaluation, showing the suitability and robustness of this new approach, is conducted on three different NVIDIA architectures (Kepler, Maxwell and Pascal). Project TIN2016-0920R, Universidad de Málaga (Campus de Excelencia Internacional Andalucía Tech) and the NVIDIA Corporation donation programme.
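
    The flow-shop view can be made concrete with a small sketch. The Python snippet below is illustrative only, not the paper's heuristic, and the task durations are invented: each GPU task is a job that must pass through three stages in order (host-to-device copy engine, compute engine, device-to-host copy engine), and the makespan of a given ordering follows the standard permutation flow-shop recurrence.

    from itertools import permutations

    # (H2D, kernel, D2H) durations in ms for each task -- invented example values
    tasks = {"A": (2, 5, 1), "B": (4, 2, 3), "C": (1, 6, 2), "D": (3, 3, 3)}

    def makespan(order):
        """Completion time of the last D2H transfer when tasks run in 'order'."""
        finish = [0.0, 0.0, 0.0]              # finish time of the previous job on each stage
        for name in order:
            for stage in range(3):
                start = max(finish[stage], finish[stage - 1] if stage > 0 else 0.0)
                finish[stage] = start + tasks[name][stage]
        return finish[-1]

    # Exhaustive search is only feasible for tiny workloads; for realistic ones the
    # ordering has to come from a scheduling heuristic, which is the paper's point.
    best = min(permutations(tasks), key=makespan)
    print("best order:", best, "makespan:", makespan(best))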

    Graphics for uncertainty

    Graphical methods such as colour shading and animation, which are widely available, can be very effective in communicating uncertainty. In particular, the idea of a ‘density strip’ provides a conceptually simple representation of a distribution, and this is explored in a variety of settings, including a comparison of means, regression and models for contingency tables. Animation is also a very useful device for conveying uncertainty, particularly in the context of flexible models expressed through curves and surfaces whose structure is of particular interest. Animation can further provide a helpful mechanism for exploring data in several dimensions; this is illustrated in the simple but very important setting of spatiotemporal data.
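
    As a rough illustration of the density strip idea (synthetic data, not taken from the paper), the sketch below shades a one-dimensional strip in proportion to a kernel density estimate of bootstrap-style replicates of a mean difference, so darker regions correspond to more plausible values.

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    diff = rng.normal(loc=1.2, scale=0.5, size=2000)   # e.g. bootstrap replicates of a mean difference

    grid = np.linspace(diff.min(), diff.max(), 400)
    density = gaussian_kde(diff)(grid)

    fig, ax = plt.subplots(figsize=(6, 1.2))
    # Render the density as a horizontal strip: darker shading = higher density.
    ax.imshow(density[np.newaxis, :], cmap="Greys", aspect="auto",
              extent=(grid[0], grid[-1], 0, 1))
    ax.set_yticks([])
    ax.set_xlabel("difference in means")
    plt.tight_layout()
    plt.show()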

    An intelligent assistant for exploratory data analysis

    In this paper we present an account of the main features of SNOUT, an intelligent assistant for exploratory data analysis (EDA) of social science survey data that incorporates a range of data mining techniques. EDA has much in common with existing data mining techniques: its main objective is to help an investigator reach an understanding of the important relationships in a data set rather than simply develop predictive models for selected variables. Brief descriptions of a number of novel techniques developed for use in SNOUT are presented. These include heuristic variable level inference and classification, automatic category formation, the use of similarity trees to identify groups of related variables, interactive decision tree construction and model selection using a genetic algorithm.
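
    The similarity-tree idea can be illustrated with standard tools. This is a generic sketch, not SNOUT's own algorithm, and the variables and data are made up: cluster survey variables hierarchically on a correlation-based distance so that strongly related variables join the tree early.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, dendrogram
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(1)
    # Fake survey responses: 200 respondents x 6 variables in two correlated blocks.
    base1, base2 = rng.normal(size=(2, 200))
    data = np.column_stack([base1 + rng.normal(scale=0.5, size=200) for _ in range(3)]
                           + [base2 + rng.normal(scale=0.5, size=200) for _ in range(3)])
    names = ["v1", "v2", "v3", "v4", "v5", "v6"]

    corr = np.corrcoef(data, rowvar=False)
    dist = 1.0 - np.abs(corr)                 # similar variables -> small distance
    np.fill_diagonal(dist, 0.0)
    tree = linkage(squareform(dist, checks=False), method="average")
    print(dendrogram(tree, labels=names, no_plot=True)["ivl"])   # leaf order of the tree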

    Further Investigation of the Time Delay, Magnification Ratios, and Variability in the Gravitational Lens 0218+357

    High precision VLA flux density measurements for the lensed images of 0218+357 yield a time delay of 10.1(+1.5-1.6) days (95% confidence). This is consistent with independent measurements carried out at the same epoch (Biggs et al. 1999), lending confidence to the robustness of the time delay measurement. However, since both measurements make use of the same features in the light curves, it is possible that the effects of unmodelled processes, such as scintillation or microlensing, are biasing both time delay measurements in the same way. Our time delay estimates result in confidence intervals that are somewhat larger than those of Biggs et al., probably because we adopt a more general model of the source variability, allowing for constant and variable components. When considered in relation to the lens mass model of Biggs et al., our best-fit time delay implies a Hubble constant of H_o = 71(+17-23) km/s/Mpc for Omega_o=1 and lambda_o=0 (95% confidence; filled beam). This confidence interval for H_o does not reflect systematic error, which may be substantial, due to uncertainty in the position of the lens galaxy. We also measure the flux ratio of the variable components of 0218+357, a measurement of a small region that should more closely represent the true lens magnification ratio. We find ratios of 3.2(+0.3-0.4) (95% confidence; 8 GHz) and 4.3(+0.5-0.8) (15 GHz). Unlike the reported flux ratios on scales of 0.1", these ratios are not significantly different. We investigate the significance of apparent differences in the variability properties of the two images of the background active galactic nucleus. We conclude that the differences are not significant, and that time series much longer than our 100-day time series will be required to investigate propagation effects in this way. (Comment: 33 pages, 9 figures. Accepted for publication in ApJ. Light curve data may be found at http://space.mit.edu/RADIO/papers.htm)
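
    For readers unfamiliar with time-delay measurements, the toy sketch below uses synthetic light curves (the paper's actual analysis relies on a more general variability model with constant and variable components) and recovers a delay between two lensed-image light curves by shifting one curve over a grid of trial delays and maximising the correlation of the overlapping samples.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(0.0, 100.0, 1.0)                       # ~100-day monitoring campaign
    intrinsic = np.sin(t / 7.0) + 0.3 * np.sin(t / 2.3)  # toy intrinsic variability
    true_delay, ratio = 10.1, 3.2                        # values echoing the abstract
    flux_a = intrinsic + rng.normal(scale=0.05, size=t.size)
    flux_b = np.interp(t - true_delay, t, intrinsic) / ratio + rng.normal(scale=0.05, size=t.size)

    def correlation_at(delay):
        shifted = np.interp(t - delay, t, flux_a)        # image A shifted by the trial delay
        return np.corrcoef(shifted, flux_b)[0, 1]

    trials = np.arange(0.0, 20.0, 0.1)
    best = trials[np.argmax([correlation_at(d) for d in trials])]
    print(f"recovered delay: {best:.1f} days (true value {true_delay})")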

    The effect of stimulus variability on children's judgements of quantity

    This study investigates the effect of stimulus variability on development of the ability to make quantity judgements related to area. Participants were 241 children (aged 4, 5, 6, 8, and 12 years) and 82 university students, who were asked to compare the quantities in 2 sets of 5 chocolate bars of constant width but variable length. Participants indicated which set contained more chocolate or that the amounts of chocolate were equal. Judgement accuracy of 12-year-olds and adults decreased monotonically as the variance of bar lengths increased. In younger children, performance was low when variance was very low or very high, but accuracy was higher for intermediate levels of variance, thus resulting in an inverted U-shaped effect. This pattern was confirmed in a second experiment in which we controlled for a possible age-related response bias against “equal” judgements. Findings suggest that judgements of quantity are based on a mixture of learned heuristics and comparisons of approximate quantity representations, both of which develop throughout childhood.

    Inference with interference between units in an fMRI experiment of motor inhibition

    An experimental unit is an opportunity to randomly apply or withhold a treatment. There is interference between units if the application of the treatment to one unit may also affect other units. In cognitive neuroscience, a common form of experiment presents a sequence of stimuli or requests for cognitive activity at random to each experimental subject and measures biological aspects of brain activity that follow these requests. Each subject is then many experimental units, and interference between units within an experimental subject is likely, in part because the stimuli follow one another quickly and in part because human subjects learn or become experienced or primed or bored as the experiment proceeds. We use a recent fMRI experiment concerned with the inhibition of motor activity to illustrate and further develop recently proposed methodology for inference in the presence of interference. A simulation evaluates the power of competing procedures. (Comment: Published by Journal of the American Statistical Association at http://www.tandfonline.com/doi/full/10.1080/01621459.2012.655954 . R package cin (Causal Inference for Neuroscience) implementing the proposed method is freely available on CRAN at https://CRAN.R-project.org/package=ci)

    Currency Unions and Trade: A PPML Re-Assessment with High-Dimensional Fixed Effects

    Recent work on the effects of currency unions (CUs) on trade stresses the importance of using many countries and years in order to obtain reliable estimates. However, for large samples, computational issues associated with the three-way (exporter-time, importer-time, and country-pair) fixed effects currently recommended in the gravity literature have heretofore limited the choice of estimator, leaving an important methodological gap. To address this gap, we introduce an iterative Poisson Pseudo-Maximum Likelihood (PPML) estimation procedure that facilitates the inclusion of these fixed effects for large data sets and also allows for correlated errors across countries and time. When applied to a comprehensive sample with more than 200 countries trading over 65 years, these innovations flip the conclusions of an otherwise rigorously specified linear model. Most importantly, our estimates for both the overall CU effect and the Euro effect specifically are economically small and statistically insignificant. We also document that linear and PPML estimates of the Euro effect increasingly diverge as the sample size grows.
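
    To make the specification concrete, here is a minimal sketch of a PPML gravity regression on a toy panel, using synthetic data, statsmodels' standard Poisson GLM, and only pair and year dummies for readability; the paper's contribution is an iterative procedure that scales the full exporter-time, importer-time and country-pair fixed-effects version, with clustered errors, to samples of 200+ countries over 65 years.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    countries, years = ["A", "B", "C", "D"], [2000, 2001, 2002]
    rows = [(e, i, y) for e in countries for i in countries for y in years if e != i]
    df = pd.DataFrame(rows, columns=["exp", "imp", "year"])
    df["pair"] = df["exp"] + "-" + df["imp"]
    df["cu"] = rng.integers(0, 2, len(df))                    # toy currency-union dummy
    df["trade"] = rng.poisson(np.exp(2.0 + 0.1 * df["cu"]))   # synthetic trade flows

    # Poisson pseudo-maximum likelihood: the conditional mean is exp(x'b), and the
    # estimator is consistent even when trade flows are not truly Poisson.
    fit = smf.glm("trade ~ cu + C(pair) + C(year)", data=df,
                  family=sm.families.Poisson()).fit()
    print(fit.params["cu"])                                   # estimated currency-union effect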

    Radio-optically selected clusters of galaxies. II. The cluster sample

    We present a sample of 171 candidate groups and clusters of galaxies at intermediate redshift over an area of ~550 square degrees at the South Galactic Pole, selected by using optically identified radio sources from the NRAO VLA Sky Survey as tracers of dense environments. Of these 171 candidates, 76 have a counterpart in the literature, while 95 are previously unknown clusters. This paper presents the cluster selection technique, based on searching for excesses in the optical surface density of galaxies near identified radio galaxies, and the first spectroscopic results aimed at confirming the presence of a cluster. Spectroscopy for 11 candidates led to the detection of 9 clusters at redshifts in the range 0.13-0.3, with estimated velocity dispersions ranging from values typical of clusters to those of galaxy groups. These results show that this technique represents a powerful tool for the selection of homogeneous samples of intermediate-redshift clusters over a wide range of richness. (Comment: 13 pages, 3 Postscript and 2 GIF figures. Accepted for publication in A&)
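
    The selection principle can be sketched in a few lines. This is schematic only, with invented coordinates and a simplified flat-sky geometry: count optical galaxies within a fixed radius of each identified radio galaxy and flag positions where the count significantly exceeds what the mean background surface density predicts.

    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(4)
    # Synthetic 1x1 degree field: uniform background plus one injected galaxy clump.
    background = rng.uniform(0.0, 1.0, size=(2000, 2))
    clump = 0.65 + 0.01 * rng.normal(size=(60, 2))
    galaxies = np.vstack([background, clump])

    radio_sources = np.array([[0.20, 0.30], [0.65, 0.65], [0.80, 0.10]])
    radius = 0.02                                      # search radius in degrees
    expected = len(background) * np.pi * radius ** 2   # mean background count in the aperture

    for pos in radio_sources:
        count = np.sum(np.linalg.norm(galaxies - pos, axis=1) < radius)
        p_value = poisson.sf(count - 1, expected)      # P(N >= count) for pure background
        verdict = "cluster candidate" if p_value < 1e-3 else "no excess"
        print(pos, int(count), f"p={p_value:.2g}", verdict)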

    Methods and Algorithms for Robust Filtering

    We discuss filtering procedures for robust extraction of a signal from noisy time series. Moving averages and running medians are standard methods for this, but they have shortcomings when large spikes (outliers) or trends, respectively, occur. Modified trimmed means and linear median hybrid filters combine advantages of both approaches, but they do not completely overcome the difficulties. Improvements can be achieved by using robust regression methods, which work even in real time because of increased computational power and faster algorithms. Extending recent work, we present filters for robust online signal extraction and discuss their merits for preserving trends, abrupt shifts and extremes, and for the removal of spikes.
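
    A toy comparison in the spirit of the paper (window length and data are illustrative; the regression-based filters discussed in the paper additionally preserve trends): a causal moving average smears both an isolated spike and a level shift, while a running median over the same window resists the spike and tracks the shift.

    import numpy as np

    rng = np.random.default_rng(5)
    n, width = 300, 21
    signal = np.where(np.arange(n) < 150, 0.0, 5.0)    # abrupt level shift at t = 150
    noisy = signal + rng.normal(scale=0.5, size=n)
    noisy[70] += 12.0                                  # isolated spike (outlier)

    def causal_filter(x, width, stat):
        """Online-style filter: statistic of the most recent 'width' observations."""
        out = np.empty_like(x)
        for t in range(len(x)):
            out[t] = stat(x[max(0, t - width + 1):t + 1])
        return out

    mean_out = causal_filter(noisy, width, np.mean)
    median_out = causal_filter(noisy, width, np.median)
    print("error just after the spike:  mean filter %.2f, median filter %.2f"
          % (mean_out[75] - signal[75], median_out[75] - signal[75]))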