
    Crowdsourcing Swarm Manipulation Experiments: A Massive Online User Study with Large Swarms of Simple Robots

    Micro- and nanorobotics have the potential to revolutionize many applications including targeted material delivery, assembly, and surgery. The same properties that promise breakthrough solutions---small size and large populations---present unique challenges to generating controlled motion. We want to use large swarms of robots to perform manipulation tasks; unfortunately, human-swarm interaction studies as conducted today are limited in sample size, are difficult to reproduce, and are prone to hardware failures. We present an alternative. This paper examines the perils, pitfalls, and possibilities we discovered by launching SwarmControl.net, an online game where players steer swarms of up to 500 robots to complete manipulation challenges. We record statistics from thousands of players, and use the game to explore aspects of large-population robot control. We present the game framework as a new, open-source tool for large-scale user experiments. Our results have potential applications in human control of micro- and nanorobots, supply insight for automatic controllers, and provide a template for large online robotic research experiments. Comment: 8 pages, 13 figures, to appear at the 2014 IEEE International Conference on Robotics and Automation (ICRA 2014).

    The influence of dust grain porosity on the analysis of debris disc observations

    Debris discs are often modelled assuming compact dust grains, but more and more evidence for the presence of porous grains is found. We aim at quantifying the systematic errors introduced when modelling debris discs composed of porous dust with a disc model assuming spherical, compact grains. We calculate the optical dust properties derived via the fast, but simple effective medium theory. The theoretical lower boundary of the size distribution -- the so-called 'blowout size' -- is compared in the cases of compact and porous grains. Finally, we simulate observations of hypothetical debris discs with different porosities and feed them into a fitting procedure using only compact grains. The deviations of the results for compact grains from the original model based on porous grains are analysed. We find that the blowout size increases with grain porosity by up to a factor of two. An analytical approximation function for the blowout size as a function of porosity and stellar luminosity is derived. The analysis of the geometrical disc set-up, when constrained by radial profiles, is barely affected by the porosity. However, the minimum grain size and the slope of the grain size distribution derived using compact grains are significantly overestimated. Thus, the unexpectedly high ratio of minimum grain size to blowout size found by previous studies using compact grains can be partially attributed to dust grain porosity, although the effect is not strong enough to completely explain the trend. Comment: accepted by MNRAS.
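    The porosity dependence of the blowout size can be sketched with the standard radiation-pressure criterion beta >= 0.5, crudely modelling porosity as a reduced bulk density. This is an illustrative back-of-the-envelope calculation only (it assumes a radiation-pressure efficiency Q_pr of 1 and geometric optics), not the approximation function the paper derives:

    ```python
    import math

    G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
    c = 2.998e8          # speed of light [m/s]
    L_sun = 3.828e26     # solar luminosity [W]
    M_sun = 1.989e30     # solar mass [kg]

    def blowout_size(L_star, M_star, rho, porosity=0.0, Q_pr=1.0):
        """Grain radius [m] below which radiation pressure unbinds the grain.

        Uses beta = 3 L Q_pr / (16 pi G M c rho s) and the blowout condition
        beta >= 0.5. Porosity P is modelled crudely by reducing the bulk
        density: rho_eff = rho * (1 - P). (Assumption for illustration; the
        paper computes optical properties via effective medium theory.)
        """
        rho_eff = rho * (1.0 - porosity)
        return 3.0 * L_star * Q_pr / (8.0 * math.pi * G * M_star * c * rho_eff)

    # Compact vs. 50% porous silicate grains around a Sun-like star:
    s_compact = blowout_size(L_sun, M_sun, rho=3300.0)
    s_porous = blowout_size(L_sun, M_sun, rho=3300.0, porosity=0.5)
    print(s_compact * 1e6, s_porous * 1e6)  # radii in microns
    ```

    In this simplified density-scaling picture, 50% porosity doubles the blowout size, consistent in magnitude with the "up to a factor of two" found in the abstract.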

    Science-Technology Division


    Bimodal gene expression and biomarker discovery.

    With insights gained through molecular profiling, cancer is recognized as a heterogeneous disease with distinct subtypes and outcomes that can be predicted by a limited number of biomarkers. Statistical methods such as supervised classification and machine learning identify distinguishing features associated with disease subtype but are not necessarily clear or interpretable on a biological level. Genes with bimodal transcript expression, however, may serve as excellent candidates for disease biomarkers, with each mode of expression readily interpretable as a biological state. The recent article by Wang et al., entitled "The Bimodality Index: A Criterion for Discovering and Ranking Bimodal Signatures from Cancer Gene Expression Profiling Data," provides a bimodality index for identifying and scoring transcript expression profiles as biomarker candidates, with the benefit of having a direct relation to power and sample size. This represents an important step in candidate biomarker discovery that may help streamline the pipeline through validation and clinical application.
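    The bimodality index of Wang et al. combines the separation of the two expression modes with how evenly samples split between them. A minimal sketch of the formula as given in that article (in practice the parameters come from fitting a two-component Gaussian mixture to each gene's expression values):

    ```python
    import math

    def bimodality_index(mu1, mu2, sigma, pi):
        """Bimodality index of Wang et al.:

            BI = sqrt(pi * (1 - pi)) * delta,  delta = |mu1 - mu2| / sigma

        where pi is the mixing proportion of one mode and sigma the common
        within-mode standard deviation. Larger BI means the two modes are
        both well separated and well populated.
        """
        delta = abs(mu1 - mu2) / sigma
        return math.sqrt(pi * (1.0 - pi)) * delta

    # A gene whose expression splits 30/70 between modes two sigma apart:
    bi = bimodality_index(mu1=4.0, mu2=6.0, sigma=1.0, pi=0.3)
    print(round(bi, 3))
    ```

    A gene with identical mode means (or with nearly all samples in one mode) scores near zero, which is what makes the index usable for ranking candidate biomarkers.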

    Inner mean-motion resonances with eccentric planets: A possible origin for exozodiacal dust clouds

    High levels of dust have been detected in the immediate vicinity of many stars, both young and old. A promising scenario to explain the presence of this short-lived dust is that these analogues to the Zodiacal cloud (or exozodis) are refilled in situ through cometary activity and sublimation. As the reservoir of comets is not expected to be replenished, the presence of these exozodis in old systems has yet to be adequately explained. It was recently suggested that mean-motion resonances (MMR) with exterior planets on moderately eccentric (e_p ≳ 0.1) orbits could scatter planetesimals on to cometary orbits with delays of the order of several hundred Myr. Theoretically, this mechanism is also expected to sustain continuous production of active comets once it has started, potentially over Gyr timescales. We aim here to investigate the ability of this mechanism to generate scattering on to cometary orbits compatible with the production of an exozodi on long timescales. We combine analytical predictions and complementary numerical N-body simulations to study its characteristics. We show, using order-of-magnitude estimates, that via this mechanism, low-mass discs comparable to the Kuiper Belt could sustain comet scattering at rates compatible with the presence of the exozodis which are detected around Solar-type stars, and on Gyr timescales. We also find that the levels of dust detected around Vega could be sustained via our proposed mechanism if an eccentric Jupiter-like planet were present exterior to the system's cold debris disc. Comment: 15 pages, 12 figures; accepted for publication in MNRAS.
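    The locations of the interior resonances involved follow directly from Kepler's third law. A short sketch (standard celestial mechanics, included for orientation only; the scattering delays in the paper come from full N-body integrations, not from this formula):

    ```python
    def interior_resonance_sma(a_planet_au, p, q):
        """Semi-major axis [AU] of the interior p:q mean-motion resonance
        with a planet at a_planet_au: a particle completing p orbits per
        q planet orbits sits at a = a_p * (q / p)**(2/3) by Kepler's
        third law.
        """
        return a_planet_au * (q / p) ** (2.0 / 3.0)

    # Interior resonances with a Jupiter-like planet at 5.2 AU:
    for p, q in [(2, 1), (3, 1), (5, 2)]:
        a_res = interior_resonance_sma(5.2, p, q)
        print(f"{p}:{q} resonance at {a_res:.2f} AU")
    ```

    Planetesimals trapped at these locations can have their eccentricities pumped by a moderately eccentric planet until their pericentres reach comet-like, star-grazing distances.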

    iSeqQC: a tool for expression-based quality control in RNA sequencing.

    BACKGROUND: Quality control in any high-throughput sequencing technology is a critical step, which if overlooked can compromise an experiment and the resulting conclusions. A number of methods exist to identify biases during sequencing or alignment, yet few tools exist to interpret biases due to outliers. RESULTS: Hence, we developed iSeqQC, an expression-based QC tool that detects outliers produced either by variable laboratory conditions or by dissimilarity within a phenotypic group. iSeqQC implements various statistical approaches including unsupervised clustering, agglomerative hierarchical clustering and correlation coefficients to provide insight into outliers. It can be utilized through the command line (GitHub: https://github.com/gkumar09/iSeqQC) or a web interface (http://cancerwebpa.jefferson.edu/iSeqQC). A local Shiny installation can also be obtained from GitHub (https://github.com/gkumar09/iSeqQC). CONCLUSION: iSeqQC is a fast, lightweight, expression-based QC tool that detects outliers by implementing various statistical approaches.
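    One of the approaches the abstract names, correlation coefficients, can flag a discordant sample because it correlates poorly with every other sample in the cohort. The following is a simplified sketch of that idea, not iSeqQC's actual implementation (the threshold and the synthetic data are assumptions for illustration):

    ```python
    import numpy as np

    def correlation_outliers(counts, threshold=2.0):
        """Flag outlier samples by mean Pearson correlation with all other
        samples. counts: genes x samples expression matrix. Returns indices
        of samples whose mean inter-sample correlation lies more than
        `threshold` standard deviations below the cohort mean.
        """
        corr = np.corrcoef(counts.T)        # sample-by-sample correlations
        np.fill_diagonal(corr, np.nan)      # ignore self-correlation
        mean_r = np.nanmean(corr, axis=1)   # each sample's mean correlation
        z = (mean_r - mean_r.mean()) / mean_r.std()
        return np.where(z < -threshold)[0]

    # Synthetic cohort: shared gene-level means create strong inter-sample
    # correlation; scrambling one sample's profile breaks it for that sample.
    rng = np.random.default_rng(0)
    gene_means = rng.gamma(2.0, 25.0, size=200)
    counts = rng.poisson(gene_means[:, None], size=(200, 8)).astype(float)
    counts[:, 7] = rng.permutation(counts[:, 7])  # corrupt sample 7
    print(correlation_outliers(counts))
    ```

    On real RNA-seq data this kind of check is typically run on log-transformed, normalized counts rather than raw counts.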

    Towards Implicit Parallel Programming for Systems

    Multi-core processors require a program to be decomposable into independent parts that can execute in parallel in order to scale performance with the number of cores. But parallel programming is hard, especially when the program requires state, which many system programs use for optimization, such as a cache to reduce disk I/O. Most prevalent parallel programming models do not support a notion of state and require the programmer to synchronize state access manually, i.e., outside the scope of an associated optimizing compiler. This prevents the compiler from introducing parallelism automatically and requires the programmer to optimize the program manually. In this dissertation, we propose a programming language/compiler co-design to provide a new programming model for implicit parallel programming with state, and a compiler that can optimize the program for parallel execution. We define the notion of a stateful function, along with composition and control structures for such functions. An example implementation of a highly scalable server shows that stateful functions integrate smoothly into existing programming language concepts, such as object-oriented programming and programming with structs. Our programming model is also highly practical and allows existing code bases to be adapted gradually. As a case study, we implemented a new data processing core for the Hadoop Map/Reduce system to overcome existing performance bottlenecks. Our lambda-calculus-based compiler automatically extracts parallelism without changing the program's semantics. We added further domain-specific semantics-preserving transformations that reduce I/O calls for microservice programs. The runtime format of a program is a dataflow graph that can be executed in parallel, performs concurrent I/O, and allows for non-blocking live updates.
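    The core idea of a stateful function can be loosely illustrated outside the dissertation's own language: each stage owns private state that only it may touch, so a compiler or runtime could pipeline stages without programmer-written locks. The Python sketch below is only an analogue of that encapsulation discipline (the class names and pipeline are invented for illustration), not the proposed language or compiler:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Cache:
        """Stateful stage: memoizes expensive computations (cf. the cache
        used to reduce disk I/O in the motivating example)."""
        store: dict = field(default_factory=dict)
        hits: int = 0

        def lookup(self, key, compute):
            if key in self.store:
                self.hits += 1
            else:
                self.store[key] = compute(key)
            return self.store[key]

    @dataclass
    class Counter:
        """Stateful stage: counts items flowing through the pipeline."""
        seen: int = 0

        def count(self, x):
            self.seen += 1
            return x

    cache, counter = Cache(), Counter()
    # Each stage's state is encapsulated, so the stages could be mapped
    # onto a parallel dataflow graph without manual synchronization.
    results = [counter.count(cache.lookup(k, compute=lambda k: k * k))
               for k in [2, 3, 2, 5]]
    print(results, cache.hits, counter.seen)  # [4, 9, 4, 25] 1 4
    ```

    Because no state is shared between `Cache` and `Counter`, the two stages could in principle run concurrently on different items, which is the property the dissertation's compiler exploits.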

    Why Servant Leadership?
