PPF - A Parallel Particle Filtering Library
We present the parallel particle filtering (PPF) software library, which
enables hybrid shared-memory/distributed-memory parallelization of particle
filtering (PF) algorithms combining the Message Passing Interface (MPI) with
multithreading for multi-level parallelism. The library is implemented in Java
and relies on OpenMPI's Java bindings for inter-process communication. It
includes dynamic load balancing, multi-thread balancing, and several
algorithmic improvements for PF, such as input-space domain decomposition. The
PPF library hides the difficulties of efficient parallel programming of PF
algorithms and provides application developers with the necessary tools for
parallel implementation of PF methods. We demonstrate the capabilities of the
PPF library using two distributed PF algorithms in two scenarios with different
numbers of particles. The PPF library runs a 38 million particle problem,
corresponding to more than 1.86 GB of particle data, on 192 cores with 67%
parallel efficiency. To the best of our knowledge, the PPF library is the first
open-source software that offers a parallel framework for PF applications.
Comment: 8 pages, 8 figures; will appear in the proceedings of the IET Data Fusion & Target Tracking Conference 201
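The core algorithm that such a library parallelizes can be sketched compactly. The following is a minimal, single-threaded Python sketch of a 1-D bootstrap particle filter with systematic resampling; it illustrates the sequential method only, not the PPF library's Java/MPI API, and the model parameters and function names are illustrative assumptions.

```python
import math
import random

def systematic_resample(weights):
    """Systematic resampling: return particle indices drawn with
    probability proportional to the normalized weights."""
    n = len(weights)
    positions = [(random.random() + i) / n for i in range(n)]  # stratified positions
    cum, total = [], 0.0
    for w in weights:
        total += w
        cum.append(total)
    cum[-1] = 1.0  # guard against floating-point round-off
    indices, j = [], 0
    for p in positions:
        while p >= cum[j]:
            j += 1
        indices.append(j)
    return indices

def bootstrap_pf(observations, n_particles=500, proc_std=1.0, obs_std=1.0):
    """Bootstrap PF for a 1-D random-walk state observed in Gaussian noise.
    Returns the weighted-mean state estimate at each time step."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Propagate through the random-walk motion model.
        particles = [x + random.gauss(0.0, proc_std) for x in particles]
        # Weight by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((z - x) / obs_std) ** 2) for x in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Resample to concentrate particles in high-likelihood regions.
        particles = [particles[i] for i in systematic_resample(weights)]
    return estimates
```

In a distributed setting, the weighting and propagation loops are the parts that parallelize naturally across particles, while resampling requires the inter-process communication that load balancing must manage.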
Review of the mathematical foundations of data fusion techniques in surface metrology
The recent proliferation of engineered surfaces, including freeform and structured surfaces, is challenging current metrology techniques. Measurement using multiple sensors has been proposed to achieve enhanced benefits, mainly in terms of spatial frequency bandwidth, which a single sensor cannot provide. When using data from different sensors, a process of data fusion is required and there is much active research in this area. In this paper, current data fusion methods and applications are reviewed, with a focus on the mathematical foundations of the subject. Common research questions in the fusion of surface metrology data are raised and potential fusion algorithms are discussed
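A basic mathematical building block behind many such fusion schemes is the minimum-variance linear combination of independent sensor estimates. As an illustration of the principle (not a method taken from the review itself), inverse-variance weighting can be written as:

```python
def fuse_measurements(values, variances):
    """Fuse independent, unbiased sensor estimates of the same quantity by
    inverse-variance weighting: the minimum-variance linear unbiased
    combination. Returns the fused value and its variance."""
    weights = [1.0 / v for v in variances]  # more precise sensors weigh more
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, values)) / total
    fused_var = 1.0 / total  # always smaller than the best single sensor
    return fused, fused_var
```

For example, fusing two equally precise readings of 10.0 and 12.0 (variance 1.0 each) yields 11.0 with variance 0.5, i.e. the fused estimate is more precise than either sensor alone.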
Super-resolution in turbulent videos: making profit from damage
It is shown that one can make use of local instabilities in turbulent video
frames to enhance image resolution beyond the limit defined by the image
sampling rate. The paper outlines the processing algorithm, presents its
experimental verification on simulated and real-life videos and discusses its
potentials and limitations.
Comment: 11 pages, 2 figures. Submitted to Optics Letters, 10-07-0
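The classical principle such methods build on is that differently shifted low-resolution frames sample the scene at distinct sub-pixel positions, so their samples can populate a finer grid. A toy 1-D shift-and-add sketch (illustrative only; estimating the shifts, which the paper's turbulence-based algorithm addresses, is assumed solved here):

```python
def shift_and_add(frames, shifts, factor):
    """Toy 1-D shift-and-add super-resolution: place each low-res sample
    onto a grid `factor` times finer at its known sub-pixel offset
    (expressed in high-res pixels), then average overlapping samples."""
    n_hi = len(frames[0]) * factor
    acc = [0.0] * n_hi
    cnt = [0] * n_hi
    for frame, s in zip(frames, shifts):
        for i, v in enumerate(frame):
            k = i * factor + s  # high-res position of this sample
            if 0 <= k < n_hi:
                acc[k] += v
                cnt[k] += 1
    return [a / c if c else 0.0 for a, c in zip(acc, cnt)]
```

Two frames shifted by half a pixel relative to each other interleave to fill a 2x-finer grid, which is the sense in which "damage" (frame-to-frame motion) becomes profit.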
The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch
Recent and forthcoming advances in instrumentation, and giant new surveys,
are creating astronomical data sets that are not amenable to the methods of
analysis familiar to astronomers. Traditional methods are often inadequate not
merely because of the size in bytes of the data sets, but also because of the
complexity of modern data sets. Mathematical limitations of familiar algorithms
and techniques in dealing with such data sets create a critical need for new
paradigms for the representation, analysis and scientific visualization (as
opposed to illustrative visualization) of heterogeneous, multiresolution data
across application domains. Some of the problems presented by the new data sets
have been addressed by other disciplines such as applied mathematics,
statistics and machine learning and have been utilized by other sciences such
as space-based geosciences. Unfortunately, valuable results pertaining to these
problems are mostly to be found only in publications outside of astronomy. Here
we offer brief overviews of a number of concepts, techniques and developments,
some "old" and some new. These are generally unknown to most of the
astronomical community, but are vital to the analysis and visualization of
complex datasets and images. In order for astronomers to take advantage of the
richness and complexity of the new era of data, and to be able to identify,
adopt, and apply new solutions, the astronomical community needs a certain
degree of awareness and understanding of the new concepts. One of the goals of
this paper is to help bridge the gap between applied mathematics, artificial
intelligence and computer science on the one side and astronomy on the other.
Comment: 24 pages, 8 Figures, 1 Table. Accepted for publication: "Advances in Astronomy", special issue "Robotic Astronomy"
Automatic Brain Tumor Segmentation using Cascaded Anisotropic Convolutional Neural Networks
A cascade of fully convolutional neural networks is proposed to segment
multi-modal Magnetic Resonance (MR) images of brain tumors into background and
three hierarchical regions: whole tumor, tumor core, and enhancing tumor core.
The cascade is designed to decompose the multi-class segmentation problem into
a sequence of three binary segmentation problems according to the subregion
hierarchy. The whole tumor is segmented in the first step and the bounding box
of the result is used for the tumor core segmentation in the second step. The
enhancing tumor core is then segmented based on the bounding box of the tumor
core segmentation result. Our networks consist of multiple layers of
anisotropic and dilated convolution filters, and they are combined with
multi-view fusion to reduce false positives. Residual connections and
multi-scale predictions are employed in these networks to boost the
segmentation performance. Experiments with the BraTS 2017 validation set show that
the proposed method achieved average Dice scores of 0.7859, 0.9050, and 0.8378 for
enhancing tumor core, whole tumor, and tumor core, respectively. The
corresponding values for the BraTS 2017 testing set were 0.7831, 0.8739, and
0.7748, respectively.
Comment: 12 pages, 5 figures. MICCAI BraTS Challenge 201
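The hand-off between cascade stages reduces to computing a bounding box from the previous stage's binary mask and cropping the input to it before running the next, finer-grained network. A minimal Python sketch of that step (images as lists of lists; the function names are illustrative, not from the paper's code):

```python
def mask_bbox(mask):
    """Bounding box (rmin, rmax, cmin, cmax) of nonzero pixels in a 2-D
    binary mask, or None if the mask is empty."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    if not rows:
        return None
    cols = [c for c in range(len(mask[0])) if any(row[c] for row in mask)]
    return rows[0], rows[-1], cols[0], cols[-1]

def crop(image, bbox):
    """Crop an image (or mask) to an inclusive bounding box."""
    r0, r1, c0, c1 = bbox
    return [row[c0:c1 + 1] for row in image[r0:r1 + 1]]
```

In the cascade, the whole-tumor mask's bounding box restricts the tumor-core network's input, and the tumor-core mask's box restricts the enhancing-core network's input, so each stage solves a smaller binary problem.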
Highly efficient transfection of human induced pluripotent stem cells using magnetic nanoparticles.
Purpose: The delivery of transgenes into human induced pluripotent stem cell (hiPSC)-derived cardiomyocytes (hiPSC-CMs) represents an important tool in cardiac regeneration with potential for clinical applications. Gene transfection is more difficult, however, for hiPSCs and hiPSC-CMs than for somatic cells. Despite improvements in transfection and transduction, the efficiency, cytotoxicity, safety, and cost of these methods remain unsatisfactory. The objective of this study is to examine gene transfection in hiPSCs and hiPSC-CMs using magnetic nanoparticles (NPs).

Methods: Magnetic NPs are unique transfection reagents that form complexes with nucleic acids by ionic interaction. The particles, loaded with nucleic acids, can be guided by a magnetic field to concentrate them onto the surface of the cell membrane. Subsequent uptake of the loaded particles by the cells allows for high-efficiency transfection of the cells with nucleic acids. We developed a new method using magnetic NPs to transfect hiPSCs and hiPSC-CMs. HiPSCs and hiPSC-CMs were cultured and analyzed using confocal microscopy, flow cytometry, and patch-clamp recordings to quantify transfection efficiency and cellular function.

Results: We compared the transfection efficiency of hiPSCs with that of human embryonic kidney (HEK 293) cells. We observed that the average efficiency in hiPSCs was 43% ± 2%, compared to 62% ± 4% in HEK 293 cells. Further analysis of the transfected hiPSCs showed that the differentiation of hiPSCs to hiPSC-CMs was not altered by the NPs. Finally, robust transfection of hiPSC-CMs with an efficiency of 18% ± 2% was obtained.

Conclusion: The difficult-to-transfect hiPSCs and hiPSC-CMs were efficiently transfected using magnetic NPs. Our study offers a novel approach for transfection of hiPSCs and hiPSC-CMs without the need for viral vector generation.