The Evolution of Radio Loud Active Galactic Nuclei as a Function of Black Hole Spin
Recent work on the engines of active galactic nuclei jets suggests their
power depends strongly and perhaps counter-intuitively on black hole spin. We
explore the consequences of this on the radio-loud population of active
galactic nuclei and find that the time evolution of the most powerful radio
galaxies and radio-loud quasars fits into a picture in which black hole spin
varies from retrograde to prograde with respect to the accreting material.
Unlike the current view, according to which jet powers decrease in tandem with
a global downsizing effect, we argue for a drop in jet power resulting directly
from the paucity of retrograde accretion systems at lower redshift, caused
by a continuous history of accretion dating back to higher redshift. In addition,
the model provides simple interpretations for the basic spectral features
differentiating radio-loud and radio-quiet objects, such as the presence or
absence of disk reflection, broadened iron lines and signatures of disk winds.
We also briefly describe our model's interpretation of microquasar state
transitions. We highlight our result that the most radio-loud and most
radio-quiet objects both harbor highly spinning black holes but in retrograde
and prograde configurations, respectively.
Comment: MNRAS accepted
Three-dimensional surface codes: Transversal gates and fault-tolerant architectures
One of the leading quantum computing architectures is based on the
two-dimensional (2D) surface code. This code has many advantageous properties
such as a high error threshold and a planar layout of physical qubits where
each physical qubit need only interact with its nearest neighbours. However,
the transversal logical gates available in 2D surface codes are limited. This
means that an additional (resource intensive) procedure known as magic state
distillation is required to do universal quantum computing with 2D surface
codes. Here, we examine three-dimensional (3D) surface codes in the context of
quantum computation. We introduce a picture for visualizing 3D surface codes
which is useful for analysing stacks of three 3D surface codes. We use this
picture to prove that the CZ and CCZ gates are transversal in 3D surface
codes. We also generalize the techniques of 2D surface code lattice surgery to
3D surface codes. We combine these results and propose two quantum computing
architectures based on 3D surface codes. Magic state distillation is not
required in either of our architectures. Finally, we show that a stack of three
3D surface codes can be transformed into a single 3D color code (another type
of quantum error-correcting code) using code concatenation.
Comment: 23 pages, 24 figures, v2: published version
The Filament Sensor for Near Real-Time Detection of Cytoskeletal Fiber Structures
A reliable extraction of filament data from microscopic images is of high
interest in the analysis of acto-myosin structures as early morphological
markers in mechanically guided differentiation of human mesenchymal stem cells
and the understanding of the underlying fiber arrangement processes. In this
paper, we propose the filament sensor (FS), a fast and robust processing
sequence which detects and records location, orientation, length and width for
each single filament in an image, and thus allows for the analysis described
above. The extraction of these features has not previously been possible with
existing methods. We evaluate the performance of the proposed FS in terms
of accuracy and speed in comparison to three existing methods with respect to
their limited output. Further, we provide a benchmark dataset of real cell
images along with filaments manually marked by a human expert as well as
simulated benchmark images. The FS clearly outperforms existing methods in
terms of computational runtime and filament extraction accuracy. The
implementation of the FS and the benchmark database are available as open
source.
Comment: 32 pages, 21 figures
Automated Segmentation of Cells with IHC Membrane Staining
This study presents a fully automated membrane segmentation technique for immunohistochemical tissue images with membrane staining, which is a critical task in computerized immunohistochemistry (IHC). Membrane segmentation is particularly tricky in immunohistochemical tissue images because the cellular membranes are visible only in the stained tracts of the cell, while the unstained tracts are not visible. Our automated method provides accurate segmentation of the cellular membranes in the stained tracts and reconstructs the approximate location of the unstained tracts using nuclear membranes as a spatial reference. Accurate cell-by-cell membrane segmentation allows per-cell morphological analysis and quantification of the target membrane proteins, which is fundamental in several medical applications such as cancer characterization and classification and personalized therapy design, as well as any other application requiring cell morphology characterization. Experimental results on real datasets from different anatomical locations demonstrate the wide applicability and high accuracy of our approach in the context of IHC analysis.
Accelerating Reinforcement Learning by Composing Solutions of Automatically Identified Subtasks
This paper discusses a system that accelerates reinforcement learning by
using transfer from related tasks. Without such transfer, even if two tasks are
very similar at some abstract level, an extensive re-learning effort is
required. The system achieves much of its power by transferring parts of
previously learned solutions rather than a single complete solution. The system
exploits strong features in the multi-dimensional function produced by
reinforcement learning in solving a particular task. These features are stable
and easy to recognize early in the learning process. They generate a
partitioning of the state space and thus the function. The partition is
represented as a graph. This is used to index and compose functions stored in a
case base to form a close approximation to the solution of the new task.
Experiments demonstrate that function composition often produces more than an
order of magnitude increase in learning rate compared to a basic reinforcement
learning algorithm.
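The stitching of stored sub-solutions can be sketched in toy form. All names below are hypothetical illustrations of the idea (a case base of partial value functions keyed by partition-cell type), not the paper's implementation:

```python
# Toy case base: partial value functions learned on earlier tasks,
# keyed by the kind of state-space region ("partition cell") they solved.
case_base = {
    "corridor": lambda s: 10.0 - s,       # learned on one earlier task
    "junction": lambda s: 5.0 + 0.5 * s,  # learned on another task
}

def compose_solution(partition, case_base):
    """Build an approximate value function for a new task by composing
    stored sub-solutions, one per identified partition cell."""
    V = {}
    for cell_label, states in partition.items():
        sub = case_base[cell_label]       # retrieve the matching piece
        for s in states:
            V[s] = sub(s)                 # stitch it into the new solution
    return V

# A new task whose partition graph matched two stored cases:
partition = {"corridor": [0, 1, 2], "junction": [3, 4]}
V = compose_solution(partition, case_base)
```

The composed V then serves as the starting point for further learning, rather than a solution learned from scratch.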
Improving the performance and evaluation of computer-assisted semen analysis
Semen analysis is performed routinely in fertility clinics to analyze the quality of semen and sperm cells of male patients. The analysis is typically performed by trained technicians or by Computer-Assisted Semen Analysis (CASA) systems. Manual semen analysis performed by technicians is subjective, time-consuming, and laborious, and yet most fertility clinics perform semen analysis in this manner. CASA systems, which are designed to perform the same tasks automatically, have a considerable market share, yet many studies still express concerns about their accuracy and consistency. In this dissertation, the focus is on detection, tracking, and classification of sperm cells in semen images, key elements of CASA systems. The objective is to improve existing CASA algorithms and systems by applying validated computer vision, tracking, and computational intelligence algorithms.
The first step of the study is the development of simulation models for generating synthetic images of semen samples. The images enable the assessment of CASA systems and their algorithms. Specifically, the simulation models generate time-lapse images of semen samples for various sperm image categories and include ground truth labels. The models exploit standard image processing operations such as point spread functions and 2D convolutions, as well as new models of sperm cell swimming, developed for this study. They embody multiple studies of sperm motility in the form of parameterized motion equations. Use cases are presented to use the swimming models and the simulated images to assess and compare algorithms for sperm cell segmentation, localization, and tracking.
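The PSF-plus-convolution image model lends itself to a short sketch. This is an illustrative toy assuming point-like cell heads and a Gaussian point spread function; it is not the dissertation's simulator, and all parameter values are made up:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
img = np.zeros((128, 128))

# Place point-like "sperm heads" at random pixel positions.
for _ in range(20):
    y, x = rng.integers(0, 128, size=2)
    img[y, x] = 1.0

# Blur with a Gaussian point spread function (a 2D convolution),
# then add sensor noise to mimic a microscope frame.
frame = gaussian_filter(img, sigma=2.0)
frame += rng.normal(0.0, 0.01, frame.shape)
```

Repeating this per time step, with head positions advanced by a motion model, yields a time-lapse sequence with exact ground-truth labels.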
Second, a digital washing algorithm is presented for unwashed semen samples. Digital washing has the potential to replace the chemical washing techniques used by fertility clinics at present, which are costly, time-consuming, and unfriendly to the environment. The digital washing algorithm extracts features from moving sperm cells in an image, and uses these features to identify all sperm cells (moving and stationary) within each studied image (simulated or real). The effectiveness of the digital washing algorithm is demonstrated by comparing the performance of the proposed algorithm to other cell segmentation and detection techniques.
Third, a classification algorithm for sperm cells is developed, based on their swimming patterns. The classification algorithm uses K-means clustering on a subset of motility parameters of sperm cells selected by the Artificial Bee Colony (ABC) algorithm. Results of classification and clustering are shown, using simulated and real semen images. Swimming pattern classification has the potential to increase understanding of the relationship between the distribution of sperm cell swimming modes in a patient’s semen image and the fertility of that patient.
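A minimal sketch of the clustering step, assuming two synthetic motility features (curvilinear velocity and linearity) and a plain K-means without the ABC feature selection; the feature values and group sizes are illustrative, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic [curvilinear velocity, linearity] features for two groups:
progressive = rng.normal([120.0, 0.8], [10.0, 0.05], size=(30, 2))
slow = rng.normal([20.0, 0.3], [5.0, 0.05], size=(30, 2))
X = np.vstack([progressive, slow])

def kmeans(X, k, iters=50, seed=0):
    """Plain K-means: assign each cell to its nearest center, recompute."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, 2)
```

With well-separated swimming modes, the cluster labels recover the two groups; the distribution of cells across clusters is the quantity related to fertility in the study.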
Lastly, a new method is presented to calculate motility parameters from sperm tracks. The movement of a sperm cell is modeled as a sinusoidal traveling wave (“traveling sinusoid”). The amplitude and average path of a moving cell are estimated using an extended Kalman filter (EKF). The states estimated by the EKF include position, velocity, amplitude, and frequency of the traveling wave. The motility parameters calculated from this approach are shown to be superior to those calculated by other existing methods in terms of their accuracy and consistency.
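The traveling-sinusoid filter can be sketched in one dimension. This is a simplified illustration that assumes the wave frequency is known (the full filter also estimates frequency); the state layout, noise levels, and track parameters below are assumptions, not the author's code:

```python
import numpy as np

dt, omega = 0.05, 2 * np.pi              # time step, known angular frequency
Q = np.diag([1e-5, 1e-4, 1e-4, 1e-5])    # process noise covariance
R = 1e-3                                 # measurement noise variance

def predict(x, P):
    # State: [average-path position p, velocity v, amplitude A, phase phi].
    F = np.array([[1, dt, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])
    x = np.array([x[0] + x[1] * dt, x[1], x[2], x[3] + omega * dt])
    return x, F @ P @ F.T + Q

def update(x, P, z):
    p, v, A, phi = x
    # Measurement model: observed lateral position = path + oscillation.
    H = np.array([1.0, 0.0, np.sin(phi), A * np.cos(phi)])  # Jacobian of h
    S = H @ P @ H + R
    K = P @ H / S
    x = x + K * (z - (p + A * np.sin(phi)))
    P = (np.eye(4) - np.outer(K, H)) @ P
    return x, P

# Synthetic track: average path drifts at 2.0 units/s, amplitude 1.0.
rng = np.random.default_rng(0)
x, P = np.array([0.0, 0.0, 0.5, 0.0]), np.eye(4)
for k in range(400):
    x, P = predict(x, P)
    t = (k + 1) * dt
    z = 2.0 * t + 1.0 * np.sin(omega * t) + rng.normal(0.0, 0.02)
    x, P = update(x, P, z)
```

After the track is filtered, the amplitude and average-path states give the motility parameters directly, rather than being inferred from a smoothed trajectory.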
CASA developers will find in this study (and in the software made available) new tools to improve the performance of their designs, and to compare and contrast different proposed approaches and algorithms.
ExaViz: a Flexible Framework to Analyse, Steer and Interact with Molecular Dynamics Simulations
The amount of data generated by molecular dynamics simulations of large molecular assemblies and the sheer size and complexity of the systems studied call for new ways to analyse, steer and interact with such calculations. Traditionally, the analysis is performed off-line once the huge amount of simulation results has been saved to disks, thereby stressing the supercomputer I/O systems, and making it increasingly difficult to handle post-processing and analysis from the scientist's office. The ExaViz framework is an alternative approach developed to couple the simulation with analysis tools to process the data as close as possible to their source of creation, saving a reduced, more manageable and pre-processed data set to disk. ExaViz supports a large variety of analysis and steering scenarios. Our framework can be used for live sessions (simulations short enough to be fully followed by the user) as well as batch sessions (long time batch executions). During interactive sessions, at run time, the user can display plots from analysis, visualise the molecular system and steer the simulation with a haptic device. We also emphasise how a Cave-like immersive environment could be used to leverage such simulations, offering a large display surface to view and intuitively navigate the molecular system.
TopologyNet: Topology based deep convolutional neural networks for biomolecular property predictions
Although deep learning approaches have had tremendous success in image, video
and audio processing, computer vision, and speech recognition, their
applications to three-dimensional (3D) biomolecular structural data sets have
been hindered by the entangled geometric complexity and biological complexity.
We introduce topology, i.e., element specific persistent homology (ESPH), to
untangle geometric complexity and biological complexity. ESPH represents 3D
complex geometry by one-dimensional (1D) topological invariants and retains
crucial biological information via a multichannel image representation. It is
able to reveal hidden structure-function relationships in biomolecules. We
further integrate ESPH and convolutional neural networks to construct a
multichannel topological neural network (TopologyNet) for the predictions of
protein-ligand binding affinities and protein stability changes upon mutation.
To overcome the limitations of deep learning arising from small and noisy
training sets, we present a multitask topological convolutional neural network
(MT-TCNN). We demonstrate that the present TopologyNet architectures outperform
other state-of-the-art methods in the predictions of protein-ligand binding
affinities, globular protein mutation impacts, and membrane protein mutation
impacts.
Comment: 20 pages, 8 figures, 5 tables
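The reduction from 3D geometry to 1D topological channels can be illustrated in toy form. This sketch assumes per-channel persistence barcodes are already computed and simply rasterizes them by counting bars alive at each filtration value; the channels, bin edges, and featurization below are illustrative assumptions, not the authors' exact ESPH pipeline:

```python
import numpy as np

def barcodes_to_image(channels, f_max=10.0, n_bins=50):
    """Convert per-channel persistence barcodes (birth, death pairs) into
    a multichannel 1D 'image': each pixel counts the bars alive at that
    filtration value, one row per element-specific channel."""
    edges = np.linspace(0.0, f_max, n_bins + 1)
    mids = 0.5 * (edges[:-1] + edges[1:])
    img = np.zeros((len(channels), n_bins))
    for c, bars in enumerate(channels):
        for birth, death in bars:
            img[c] += (mids >= birth) & (mids < death)
    return img

# Toy barcodes for two hypothetical element-specific channels (e.g. C-C, C-N):
channels = [[(0.0, 4.0), (1.0, 2.5)], [(2.0, 9.0)]]
img = barcodes_to_image(channels)
```

A stack of such rows is exactly the kind of multichannel 1D input a convolutional network can consume, which is what makes the topological representation compatible with standard image-style architectures.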
Quasars: a supermassive rotating toroidal black hole interpretation
A supermassive rotating toroidal black hole (TBH) is proposed as the
fundamental structure of quasars and other jet-producing active galactic
nuclei. Rotating protogalaxies gather matter from the central gaseous region
leading to the birth of massive toroidal stars whose internal nuclear reactions
proceed very rapidly. Once the nuclear fuel is spent, gravitational collapse
produces a slender ring-shaped TBH remnant. These events are typically the
first supernovae of the host galaxies. Given time, the TBH mass increases
by several orders of magnitude through continued accretion; the event horizon
swells whilst the central aperture shrinks. The difference in angular
velocities between the accreting matter and the TBH induces a magnetic field
that is strongest in the region of the central aperture and innermost
ergoregion. Due to the presence of negative energy states when such a
gravitational vortex is immersed in an electromagnetic field, circumstances are
near ideal for energy extraction via non-thermal radiation including the
Penrose process and superradiant scattering. This establishes a self-sustaining
mechanism whereby the transport of angular momentum away from the quasar by
relativistic bi-directional jets reinforces both the modulating magnetic field
and the TBH/accretion disk angular velocity differential. Quasar behaviour is
extinguished once the BH topology becomes spheroidal. Similar mechanisms may be
operating in microquasars, SNe and GRBs when neutron density or BH tori arise.
In certain circumstances, long-term TBH stability can be maintained by a
negative cosmological constant, otherwise the classical topology theorems must
somehow be circumvented. Preliminary evidence is presented that Planck-scale
quantum effects may be responsible.
Comment: 26 pages, 14 figs, various corrections and enhancements, final version