A Comparison and Strategy of Semantic Segmentation on Remote Sensing Images
In recent years, with the development of aerospace technology, we use more and more images captured by satellites to obtain information. However, a large number of useless raw images, limited on-board data storage, and poor transmission capability hinder our use of the valuable images. It is therefore necessary to deploy an on-orbit semantic segmentation model that filters out useless images before data transmission. In this paper, we present a detailed comparison of recent deep learning models. Considering the computing environment of satellites, we compare the methods in terms of accuracy, parameter count, and resource consumption on the same public dataset, and we analyze the relations among these factors. Based on the experimental results, we further propose a viable on-orbit semantic segmentation strategy. It will be deployed on the TianZhi-2 satellite, which supports deep learning methods and will be launched soon.
Comment: 8 pages, 3 figures, ICNC-FSKD 201
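Such a comparison starts from model size and memory footprint. As a minimal sketch of how these figures can be tabulated (the paper does not publish its code; the model choices and the torchvision calls below are assumptions for illustration):

    # Sketch: compare semantic segmentation models by parameter count and
    # float32 weight size, as a rough proxy for on-orbit resource consumption.
    from torchvision.models import segmentation as seg

    models = {
        "FCN-ResNet50": seg.fcn_resnet50(weights=None, num_classes=6),
        "DeepLabV3-ResNet50": seg.deeplabv3_resnet50(weights=None, num_classes=6),
        "LR-ASPP-MobileNetV3": seg.lraspp_mobilenet_v3_large(weights=None, num_classes=6),
    }

    for name, model in models.items():
        n_params = sum(p.numel() for p in model.parameters())
        size_mb = n_params * 4 / 2**20  # 4 bytes per float32 weight
        print(f"{name}: {n_params / 1e6:.1f}M parameters, ~{size_mb:.0f} MB")

Accuracy on the public dataset must still be measured separately; the trade-off between these figures is what drives the deployment strategy.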
Diverse Diversions: Youth Justice Reform, Localized Practices, and a ‘New Interventionist Diversion’?
The recent resurgence of practices aimed at ‘diverting’ young people from prosecution appears to suggest a sea change from the interventionism that characterized New Labour’s approach to young law-breakers. Drawing on interviews with youth justice practitioners at two sites in England, we argue that this is overly simplistic, since the ‘interventionist diversion’ they describe reflects the continued influence of New Labour reforms, as well as older approaches. We conclude that more empirical research is needed to establish where such interventions sit within the broader – and increasingly localized – landscape of support provision, as well as the consequences of providing ‘welfare’ in this way.
Geometry meets semantics for semi-supervised monocular depth estimation
Depth estimation from a single image represents an exciting challenge in computer vision. While other image-based depth sensing techniques leverage the geometry between different viewpoints (e.g., stereo or structure from motion), the lack of such cues within a single image renders the monocular depth estimation task ill-posed. At inference time, state-of-the-art encoder-decoder architectures for monocular depth estimation rely on effective feature representations learned during training. For unsupervised training of these models, geometry has been effectively exploited through suitable image warping losses computed from views acquired by a stereo rig or a moving camera. In this paper, we take a further step forward, showing that learning semantic information from images also effectively improves monocular depth estimation. In particular, by leveraging semantically labeled images together with unsupervised signals gained from geometry through an image warping loss, we propose a deep learning approach aimed at joint semantic segmentation and depth estimation. Our overall learning framework is semi-supervised, as we deploy ground-truth data only in the semantic domain. At training time, our network learns a common feature representation for both tasks, and a novel cross-task loss function is proposed. The experimental findings show how jointly tackling depth prediction and semantic segmentation improves depth estimation accuracy. In particular, on the KITTI dataset our network outperforms state-of-the-art methods for monocular depth estimation.
Comment: 16 pages, Accepted to ACCV 201
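To make the combined training signal concrete, here is a minimal sketch of the kind of semi-supervised objective described above: an unsupervised photometric image-warping loss on rectified stereo pairs plus a supervised cross-entropy loss on semantic labels. The tensor conventions, loss weight, and helper function are illustrative assumptions, not the paper's actual cross-task loss.

    # Sketch of a joint depth + semantics loss (PyTorch).
    import torch
    import torch.nn.functional as F

    def warp_to_left(right, disparity):
        # Backward-warp the right image with per-pixel horizontal disparity
        # (rectified stereo: a left pixel at x corresponds to x - d on the right).
        b, _, h, w = right.shape
        xs = torch.linspace(-1, 1, w, device=right.device).view(1, 1, w).expand(b, h, w)
        ys = torch.linspace(-1, 1, h, device=right.device).view(1, h, 1).expand(b, h, w)
        grid = torch.stack((xs - 2.0 * disparity / w, ys), dim=-1)
        return F.grid_sample(right, grid, align_corners=True)

    def joint_loss(left, right, disparity, seg_logits, seg_labels, w_sem=0.1):
        # Unsupervised geometric term: reconstruct the left view from the right.
        photometric = (left - warp_to_left(right, disparity)).abs().mean()
        # Supervised semantic term: ground truth is used only in this domain.
        semantic = F.cross_entropy(seg_logits, seg_labels, ignore_index=255)
        return photometric + w_sem * semantic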
SSH adequacy to preimplantation mammalian development: Scarce specific transcripts cloning despite irregular normalisation
BACKGROUND: SSH (suppression subtractive hybridisation) has emerged as a widely used technology to identify genes that are differentially regulated between two biological situations. Because it includes a normalisation step, it is the method of preference for cloning low-abundance differentially expressed transcripts. It requires no previous sequence knowledge and may start from PCR-amplified cDNAs. It is thus particularly well suited to biological situations where specific genes are expressed and only tiny amounts of RNA are available, as is the case during early mammalian embryo development. In this field, few differentially expressed genes have been characterized from SSH libraries, and an overall assessment of the quality of SSH libraries is still required. Because we are interested in the more systematic establishment of SSH libraries from early embryos, we have developed a simple and reliable strategy, based on reporter transcript follow-up, to check SSH library quality and repeatability when starting with small amounts of RNA.
RESULTS: Four independent subtracted libraries were constructed to analyze key events in the preimplantation development of rabbit and bovine embryos. The performance of the SSH procedure was assessed through large-scale screening of thousands of clones from each library for exogenous reporter transcripts mimicking either tester-specific or tester/driver common transcripts. Our results show that abundant transcripts escape normalisation, which is only efficient for rare and moderately abundant transcripts. Sequencing 1600 clones from one of the libraries confirmed and extended these results to endogenous transcripts, and demonstrated that some very abundant transcripts common to tester and driver escaped subtraction. Nonetheless, the four libraries were greatly enriched in clones encoding very rare (0.0005% of mRNAs) tester-specific transcripts.
CONCLUSION: The close agreement between our hybridization and sequencing results shows that the addition and follow-up of exogenous reporter transcripts provides an easy and reliable means of checking SSH performance. Despite some cases of irregular normalisation and subtraction failure, we have shown that SSH repeatedly enriches libraries in very rare, tester-specific transcripts, and can thus be considered a powerful tool for investigating situations where only small amounts of biological material are available, such as early mammalian development.
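As a back-of-the-envelope illustration of the enrichment being assessed (only the 0.0005% starting frequency and the 1600-clone screen come from the abstract; the hit count below is hypothetical), the enrichment factor is simply the observed clone frequency divided by the initial transcript frequency:

    # Sketch: enrichment of a tester-specific transcript in an SSH library.
    initial_frequency = 0.0005 / 100   # 0.0005% of mRNAs (from the abstract)
    clones_screened = 1600             # clones sequenced from one library
    positive_clones = 8                # hypothetical hits, for illustration only

    observed_frequency = positive_clones / clones_screened
    enrichment = observed_frequency / initial_frequency
    print(f"observed {observed_frequency:.2%}, enrichment ~{enrichment:,.0f}x")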
Local Volatility Calibration by Optimal Transport
The calibration of volatility models from observable option prices is a fundamental problem in quantitative finance. The most common approach among industry practitioners is based on Dupire's celebrated formula [6], which requires the knowledge of vanilla option prices for a continuum of strikes and maturities that can only be obtained via some form of price interpolation. In this paper, we propose a new local volatility calibration technique using the theory of optimal transport. We formulate a time-continuous martingale optimal transport problem, which seeks a martingale diffusion process that matches the known densities of an asset price at two different dates, while minimizing a chosen cost function. Inspired by the seminal work of Benamou and Brenier [1], we formulate the problem as a convex optimization problem, derive its dual formulation, and solve it numerically via an augmented Lagrangian method and the alternating direction method of multipliers (ADMM) algorithm. The solution effectively reconstructs the dynamics of the asset price between the two dates by recovering the optimal local volatility function, without requiring any time interpolation of the option prices.
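For orientation, Dupire's formula (here in its zero-rate form) recovers the local volatility from call prices C(T, K) across strikes and maturities, while the martingale optimal transport problem described above can be sketched as a PDE-constrained minimization; the notation below is generic, not necessarily the paper's:

    \sigma_{\mathrm{loc}}^2(T, K) = \frac{2\,\partial_T C(T, K)}{K^2\,\partial_{KK} C(T, K)}

    \min_{\rho,\,\sigma} \int_0^1 \!\!\int_{\mathbb{R}} F\big(\sigma^2(t, x)\big)\,\rho(t, x)\,dx\,dt
    \quad \text{s.t.} \quad
    \partial_t \rho = \tfrac{1}{2}\,\partial_{xx}\big(\sigma^2 \rho\big),
    \qquad \rho(0, \cdot) = \rho_0, \quad \rho(1, \cdot) = \rho_1

Here ρ₀ and ρ₁ are the known densities at the two dates, the constraint is the Fokker-Planck equation of the driftless diffusion dX_t = σ(t, X_t) dW_t, and F is the chosen cost.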
Degradation of metaldehyde in water by nanoparticle catalysts and powdered activated carbon
Metaldehyde, an organic pesticide widely used in the UK, has been detected in UK drinking water at low concentrations (<1 μg L⁻¹) that nonetheless exceed European and UK standard requirements. This paper investigates the efficiency of four materials, powdered activated carbon (PAC) and carbon-doped titanium dioxide nanocatalysts with different carbon concentrations (C-1.5, C-40, and C-80), for metaldehyde removal from aqueous solutions by adsorption and by oxidation via photocatalysis. PAC was found to be the most effective material, achieving over 90% removal. Adsorption data were well fitted by the Langmuir isotherm model, giving a qm (maximum/saturation adsorption capacity) value of 32.258 mg g⁻¹ and a KL (Langmuir constant) value of 2.013 L mg⁻¹. In terms of kinetics, adsorption of metaldehyde by PAC was well described by a pseudo-second-order equation, giving an adsorption rate constant k2 of 0.023 g mg⁻¹ min⁻¹, implying rapid adsorption. The nanocatalysts were much less effective at oxidising metaldehyde than PAC at the same metaldehyde concentration and a 0.2 g L⁻¹ material loading under UV light; the maximum removal achieved by the carbon-doped titanium dioxide (C-1.5) nanocatalyst was around 15% for a 7.5 ppm metaldehyde solution.
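For reference, the two fitted models take their standard textbook forms (qe is the equilibrium uptake, qt the uptake at time t, and Ce the equilibrium solute concentration; these generic equations are not reproduced from the paper):

    q_e = \frac{q_m K_L C_e}{1 + K_L C_e} \qquad \text{(Langmuir isotherm)}

    \frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e} \qquad \text{(pseudo-second-order kinetics)}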
Supermassive black holes do not correlate with dark matter halos of galaxies
Supermassive black holes have been detected in all galaxies that contain bulge components and are close enough for such searches to be feasible. Together with the observation that bigger black holes
live in bigger bulges, this has led to the belief that black hole growth and
bulge formation regulate each other. That is, black holes and bulges
"coevolve". Therefore, reports of a similar correlation between black holes and
the dark matter halos in which visible galaxies are embedded have profound
implications. Dark matter is likely to be nonbaryonic, so these reports suggest
that unknown, exotic physics controls black hole growth. Here we show - based
in part on recent measurements of bulgeless galaxies - that there is almost no
correlation between dark matter and parameters that measure black holes unless
the galaxy also contains a bulge. We conclude that black holes do not correlate
directly with dark matter. They do not correlate with galaxy disks, either.
Therefore black holes coevolve only with bulges. This simplifies the puzzle of
their coevolution by focusing attention on purely baryonic processes in the
galaxy mergers that make bulges.
Comment: 12 pages, 9 Postscript figures, 1 table; published in Nature (20 January 2011)
A predicted astrometric microlensing event by a nearby white dwarf
We used the Tycho-Gaia Astrometric Solution catalogue, part of the Gaia Data
Release 1, to search for candidate astrometric microlensing events expected to
occur within the remaining lifetime of the Gaia satellite. Our search yielded
one promising candidate. We predict that the nearby DQ-type white dwarf LAWD 37 (WD 1142-645) will lens a background star, reaching closest approach on November 11th 2019 (±4 days) at an impact parameter of a few hundred mas. This will produce an apparent maximum deviation of the source position of a few mas. In the most propitious circumstance, Gaia will be able to determine the mass of LAWD 37 to within a few per cent. This mass determination will
provide an independent check on atmospheric models of white dwarfs with helium-rich atmospheres, as well as tests of white dwarf mass-radius relationships and evolutionary theory.
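The predicted deflection follows from the standard astrometric microlensing relations (textbook forms, not expressions quoted from the paper): the angular Einstein radius of a lens of mass M_L at distance D_L acting on a source at distance D_S, and the centroid shift at angular separation Δθ ≫ θ_E,

    \theta_E = \sqrt{\frac{4 G M_L}{c^2} \left( \frac{1}{D_L} - \frac{1}{D_S} \right)},
    \qquad
    \delta\theta \approx \frac{\theta_E^2}{\Delta\theta}.

Measuring δθ and Δθ astrometrically therefore yields θ_E and, with the distances known, the lens mass M_L.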
Random qubit-states and how best to measure them
We consider the problem of measuring a single qubit, known to have been prepared in either a randomly selected pure state or a randomly selected real pure state. We seek the measurements that provide either the best estimate of the state prepared or maximise the accessible information. Surprisingly, any sensible measurement turns out to be optimal. We discuss the application of these ideas to multiple qubits and higher-dimensional systems.
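To illustrate the state-estimation setting numerically, here is a small Monte Carlo sketch (not code from the paper): a Haar-random pure qubit is measured in a fixed basis and the observed eigenstate is reported as the estimate, which reproduces the well-known mean fidelity of 2/3 for single-copy qubit estimation.

    # Sketch: mean fidelity of estimating a Haar-random pure qubit from one
    # projective measurement, guessing the observed eigenstate (expected: 2/3).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    # For Haar-random pure states cos(theta) is uniform on [-1, 1], so the
    # probability p0 = cos^2(theta/2) of outcome |0> is uniform on [0, 1].
    p0 = rng.uniform(0.0, 1.0, size=n)
    outcome_is_0 = rng.uniform(size=n) < p0
    # Fidelity of the guessed basis state with the true state.
    fidelity = np.where(outcome_is_0, p0, 1.0 - p0)
    print(f"mean fidelity = {fidelity.mean():.4f} (analytic value 2/3)")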
An Edgeworth expansion for finite population L-statistics
In this paper, we consider the one-term Edgeworth expansion for finite population L-statistics. We provide an explicit formula for the Edgeworth correction term and give sufficient conditions for the validity of the expansion, expressed in terms of the weight function that defines the statistics and of moment conditions.
Comment: 14 pages. Minor revisions. Some explanatory comments and a numerical example were added. Lith. Math. J. (to appear)
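For orientation, the classical one-term Edgeworth expansion for a standardized sum of n i.i.d. terms reads (this is the textbook form; the correction term derived in the paper for finite population L-statistics is more involved):

    P(T_n \le x) \approx \Phi(x) - \frac{\kappa_3}{6 \sqrt{n}}\,(x^2 - 1)\,\varphi(x),

where Φ and φ are the standard normal distribution function and density, and κ₃ is the third standardized cumulant of the summands.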