Seed, Expand and Constrain: Three Principles for Weakly-Supervised Image Segmentation
We introduce a new loss function for the weakly-supervised training of
semantic image segmentation models based on three guiding principles: to seed
with weak localization cues, to expand objects based on the information about
which classes can occur in an image, and to constrain the segmentations to
coincide with object boundaries. We show experimentally that training a deep
convolutional neural network using the proposed loss function leads to
substantially better segmentations than previous state-of-the-art methods on
the challenging PASCAL VOC 2012 dataset. We furthermore give insight into the
working mechanism of our method by a detailed experimental study that
illustrates how the segmentation quality is affected by each term of the
proposed loss function as well as their combinations.
Comment: ECCV 201
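A minimal NumPy sketch of the three loss terms can make the principles concrete. This is illustrative only, not the authors' code: per-class max pooling stands in for the paper's global weighted rank pooling in the expand term, and any edge-respecting refinement (a dense CRF in the paper) can play the reference role in the constrain term; all function names are hypothetical.

```python
import numpy as np

EPS = 1e-8

def seed_loss(probs, seeds):
    """Cross-entropy restricted to pixels carrying weak localization cues.
    probs: (H, W, C) class probabilities; seeds: (H, W) ints, -1 = no cue."""
    cue = seeds >= 0
    p = probs[cue]                                   # (N, C)
    picked = p[np.arange(p.shape[0]), seeds[cue]]    # prob of the cued class
    return -np.log(picked + EPS).mean()

def expand_loss(probs, image_labels):
    """Image-level term: classes present in the image should claim some
    pixels, absent classes none.  Per-class max pooling is a simplified
    stand-in for the paper's global weighted rank pooling."""
    score = probs.reshape(-1, probs.shape[-1]).max(axis=0)   # (C,)
    present = image_labels.astype(bool)
    loss = -np.log(score[present] + EPS).mean()
    if (~present).any():
        loss -= np.log(1.0 - score[~present] + EPS).mean()
    return loss

def constrain_loss(probs, refined):
    """KL divergence pulling the prediction toward a boundary-aware
    refinement of itself (a dense CRF in the paper)."""
    kl = refined * np.log((refined + EPS) / (probs + EPS))
    return kl.sum(axis=-1).mean()
```

The three terms are summed during training; note that the constrain term vanishes exactly when the prediction already agrees with its boundary-aware refinement.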
Concentrations and sources of polycyclic aromatic hydrocarbons in surface coastal sediments of the northern Gulf of Mexico
Zucheng Wang is with the Department of Geography, Northeast Normal University, Changchun, China.
Zucheng Wang and Zhanfei Liu are with the Marine Science Institute, The University of Texas at Austin, Port Aransas, TX, USA.
Kehui Xu is with the Department of Oceanography and Coastal Sciences and the Coastal Studies Institute, Louisiana State University, Baton Rouge, LA, USA.
Lawrence M. Mayer is with the School of Marine Sciences, University of Maine, Walpole, ME, USA.
Zulin Zhang is with The James Hutton Institute, Aberdeen, UK.
Alexander S. Kolker is with the Louisiana Universities Marine Consortium, Chauvin, LA, USA.
Wei Wu is with the Department of Coastal Sciences, Gulf Coast Research Laboratory, The University of Southern Mississippi, Ocean Springs, MS, USA.
Background: Coastal sediments in the northern Gulf of Mexico have a high potential of being contaminated by petroleum hydrocarbons, such as polycyclic aromatic hydrocarbons (PAHs), owing to extensive petroleum exploration and transportation activities. In this study we evaluated the spatial distribution and contamination sources of PAHs, as well as the bioavailable fraction of the bulk PAH pool, in surface marsh and shelf sediments (top 5 cm) of the northern Gulf of Mexico.
Results: PAH concentrations in this region ranged from 100 to 856 ng g−1, with the highest concentrations in Mississippi River mouth sediments, followed by marsh sediments, and the lowest concentrations in shelf sediments. PAH concentrations correlated positively with the atomic C/N ratios of sedimentary organic matter (OM), suggesting that terrestrial OM preferentially sorbs PAHs relative to marine OM. PAHs with 2 rings were more abundant than those with 5–6 rings in continental shelf sediments, while the opposite was found in marsh sediments. This distribution pattern suggests different contamination sources for shelf and marsh sediments. Based on diagnostic ratios of PAH isomers and principal component analysis, shelf sediment PAHs were petrogenic, while those from marsh sediments were pyrogenic. The proportions of bioavailable PAHs in total PAHs were low, ranging from 0.02% to 0.06%, with higher fractions in marsh than in shelf sediments.
Conclusion: Differences in PAH distribution and composition between marsh and shelf sediments were influenced by grain size, contamination sources, and the types of organic matter associated with the PAHs. PAH concentrations in the study area were below the effects range-low, suggesting a low risk to organisms and limited transfer of PAHs into the food web. From the source analysis, PAHs in shelf sediments mainly originated from direct petroleum contamination, while those in marsh sediments came from combustion of fossil fuels.
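The isomer-ratio step can be illustrated with two widely used diagnostic ratios, fluoranthene/(fluoranthene + pyrene) and anthracene/(anthracene + phenanthrene), with the conventional literature cutoffs (0.40 and 0.10, respectively, below which a petroleum origin is indicated). This is an illustrative sketch, not the authors' analysis code:

```python
def pah_source(fluoranthene, pyrene, anthracene, phenanthrene):
    """Classify a sediment sample as petrogenic or pyrogenic from two
    widely used PAH isomer diagnostic ratios (conventional cutoffs)."""
    fl_ratio = fluoranthene / (fluoranthene + pyrene)
    an_ratio = anthracene / (anthracene + phenanthrene)
    # Each ratio below its cutoff is one "vote" for a petroleum origin.
    petro_votes = (fl_ratio < 0.40) + (an_ratio < 0.10)
    if petro_votes == 2:
        return "petrogenic"
    if petro_votes == 0:
        return "pyrogenic"
    return "mixed"
```

In practice such ratios are computed per sample and then examined jointly, e.g. as inputs to the principal component analysis mentioned above.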
Detection-Recovery Gap for Planted Dense Cycles
Planted dense cycles are a type of latent structure that appears in many
applications, such as small-world networks in social sciences and sequence
assembly in computational biology. We consider a model where a dense cycle with
a given expected bandwidth and edge density is planted in an
Erdős–Rényi graph. We characterize the computational thresholds
for the associated detection and recovery problems for the class of low-degree
polynomial algorithms. In particular, a gap exists between the two thresholds
in a certain regime of parameters: for certain choices of the bandwidth and
density, the detection problem is computationally easy while the recovery
problem is hard for low-degree algorithms.
Comment: 40 pages, 1 figure
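The planted model itself is easy to simulate: place the n vertices in a hidden cyclic order, connect pairs within the bandwidth with the dense probability p, and all other pairs with the ambient Erdős–Rényi probability q. A small NumPy sketch (parameter names are illustrative, not the paper's notation):

```python
import numpy as np

def planted_dense_cycle(n, bandwidth, p, q, rng):
    """Sample an adjacency matrix with a dense cycle planted in G(n, q)."""
    order = rng.permutation(n)              # hidden cyclic order of the vertices
    pos = np.empty(n, dtype=int)
    pos[order] = np.arange(n)
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(pos[i] - pos[j])
            d = min(d, n - d)               # cyclic distance on the hidden order
            prob = p if d <= bandwidth else q
            if rng.random() < prob:
                adj[i, j] = adj[j, i] = 1
    return adj, order
```

With p = 1 and q = 0 every vertex connects to exactly its 2·bandwidth nearest neighbours on the hidden cycle. Detection asks whether such a planted structure is present at all; recovery asks for the hidden order itself, which is the harder task in the regime described above.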
Histogram-based models on non-thin section chest CT predict invasiveness of primary lung adenocarcinoma subsolid nodules.
A total of 109 pathologically proven subsolid nodules (SSNs) were segmented by two readers on non-thin-section chest CT with lung nodule analysis software, followed by extraction of CT attenuation histogram and geometric features. Functional data analysis of the histograms provided data-driven features (FPC1, FPC2, FPC3) used in further model building. Nodules were classified as pre-invasive (P1, atypical adenomatous hyperplasia and adenocarcinoma in situ), minimally invasive (P2), or invasive adenocarcinomas (P3). P1 and P2 were grouped together (T1) versus P3 (T2). Various combinations of features were compared in predictive models for binary nodule classification (T1/T2) using multiple logistic regression and non-linear classifiers. Area under the ROC curve (AUC) was used as the diagnostic performance criterion. Inter-reader variability was assessed using Cohen's kappa and the intra-class correlation coefficient (ICC). Three models predicting invasiveness of SSNs were selected based on AUC. The first model included the 87.5th percentile of CT lesion attenuation (Q.875), interquartile range (IQR), volume, and maximum/minimum diameter ratio (AUC: 0.89, 95% CI: [0.75, 1]). The second model included FPC1, volume, and diameter ratio (AUC: 0.91, 95% CI: [0.77, 1]). The third model included FPC1, FPC2, and volume (AUC: 0.89, 95% CI: [0.73, 1]). Inter-reader agreement was excellent (kappa: 0.95, ICC: 0.98). Parsimonious models using histogram and geometric features differentiated invasive from minimally invasive/pre-invasive SSNs with good predictive performance on non-thin-section CT.
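The feature-and-AUC pipeline can be sketched in a few lines of NumPy: extract the 87.5th percentile (Q.875) and interquartile range from a nodule's attenuation values, then score any classifier output with a rank-based (Mann–Whitney) AUC. This is an illustrative sketch only, with hypothetical names, and the AUC ignores tied scores:

```python
import numpy as np

def histogram_features(hu):
    """Q.875 and IQR of a nodule's CT attenuation values (in HU)."""
    q875 = np.percentile(hu, 87.5)
    iqr = np.percentile(hu, 75) - np.percentile(hu, 25)
    return q875, iqr

def auc(scores, labels):
    """Rank-based (Mann-Whitney) AUC; assumes no tied scores."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)   # rank 1 = lowest score
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

In a real pipeline these features would feed a fitted logistic regression; the rank formulation makes clear that AUC depends only on how the model orders the nodules, not on the scores' scale.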
Can the Copernican principle be tested by cosmic neutrino background?
The Copernican principle, which states that we do not occupy any special place in
our universe, is usually taken for granted in modern cosmology. However, recent
supernova observations indicate that we may live at the under-dense center of
our universe, which challenges the Copernican principle. It thus becomes urgent
and important to test the Copernican principle with cosmological observations.
Taking into account that, unlike cosmic photons, cosmic neutrinos of different
energies reach us from different places along different worldlines, we here
propose the cosmic neutrino background as a test of the Copernican principle.
We show that, from a theoretical perspective, the cosmic neutrino background
allows one to determine whether the Copernican principle is valid, but
implementing such an observation calls for larger neutrino detectors.
Comment: JHEP style, 10 pages, 4 figures, version to appear in JCA
Trapping Hydrogen Atoms From a Neon-Gas Matrix: A Theoretical Simulation
Hydrogen is of critical importance in atomic and molecular physics, and the development of a simple and efficient technique for trapping cold and ultracold hydrogen atoms would be a significant advance. In this study we simulate a recently proposed trap-loading mechanism for hydrogen atoms released from a neon matrix. Accurate ab initio quantum calculations of the neon-hydrogen interaction potential are reported, and the energy- and angular-dependent elastic scattering cross sections that control the energy transfer of the initially cold atoms are obtained. These are then used to construct the Boltzmann kinetic equation describing the energy relaxation process. Numerical solutions of the Boltzmann equation predict the time evolution of the hydrogen energy distribution function. Based on the simulations, we discuss the prospects of the technique.
Relaxation of energetic S(1D) atoms in Xe gas: Comparison of ab initio calculations with experimental data
In this paper we report our investigation of the translational energy relaxation of fast S(1D) atoms in a Xe thermal bath. The Xe-S interaction potential was constructed using ab initio methods, and total and differential cross sections were then calculated. The latter were incorporated into the kernel of the Boltzmann equation describing the energy relaxation process. The solution of the Boltzmann equation was obtained, and the results were compared with those reported in experiments [G. Nan and P. L. Houston, J. Chem. Phys. 97, 7865 (1992)]. Good agreement with the measured time-dependent relative velocity of the fast S(1D) atoms was obtained except at long relaxation times; the discrepancy may be due to error accumulation caused by the hard-sphere approximation and the Monte Carlo analysis used for the experimental data. Our accurate description of the energy relaxation process increased the number of collisions required to achieve equilibrium by an order of magnitude compared with the hard-sphere value.
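The relaxation the paper treats with the Boltzmann equation can be illustrated by a much cruder per-collision Monte Carlo: a fast atom undergoes isotropic elastic collisions with bath atoms drawn from a Maxwellian, and its mean kinetic energy relaxes toward the equipartition value (3/2)kT. This sketch (units with k_B = 1, illustrative function and parameter names) deliberately ignores the speed-dependent collision rate and realistic differential cross sections that the full kernel captures:

```python
import numpy as np

def relax_energies(m_fast, m_bath, e0, t_bath, n_coll, rng):
    """Kinetic energy of a fast atom after each of n_coll isotropic
    elastic collisions with bath atoms from a Maxwellian at t_bath."""
    v = np.zeros(3)
    v[0] = np.sqrt(2.0 * e0 / m_fast)            # initial speed along x
    energies = [e0]
    for _ in range(n_coll):
        vb = rng.normal(0.0, np.sqrt(t_bath / m_bath), 3)  # bath atom velocity
        v_cm = (m_fast * v + m_bath * vb) / (m_fast + m_bath)
        g = np.linalg.norm(v - vb)               # relative speed (conserved)
        n_hat = rng.normal(size=3)
        n_hat /= np.linalg.norm(n_hat)           # isotropic CM scattering direction
        v = v_cm + (m_bath / (m_fast + m_bath)) * g * n_hat
        energies.append(0.5 * m_fast * (v @ v))
    return np.array(energies)
```

Averaging many trajectories for masses in the S/Xe ratio shows the mean energy settling near 1.5·t_bath within a few hundred collisions; the paper's point is that an accurate kernel changes how many collisions that equilibration actually takes.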
Gravitational Lensing by Wormholes
Gravitational lensing by traversable Lorentzian wormholes is a new possibility
which is analyzed here in the strong field limit. Wormhole solutions are
considered in the Einstein minimally coupled theory and in the brane world
model. The observables in both the theories show significant differences from
those arising in the Schwarzschild black hole lensing. As a corollary, it
follows that wormholes with zero Keplerian mass exhibit lensing properties
which are qualitatively (though not quantitatively) the same as those of a
Schwarzschild black hole. Some special features of the considered solutions are
pointed out.
Comment: 20 pages, no figure