Adaptive Tag Selection for Image Annotation
Not all tags are relevant to an image, and the number of relevant tags is
image-dependent. Although many methods have been proposed for image
auto-annotation, the question of how to determine the number of tags to be
selected per image remains open. The main challenge is that for a large tag
vocabulary, there is often a lack of ground truth data for acquiring optimal
cutoff thresholds per tag. In contrast to previous works that pre-specify the
number of tags to be selected, we propose in this paper adaptive tag selection.
The key insight is to divide the vocabulary into two disjoint subsets, namely a
seen set consisting of tags having ground truth available for optimizing their
thresholds and a novel set consisting of tags without any ground truth. Such a
division allows us to estimate how many tags should be selected from the novel
set, based on the tags that have already been selected from the seen set. The
effectiveness of the proposed method is justified by our participation in the
ImageCLEF 2014 image annotation task. On a set of 2,065 test images with ground
truth available for 207 tags, the benchmark evaluation shows that compared to
the popular top-k strategy, which obtains an F-score of 0.122, adaptive tag
selection achieves a higher F-score of 0.223. Moreover, by treating the
underlying image annotation system as a black box, the new method can be used
as an easy plug-in to boost the performance of existing systems.
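As a rough illustration of the selection rule described above, the following Python sketch selects seen tags whose scores clear their individually optimized thresholds and then keeps a number of top-scoring novel tags proportional to the seen-set count. The names (adaptive_tag_selection, seen_thresholds, novel_ratio) and the simple proportional estimate are assumptions for illustration, not the authors' exact formulation.

def adaptive_tag_selection(scores, seen_thresholds, novel_tags, novel_ratio=1.0):
    """Select tags for one image from the scores of a black-box annotator.

    scores          : dict mapping tag -> confidence score for this image
    seen_thresholds : dict mapping each seen tag -> cutoff optimized on ground truth
    novel_tags      : collection of tags without any ground truth
    novel_ratio     : assumed proportionality between seen- and novel-set counts
    """
    # 1. Keep seen tags whose score clears their individually optimized threshold.
    selected_seen = [t for t, thr in seen_thresholds.items()
                     if scores.get(t, 0.0) >= thr]
    # 2. Estimate how many novel tags to keep from the number of seen tags kept.
    k_novel = round(novel_ratio * len(selected_seen))
    # 3. Take the top-scoring novel tags up to that estimate.
    ranked_novel = sorted(novel_tags, key=lambda t: scores.get(t, 0.0), reverse=True)
    return selected_seen + ranked_novel[:k_novel]

Because the rule only consumes per-tag scores, it can sit on top of any annotation system treated as a black box, which is the plug-in property noted above.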
AutoCompress: An Automatic DNN Structured Pruning Framework for Ultra-High Compression Rates
Structured weight pruning is a representative model compression technique of
DNNs to reduce the storage and computation requirements and accelerate
inference. An automatic hyperparameter determination process is necessary due
to the large number of flexible hyperparameters. This work proposes
AutoCompress, an automatic structured pruning framework with the following key
performance improvements: (i) effectively incorporate the combination of
structured pruning schemes in the automatic process; (ii) adopt the
state-of-the-art ADMM-based structured weight pruning as the core algorithm, and
propose an innovative additional purification step for further weight reduction
without accuracy loss; and (iii) develop an effective heuristic search method
enhanced by experience-based guided search, replacing the prior deep
reinforcement learning technique, which has an underlying incompatibility with the
target pruning problem. Extensive experiments on CIFAR-10 and ImageNet datasets
demonstrate that AutoCompress achieves ultra-high pruning rates, in both weight
count and FLOPs, that could not be achieved before. As an example,
AutoCompress outperforms the prior work on automatic model compression by up to
33x in pruning rate (120x reduction in the actual parameter count) under the
same accuracy. Significant inference speedups have been observed with the
AutoCompress framework in actual measurements on a smartphone. We release all
models of this work at an anonymous link: http://bit.ly/2VZ63dS.
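For readers unfamiliar with the operation being automated, the short NumPy sketch below shows generic magnitude-based structured pruning of whole convolutional filters at a given per-layer rate. It is only an illustrative stand-in: the actual AutoCompress pipeline uses ADMM-based structured pruning, an additional purification step, and an experience-guided heuristic search over per-layer rates, none of which is reproduced here.

import numpy as np

def prune_conv_filters(weight, prune_rate):
    # Zero out whole output filters of a conv weight tensor shaped (out, in, kh, kw).
    out_channels = weight.shape[0]
    n_prune = int(round(prune_rate * out_channels))
    if n_prune == 0:
        return weight.copy()
    # Rank filters by their L2 norm and zero the weakest ones as whole units,
    # which is what makes the pruning "structured" (hardware-friendly).
    norms = np.linalg.norm(weight.reshape(out_channels, -1), axis=1)
    weakest = np.argsort(norms)[:n_prune]
    pruned = weight.copy()
    pruned[weakest] = 0.0
    return pruned

# Example: prune 75% of the filters of a random 64 x 32 x 3 x 3 conv layer.
w = np.random.randn(64, 32, 3, 3)
w_pruned = prune_conv_filters(w, 0.75)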
ResNorm: Tackling Long-tailed Degree Distribution Issue in Graph Neural Networks via Normalization
Graph Neural Networks (GNNs) have attracted much attention due to their
ability to learn representations from graph-structured data. Despite the
successful applications of GNNs in many domains, the optimization of GNNs is
less well studied, and the performance on node classification heavily suffers
from the long-tailed node degree distribution. This paper focuses on improving
the performance of GNNs via normalization.
In detail, by studying the long-tailed distribution of node degrees in the
graph, we propose a novel normalization method for GNNs, which is termed
ResNorm (Reshaping the long-tailed distribution into a normal-like
distribution via normalization). The operation of ResNorm
reshapes the node-wise standard deviation (NStd) distribution so as to improve
the accuracy of tail nodes (i.e., low-degree nodes). We
provide a theoretical interpretation and empirical evidence for understanding
the mechanism of this reshaping operation. In addition to the long-tailed distribution
issue, over-smoothing is also a fundamental issue plaguing the community. To
this end, we analyze the behavior of the standard shift and prove that the
standard shift serves as a preconditioner on the weight matrix, increasing the
risk of over-smoothing. With the over-smoothing issue in mind, we design a
shift operation for ResNorm that simulates the degree-specific parameter
strategy in a low-cost manner. Extensive experiments have validated the
effectiveness of ResNorm on several node classification benchmark datasets.
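The abstract does not spell out the ResNorm formula, so the PyTorch sketch below shows one plausible reading of the NStd-reshaping idea: each node's centered features are rescaled so that its standard deviation becomes its original value raised to a power p in (0, 1), which compresses the long tail of the NStd distribution. The function name, the exponent p, and the power-law rescaling are assumptions for illustration only, not the paper's exact formulation.

import torch

def nstd_reshape(h, p=0.5, eps=1e-6):
    # h: (num_nodes, feature_dim) node representations from a GNN layer.
    mean = h.mean(dim=1, keepdim=True)
    std = h.std(dim=1, keepdim=True) + eps   # node-wise standard deviation (NStd)
    centered = h - mean
    # Dividing by std and multiplying by std**p makes the new node-wise std
    # roughly std**p, pulling a long-tailed NStd distribution toward a
    # narrower, more normal-like shape for p < 1.
    return centered / std * std.pow(p)

# Example: apply to random features of a 5-node graph with 8-dimensional embeddings.
h = torch.randn(5, 8)
h_reshaped = nstd_reshape(h)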
A family with Robertsonian translocation: a potential mechanism of speciation in humans
Background: Robertsonian translocations occur in approximately one in every 1000 newborns. Although most Robertsonian translocation carriers are healthy and have a normal lifespan, they are at increased risk of spontaneous abortions and of producing unbalanced gametes and, therefore, unbalanced offspring. Here we report a previously undescribed Robertsonian translocation. Case Presentation: We identified three Robertsonian translocation carriers in this family. Two were heterozygous translocation carriers of 45,XX or XY,der(14;15)(q10;q10), and their son was a homozygous translocation carrier with a 44,XY,der(14;15)(q10;q10),der(14;15)(q10;q10) karyotype. Chromosomal analysis of sperm showed that 99.7% of sperm from the homozygous translocation carrier were normal/balanced, while only 79.9% of sperm from the heterozygous translocation carrier were normal/balanced. There was a significantly higher frequency of sex-chromosome aneuploidy in the heterozygous translocation carrier. Conclusions: The reproductive fitness of Robertsonian translocation carriers is reduced. Robertsonian translocation homozygosity can be a potential mechanism of speciation in humans with 44 chromosomes.
Copper-catalyzed methylative difunctionalization of alkenes
Trifluoromethylative difunctionalization and hydrofunctionalization of unactivated alkenes have been developed into powerful synthetic methodologies. On the other hand, methylative difunctionalization of olefins remains an unexplored research field. We report in this paper the Cu-catalyzed alkoxy methylation and azido methylation of alkenes using dicumyl peroxide (DCP) and di-tert-butyl peroxide (DTBP) as methyl sources. Using functionalized alkenes bearing a tethered nucleophile (alcohol, carboxylic acid, or sulfonamide), methylative cycloetherification, lactonization, and cycloamination processes are subsequently developed for the construction of important heterocycles such as 2,2-disubstituted tetrahydrofurans, tetrahydropyrans, γ-lactones, and pyrrolidines with concurrent generation of a quaternary carbon center. The results of control experiments suggest that the 1,2-alkoxy methylation of alkenes proceeds through a radical-cation crossover mechanism, whereas the 1,2-azido methylation proceeds via a radical addition and Cu-mediated azide transfer process.
Fiber optic coherent laser radar 3D vision system
This coherent laser vision system (CLVS) will provide a substantial advance in high-speed computer vision performance to support robotic Environmental Management (EM) operations. The 3D system employs a compact fiber-optic-based scanner and operates at a 128 x 128 pixel frame at one frame per second with a range resolution of 1 mm over its 1.5 meter working range. Using acousto-optic deflectors, the scanner is completely randomly addressable. This can provide live 3D monitoring in situations where it is necessary to update the scene once per second. It can be used for decontamination and decommissioning operations in which robotic systems are altering the scene, such as in waste removal, surface scarifying, or equipment disassembly and removal. The fiber-optic coherent laser radar based system is immune to variations in lighting, color, or surface shading, which have plagued the reliability of existing 3D vision systems, while providing substantially superior range resolution.
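As a quick sanity check on the quoted figures, taken at face value, the small snippet below computes the implied pixel throughput and the range resolution expressed as a fraction of the working range; the numbers are copied from the abstract, and the variable names are ours.

frame_pixels = 128 * 128                     # 16,384 range samples per frame
frame_rate_hz = 1.0                          # one frame per second
pixel_rate = frame_pixels * frame_rate_hz    # ~16.4 k range samples per second
range_resolution_m = 1e-3                    # 1 mm range resolution
working_range_m = 1.5                        # 1.5 meter working range
relative_resolution = range_resolution_m / working_range_m   # ~0.067% of full range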