Analysis of Computer Science Communities Based on DBLP
It has become popular to bring techniques from bibliometrics and
scientometrics into the world of digital libraries to analyze collaboration
patterns and explore the mechanisms that underlie community development. In this
paper we use DBLP data to investigate authors' scientific careers and
provide an in-depth exploration of several computer science communities. We
compare them in terms of productivity, population stability and collaboration
trends. Besides, we use these features to compare the sets of top-ranked
conferences with their lower-ranked counterparts.
Comment: 9 pages, 7 figures, 6 tables
Spectrum Sharing in Wireless Networks via QoS-Aware Secondary Multicast Beamforming
Secondary spectrum usage has the potential to considerably increase spectrum utilization. In this paper, quality-of-service (QoS)-aware spectrum underlay of a secondary multicast network is considered. A multiantenna secondary access point (AP) is used for multicast (common information) transmission to a number of secondary single-antenna receivers. The idea is that beamforming can be used to steer power towards the secondary receivers while limiting sidelobes that cause interference to primary receivers. Various optimal formulations of beamforming are proposed, motivated by different "cohabitation" scenarios, including robust designs that are applicable with inaccurate or limited channel state information at the secondary AP. These formulations are NP-hard computational problems; yet it is shown how convex approximation-based multicast beamforming tools (originally developed without regard to primary interference constraints) can be adapted to work in a spectrum underlay context. Extensive simulation results demonstrate the effectiveness of the proposed approaches and provide insights on the tradeoffs between different design criteria
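To illustrate the convex-approximation idea in general terms (semidefinite relaxation is one standard route; the paper's exact formulations differ), here is a minimal Python/cvxpy sketch: minimise transmit power subject to a QoS floor at each secondary receiver and an interference cap at each primary receiver, then recover an approximate beamformer from the relaxed solution. All channels and thresholds are made-up placeholders.

```python
import numpy as np
import cvxpy as cp

# Hypothetical sizes and random channels (placeholders, not from the paper).
M, Ks, Kp = 4, 6, 2            # AP antennas, secondary receivers, primary receivers
rng = np.random.default_rng(0)
H = (rng.standard_normal((Ks, M)) + 1j * rng.standard_normal((Ks, M))) / np.sqrt(2)
G = (rng.standard_normal((Kp, M)) + 1j * rng.standard_normal((Kp, M))) / np.sqrt(2)
gamma, eps = 1.0, 0.1          # secondary QoS target, primary interference cap

# Semidefinite relaxation: optimise the covariance X = w w^H with the rank-1
# constraint dropped, which turns the NP-hard problem into a convex SDP.
X = cp.Variable((M, M), hermitian=True)
constraints = [X >> 0]
constraints += [cp.real(H[i].conj() @ X @ H[i]) >= gamma for i in range(Ks)]
constraints += [cp.real(G[j].conj() @ X @ G[j]) <= eps for j in range(Kp)]
problem = cp.Problem(cp.Minimize(cp.real(cp.trace(X))), constraints)
problem.solve()

# Crude rank-1 extraction: principal eigenvector of the relaxed solution.
# (Gaussian randomisation would typically be used to tighten this step.)
vals, vecs = np.linalg.eigh(X.value)
w_approx = np.sqrt(vals[-1]) * vecs[:, -1]
```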
Constant-factor approximations for asymmetric TSP on nearly-embeddable graphs
In the Asymmetric Traveling Salesperson Problem (ATSP) the goal is to find a
closed walk of minimum cost in a directed graph visiting every vertex. We
consider the approximability of ATSP on topologically restricted graphs. It has
been shown by [Oveis Gharan and Saberi 2011] that there exist polynomial-time
constant-factor approximations on planar graphs and more generally graphs of
constant orientable genus. This result was extended to non-orientable genus by
[Erickson and Sidiropoulos 2014].
We show that for any class of \emph{nearly-embeddable} graphs, ATSP admits a
polynomial-time constant-factor approximation. More precisely, we show that for
any fixed $k \geq 0$, there exist $\alpha, \beta > 0$, such that ATSP on
$n$-vertex $k$-nearly-embeddable graphs admits an $\alpha$-approximation in time
$O(n^\beta)$. The class of $k$-nearly-embeddable graphs contains graphs with at
most $k$ apices, $k$ vortices of width at most $k$, and an underlying surface
of either orientable or non-orientable genus at most $k$. Prior to our work,
even the case of graphs with a single apex was open. Our algorithm combines
tools from rounding the Held-Karp LP via thin trees with dynamic programming.
We complement our upper bounds by showing that solving ATSP exactly on graphs
of pathwidth $k$ (and hence on $k$-nearly embeddable graphs) requires time
$n^{\Omega(k)}$, assuming the Exponential-Time Hypothesis (ETH). This is
surprising in light of the fact that both TSP on undirected graphs and Minimum
Cost Hamiltonian Cycle on directed graphs are FPT parameterized by treewidth
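The paper's constant-factor approximation is well beyond a short snippet, but the exponential-time baseline it contrasts with is easy to state. Below is a minimal sketch of the classical Held-Karp dynamic program, which solves ATSP exactly in $O(2^n n^2)$ time on any directed cost matrix (the `cost` argument is a hypothetical list of lists):

```python
def atsp_held_karp(cost):
    """Exact ATSP by Held-Karp dynamic programming. cost[i][j] is the cost of
    the directed edge i -> j; returns the cost of a cheapest closed walk that
    starts and ends at vertex 0 and visits every vertex."""
    n = len(cost)
    FULL = (1 << n) - 1
    INF = float("inf")
    # dp[mask][j]: cheapest path from 0 visiting exactly the vertices in mask, ending at j
    dp = [[INF] * n for _ in range(1 << n)]
    dp[1][0] = 0
    for mask in range(1 << n):
        if not (mask & 1):
            continue
        for j in range(n):
            if dp[mask][j] == INF or not (mask >> j) & 1:
                continue
            for k in range(n):
                if (mask >> k) & 1:
                    continue
                nmask = mask | (1 << k)
                cand = dp[mask][j] + cost[j][k]
                if cand < dp[nmask][k]:
                    dp[nmask][k] = cand
    # Close the tour by returning to vertex 0.
    return min(dp[FULL][j] + cost[j][0] for j in range(1, n))
```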
CloudFCN: Accurate and robust cloud detection for satellite imagery with deep learning
Cloud masking is of central importance to the Earth Observation community. This paper deals with the problem of detecting clouds in visible and multispectral imagery from high-resolution satellite cameras. Recently, Machine Learning has offered promising solutions to the problem of cloud masking, allowing for more flexibility than traditional thresholding techniques, which are restricted to instruments with the requisite spectral bands. However, few studies use multi-scale features (i.e., a combination of pixel-level and spatial features) whilst also offering compelling experimental evidence for real-world performance. Therefore, we introduce CloudFCN, based on the Fully Convolutional Network architecture known as U-net, which has become a standard Deep Learning approach to image segmentation. It fuses the shallowest and deepest layers of the network, thus routing low-level visible content to its deepest layers. We offer an extensive range of experiments on this, including data from two high-resolution sensors, Carbonite-2 and Landsat 8, and several complementary tests. Owing to a variety of performance-enhancing design choices and training techniques, it exhibits state-of-the-art performance where comparisons with other methods are available, high speed, and robustness to many different terrains and sensor types
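For readers unfamiliar with the skip-connection idea referred to above, here is a deliberately tiny U-Net-style segmentation network in PyTorch. It is not the authors' CloudFCN; channel counts and the 4-band input are placeholder choices. The point it illustrates is the fusion of shallow and deep layers: decoder features are concatenated with the matching encoder features so low-level detail reaches the deeper layers.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal U-Net-style FCN: encoder, one downsampling stage, and a decoder
    whose upsampled features are concatenated with the encoder features
    (skip connection) before the per-pixel classification head."""
    def __init__(self, in_ch=4, n_classes=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, 16, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, n_classes, 1)   # per-pixel cloud / clear logits

    def forward(self, x):                          # x: (B, in_ch, H, W), H and W even
        e = self.enc(x)
        m = self.mid(self.down(e))
        u = self.up(m)
        d = self.dec(torch.cat([u, e], dim=1))     # skip connection fuses shallow + deep
        return self.head(d)
```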
Fat polygonal partitions with applications to visualization and embeddings
Let T be a rooted and weighted tree, where the weight of any node is equal to the sum of the weights of its children. The popular Treemap algorithm visualizes such a tree as a hierarchical partition of a square into rectangles, where the area of the rectangle corresponding to any node in T is equal to the weight of that node. The aspect ratio of the rectangles in such a rectangular partition necessarily depends on the weights and can become arbitrarily high. We introduce a new hierarchical partition scheme, called a polygonal partition, which uses convex polygons rather than just rectangles. We present two methods for constructing polygonal partitions, both having guarantees on the worst-case aspect ratio of the constructed polygons; in particular, both methods guarantee a bound on the aspect ratio that is independent of the weights of the nodes. We also consider rectangular partitions with slack, where the areas of the rectangles may differ slightly from the weights of the corresponding nodes. We show that this makes it possible to obtain partitions with constant aspect ratio. This result generalizes to hyper-rectangular partitions in R^d. We use these partitions with slack for embedding ultrametrics into d-dimensional Euclidean space: we give a polylog(Δ)-approximation algorithm for embedding n-point ultrametrics into R^d with minimum distortion, where Δ denotes the spread of the metric. The previously best-known approximation ratio for this problem was polynomial in n. This is the first algorithm for embedding a non-trivial family of weighted-graph metrics into a space of constant dimension that achieves polylogarithmic approximation ratio
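The rectangular baseline that the polygonal scheme improves on is simple to sketch. Below is the classic slice-and-dice treemap (not the paper's polygonal partition): it splits a rectangle among a node's children proportionally to their weights, alternating cut direction per level, which preserves areas exactly but lets aspect ratios grow without bound, exactly the weakness the fat polygonal partitions address. The tree encoding `(weight, [children])` is a placeholder representation.

```python
def slice_and_dice(node, rect, depth=0):
    """Classic slice-and-dice treemap. node = (weight, [children]);
    rect = (x, y, w, h). Yields one rectangle per leaf, with area
    proportional to the leaf's weight."""
    weight, children = node
    if not children:
        yield rect
        return
    x, y, w, h = rect
    total = sum(c[0] for c in children)
    offset = 0.0
    for child in children:
        frac = child[0] / total
        if depth % 2 == 0:                      # vertical cuts at even depths
            sub = (x + offset * w, y, frac * w, h)
        else:                                   # horizontal cuts at odd depths
            sub = (x, y + offset * h, w, frac * h)
        offset += frac
        yield from slice_and_dice(child, sub, depth + 1)
```

For example, `list(slice_and_dice((10, [(6, []), (4, [(3, []), (1, [])])]), (0.0, 0.0, 1.0, 1.0)))` returns one rectangle per leaf; a long chain of unbalanced weights makes those rectangles arbitrarily thin, which is the aspect-ratio problem discussed above.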
Metabolic syndrome in rheumatic diseases: epidemiology, pathophysiology, and clinical implications
Subjects with metabolic syndrome–a constellation of cardiovascular risk factors of which central obesity and insulin resistance are the most characteristic–are at increased risk for developing diabetes mellitus and cardiovascular disease. In these subjects, abdominal adipose tissue is a source of inflammatory cytokines such as tumor necrosis factor-alpha, known to promote insulin resistance. The presence of inflammatory cytokines together with the well-documented increased risk for cardiovascular diseases in patients with inflammatory arthritides and systemic lupus erythematosus has prompted studies to examine the prevalence of the metabolic syndrome in an effort to identify subjects at risk in addition to that conferred by traditional cardiovascular risk factors. These studies have documented a high prevalence of metabolic syndrome which correlates with disease activity and markers of atherosclerosis. The correlation of inflammatory disease activity with metabolic syndrome provides additional evidence for a link between inflammation and metabolic disturbances/vascular morbidity
SEnSeI: A Deep Learning Module for Creating Sensor Independent Cloud Masks
We introduce a novel neural network architecture -- Spectral ENcoder for SEnsor Independence (SEnSeI) -- by which several multispectral instruments, each with different combinations of spectral bands, can be used to train a generalised deep learning model. We focus on the problem of cloud masking, using several pre-existing datasets, and a new, freely available dataset for Sentinel-2. Our model is shown to achieve state-of-the-art performance on the satellites it was trained on (Sentinel-2 and Landsat 8), and is able to extrapolate to sensors it has not seen during training such as Landsat 7, Per\'uSat-1, and Sentinel-3 SLSTR. Model performance is shown to improve when multiple satellites are used in training, approaching or surpassing the performance of specialised, single-sensor models. This work is motivated by the fact that the remote sensing community has access to data taken with a hugely variety of sensors. This has inevitably led to labelling efforts being undertaken separately for different sensors, which limits the performance of deep learning models, given their need for huge training sets to perform optimally. Sensor independence can enable deep learning models to utilise multiple datasets for training simultaneously, boosting performance and making them much more widely applicable. This may lead to deep learning approaches being used more frequently for on-board applications and in ground segment data processing, which generally require models to be ready at launch or soon afterwards
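The abstract does not spell out the architecture, but the general idea of a sensor-independent front end can be illustrated with a small, hypothetical PyTorch sketch: embed each band together with a descriptor of that band (here just its central wavelength) and pool over bands with a permutation-invariant operation, so the output shape no longer depends on which or how many bands a sensor provides. This is one plausible reading of sensor independence, not SEnSeI's actual design; all names and sizes are invented.

```python
import torch
import torch.nn as nn

class SpectralEncoder(nn.Module):
    """Hypothetical sensor-independent front end: each band's reflectance is paired
    with that band's central wavelength, embedded independently, then max-pooled
    over bands so any number/combination of bands maps to a fixed-size feature map."""
    def __init__(self, d=32):
        super().__init__()
        self.embed = nn.Sequential(nn.Linear(2, d), nn.ReLU(), nn.Linear(d, d))

    def forward(self, reflectance, wavelength_um):
        # reflectance: (B, C, H, W); wavelength_um: (C,) central wavelength per band
        B, C, H, W = reflectance.shape
        wl = wavelength_um.view(1, C, 1, 1).expand(B, C, H, W)
        per_band = torch.stack([reflectance, wl], dim=-1)    # (B, C, H, W, 2)
        feats = self.embed(per_band)                          # (B, C, H, W, d)
        fused = feats.max(dim=1).values                        # pool over bands -> (B, H, W, d)
        return fused.permute(0, 3, 1, 2)                       # (B, d, H, W) for a downstream FCN
```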
Generalized h-index for Disclosing Latent Facts in Citation Networks
What is the value of a scientist, and what is their impact on scientific thinking?
How can we measure the prestige of a journal or of a conference? The evaluation
of the scientific work of a scientist and the estimation of the quality of a
journal or conference has long attracted significant interest, due to the
benefits from obtaining an unbiased and fair criterion. Although it appears to
be simple, defining a quality metric is not an easy task. To overcome the
disadvantages of the present metrics used for ranking scientists and journals,
J.E. Hirsch proposed a pioneering metric, the now famous h-index. In this
article, we demonstrate several inefficiencies of this index and develop a pair
of generalizations and effective variants of it to deal with scientist ranking
and with publication forum ranking. The new citation indices are able to
disclose trendsetters in scientific research, as well as researchers that
constantly shape their field with their influential work, no matter how old
they are. We exhibit the effectiveness and the benefits of the new indices to
unfold the full potential of the h-index, with extensive experimental results
obtained from DBLP, a widely known on-line digital library.
Comment: 19 pages, 17 tables, 27 figures
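For reference, the baseline h-index that the article generalises is simple to compute: the largest h such that at least h of an author's papers have at least h citations each. A minimal sketch (the article's generalised variants are not reproduced here):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Example: citation counts [10, 8, 5, 4, 3] give h = 4.
assert h_index([10, 8, 5, 4, 3]) == 4
```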