Link-Prediction Enhanced Consensus Clustering for Complex Networks
Many real networks that are inferred or collected from data are incomplete
due to missing edges. Missing edges can be inherent to the dataset (Facebook
friend links will never be complete) or the result of sampling (one may only
have access to a portion of the data). The consequence is that downstream
analyses that consume the network will often yield less accurate results than
if the edges were complete. Community detection algorithms, in particular,
often suffer when critical intra-community edges are missing. We propose a
novel consensus clustering algorithm to enhance community detection on
incomplete networks. Our framework utilizes existing community detection
algorithms that process networks imputed by our link prediction based
algorithm. The framework then merges their multiple outputs into a final
consensus output. On average, our method boosts the performance of existing
algorithms by 7% on artificial data and by 17% on ego networks collected from
Facebook.
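The pipeline above (impute likely missing edges, cluster each imputed view, merge by consensus) can be sketched with networkx. This is an illustrative reconstruction rather than the authors' code; the Jaccard predictor, the number of imputed views, the fraction of edges added, and the majority-vote threshold are all assumptions:

```python
from itertools import combinations

import networkx as nx
import numpy as np
from networkx.algorithms.community import greedy_modularity_communities

def consensus_communities(G, n_views=5, add_frac=0.05, seed=0):
    """Impute edges via Jaccard link prediction, cluster each imputed
    view, then merge the partitions through a co-association matrix."""
    rng = np.random.default_rng(seed)
    nodes = list(G.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    coassoc = np.zeros((len(nodes), len(nodes)))

    # Rank non-edges by Jaccard similarity (one of many possible predictors).
    scores = sorted(nx.jaccard_coefficient(G), key=lambda t: -t[2])
    k = max(1, int(add_frac * G.number_of_edges()))

    for _ in range(n_views):
        H = G.copy()
        # Each view adds a random sample of the top-ranked predicted edges.
        top = scores[:3 * k]
        chosen = rng.choice(len(top), size=min(k, len(top)), replace=False)
        H.add_edges_from((top[c][0], top[c][1]) for c in chosen)
        for comm in greedy_modularity_communities(H):
            for u, v in combinations(comm, 2):
                coassoc[idx[u], idx[v]] += 1
                coassoc[idx[v], idx[u]] += 1

    # Consensus: keep node pairs that co-cluster in a majority of views.
    C = nx.Graph()
    C.add_nodes_from(nodes)
    C.add_edges_from((nodes[i], nodes[j])
                     for i, j in combinations(range(len(nodes)), 2)
                     if coassoc[i, j] / n_views > 0.5)
    return list(nx.connected_components(C))
```

On a small graph with a few edges removed, the consensus partition obtained this way is typically more stable than any single clustering run.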
icet - A Python library for constructing and sampling alloy cluster expansions
Alloy cluster expansions (CEs) provide an accurate and computationally
efficient mapping of the potential energy surface of multi-component systems
that enables comprehensive sampling of the many-dimensional configuration
space. Here, we introduce \textsc{icet}, a flexible, extensible, and
computationally efficient software package for the construction and sampling of
CEs. \textsc{icet} is largely written in Python for easy integration in
comprehensive workflows, including first-principles calculations for the
generation of reference data and machine learning libraries for training and
validation. The package enables training using a variety of linear regression
algorithms with and without regularization, Bayesian regression, feature
selection, and cross-validation. It also provides complementary functionality
for structure enumeration and mapping as well as data management and analysis.
Potential applications are illustrated by two examples: the computation of the
phase diagram of a prototypical metallic alloy and the analysis of chemical
ordering in an inorganic semiconductor.
Comment: 10 pages
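The regression step of CE construction can be illustrated in generic terms with scikit-learn, independent of icet's own API (which is not reproduced here). The cluster-correlation matrix, the sparse effective cluster interactions (ECIs), and the noise level below are synthetic assumptions:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows are alloy configurations, columns are
# cluster-correlation functions; y holds reference energies (eV/atom),
# e.g. from first-principles calculations.
rng = np.random.default_rng(42)
n_structures, n_clusters = 200, 30
X = rng.uniform(-1, 1, size=(n_structures, n_clusters))
true_ecis = np.zeros(n_clusters)
true_ecis[:6] = [0.2, -0.1, 0.05, -0.03, 0.02, -0.01]  # sparse ECIs
y = X @ true_ecis + rng.normal(0, 1e-3, n_structures)

# L1 regularization selects the few clusters with significant ECIs;
# cross-validation picks the penalty strength.
model = LassoCV(cv=5).fit(X, y)
rmse = -cross_val_score(model, X, y,
                        scoring="neg_root_mean_squared_error", cv=5).mean()
print(f"nonzero ECIs: {np.sum(np.abs(model.coef_) > 1e-6)}, CV-RMSE: {rmse:.4f}")
```

The same fit-and-cross-validate pattern applies whichever of the linear, regularized, or Bayesian regression schemes mentioned above is chosen.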
Unsupervised learning as a complement to convolutional neural network classification in the analysis of saccadic eye movement in spino-cerebellar ataxia type 2
IWANN is an international congress held biennially since 1991. Its scope covers the foundations and applications of Computational Intelligence techniques: artificial neural networks, genetic algorithms, fuzzy logic, and machine learning. This edition gathered 150 researchers.
This paper aims at assessing spino-cerebellar ataxia type 2 by classifying electrooculography records into registers corresponding to healthy, presymptomatic, and ill individuals. The primary technique used is a convolutional neural network applied to the time series of eye movements, called saccades. The problem is exceptionally hard, however, because the recorded saccadic movements for presymptomatic cases often do not differ substantially from those of healthy individuals. Precisely
this distinction is of the utmost clinical importance, since early intervention in presymptomatic patients can ameliorate symptoms or at least slow their progression. Yet each register contains a number of saccades that, although not consistent with the current label, have not been considered indicative of another class by the examining physicians. As a consequence, an unsupervised learning mechanism may be more suitable to handle this form of misclassification. Thus, our proposal introduces the
k-means approach and the SOM method as complementary techniques to analyse the time series. The three techniques operating in tandem lead to a well-performing solution to this diagnosis problem.
Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.
Universidad de Granada, Universitat Politècnica de Catalunya, Universidad de Las Palmas de Gran Canaria, Springer
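The unsupervised side of such a pipeline can be sketched with k-means from scikit-learn. The saccade descriptors (amplitude, peak velocity, duration) and the simulated traces below are illustrative assumptions, not the paper's actual feature set or data:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def saccade_features(position, dt=0.005):
    """Toy descriptors for one saccade trace (positions in degrees,
    sampled at 1/dt Hz): amplitude, peak velocity, duration."""
    velocity = np.gradient(position, dt)
    return [position.max() - position.min(),
            np.abs(velocity).max(),
            len(position) * dt]

rng = np.random.default_rng(0)
# Simulated records: "slow" saccades (standing in for a pathological
# pattern) vs "fast" ones; real data would come from EOG registers.
slow = [np.cumsum(rng.uniform(0.0, 0.2, 60)) for _ in range(40)]
fast = [np.cumsum(rng.uniform(0.5, 1.0, 60)) for _ in range(40)]
X = StandardScaler().fit_transform(
    [saccade_features(s) for s in slow + fast])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
# The two simulated populations should fall into separate clusters.
print(labels[:40].mean(), labels[40:].mean())
```

A SOM would play an analogous role here, mapping the same descriptors onto a 2-D grid for visual inspection of the cluster structure.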
Microbial community pattern detection in human body habitats via ensemble clustering framework
The human habitat is a host where microbial species evolve, function, and
continue to evolve. Elucidating how microbial communities respond to human
habitats is a fundamental and critical task, as establishing baselines of human
microbiome is essential in understanding its role in human disease and health.
However, current studies usually overlook the complex and interconnected
landscape of the human microbiome, restricting analysis to particular body
habitats with learning models built for a specific criterion. As a result,
these methods cannot effectively capture the underlying real-world microbial
patterns. To obtain a
comprehensive view, we propose a novel ensemble clustering framework to mine
the structure of microbial community pattern on large-scale metagenomic data.
Particularly, we first build a microbial similarity network via integrating
1920 metagenomic samples from three body habitats of healthy adults. Then a
novel symmetric Nonnegative Matrix Factorization (NMF) based ensemble model is
proposed and applied to the network to detect clustering patterns. Extensive
experiments are conducted to evaluate the effectiveness of our model in
deriving microbial communities with respect to body habitat and host gender.
From the clustering results, we observe that body habitat exhibits strong but
non-unique microbial structural patterns. Meanwhile, the human microbiome
reveals different degrees of structural variation across body habitat and host
gender. In summary, our ensemble clustering framework can efficiently explore
integrated clustering results to accurately identify microbial communities and
provide a comprehensive view of a set of microbial communities. Such trends
depict an integrated biography of microbial communities, offering new insight
towards uncovering the pathogenic model of the human microbiome.
Comment: BMC Systems Biology 201
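The core factorization can be sketched as follows. This is a minimal symmetric-NMF routine with damped multiplicative updates, not the paper's ensemble model, and the block-diagonal similarity matrix is a toy stand-in for the integrated microbial network:

```python
import numpy as np

def symmetric_nmf(A, k, n_iter=300, seed=0):
    """Factor a nonnegative similarity matrix A ~ H @ H.T with H >= 0
    via multiplicative updates; cluster membership is read off as the
    argmax over the k columns of H."""
    rng = np.random.default_rng(seed)
    H = rng.uniform(0, 1, size=(A.shape[0], k))
    for _ in range(n_iter):
        numer = A @ H
        denom = H @ (H.T @ H) + 1e-10
        # Damped multiplicative update (keeps H nonnegative).
        H *= 0.5 + 0.5 * numer / denom
    return H

# Block-diagonal similarity: two communities of 5 samples each.
A = np.kron(np.eye(2), np.ones((5, 5)))
H = symmetric_nmf(A, k=2)
labels = H.argmax(axis=1)
print(labels)
```

In an ensemble setting, one would run such a factorization over multiple base clusterings or resampled networks and merge the resulting memberships.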
Status and Future Perspectives for Lattice Gauge Theory Calculations to the Exascale and Beyond
In this and a set of companion whitepapers, the USQCD Collaboration lays out
a program of science and computing for lattice gauge theory. These whitepapers
describe how calculations using lattice QCD (and other gauge theories) can aid
the interpretation of ongoing and upcoming experiments in particle and nuclear
physics, as well as inspire new ones.
Comment: 44 pages. 1 of USQCD whitepapers
Towards Operator-less Data Centers Through Data-Driven, Predictive, Proactive Autonomics
Continued reliance on human operators for managing data centers is a major
impediment to their ever reaching extreme dimensions. Large computer
systems in general, and data centers in particular, will ultimately be managed
using predictive computational and executable models obtained through
data-science tools, and at that point, the intervention of humans will be
limited to setting high-level goals and policies rather than performing
low-level operations. Data-driven autonomics, where management and control are
based on holistic predictive models that are built and updated using live data,
opens one possible path towards limiting the role of operators in data centers.
In this paper, we present a data-science study of a public Google dataset
collected in a 12K-node cluster with the goal of building and evaluating
predictive models for node failures. Our results support the practicality of a
data-driven approach by showing the effectiveness of predictive models based on
data found in typical data center logs. We use BigQuery, the big data SQL
platform from the Google Cloud suite, to process massive amounts of data and
generate a rich feature set characterizing node state over time. We describe
how an ensemble classifier can be built out of many Random Forest classifiers
each trained on these features, to predict if nodes will fail in a future
24-hour window. Our evaluation reveals that if we limit false positive rates to
5%, we can achieve true positive rates between 27% and 88% with precision
varying between 50% and 72%. This level of performance allows us to recover a
large fraction of jobs' executions (by redirecting them to other nodes when a
failure of the present node is predicted) that would otherwise have been wasted
due to failures. [...]
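The thresholding logic (capping the false positive rate at 5% for an averaged ensemble of Random Forests) can be sketched as follows; the features and labels are synthetic stand-ins for the Google trace data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_curve

# Toy stand-in for node-state features mined from data center logs
# (e.g. load averages, memory pressure, past event counts) with a
# rare "fails within 24h" label.
rng = np.random.default_rng(1)
n = 4000
X = rng.normal(size=(n, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1.0, n) > 2.2).astype(int)

split = n // 2
clfs = [RandomForestClassifier(n_estimators=50, random_state=s)
        .fit(X[:split], y[:split]) for s in range(5)]
# Ensemble of forests: average the per-forest failure probabilities.
proba = np.mean([c.predict_proba(X[split:])[:, 1] for c in clfs], axis=0)

# Pick the decision threshold that caps the false positive rate at 5%.
fpr, tpr, thr = roc_curve(y[split:], proba)
i = np.searchsorted(fpr, 0.05, side="right") - 1
print(f"FPR={fpr[i]:.3f}, TPR={tpr[i]:.3f}, threshold={thr[i]:.2f}")
```

The true positive rate achieved at the capped false positive rate is then the figure of merit, mirroring the 27%-88% range reported above.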
Tackling Exascale Software Challenges in Molecular Dynamics Simulations with GROMACS
GROMACS is a widely used package for biomolecular simulation, and over the
last two decades it has evolved from small-scale efficiency to advanced
heterogeneous acceleration and multi-level parallelism targeting some of the
largest supercomputers in the world. Here, we describe some of the ways we have
been able to realize this through the use of parallelization on all levels,
combined with a constant focus on absolute performance. Release 4.6 of GROMACS
uses SIMD acceleration on a wide range of architectures, GPU offloading
acceleration, and both OpenMP and MPI parallelism within and between nodes,
respectively. The recent work on acceleration made it necessary to revisit the
fundamental algorithms of molecular simulation, including the concept of
neighbor searching, and we discuss the present and future challenges we see
for exascale simulation, in particular very fine-grained task parallelism. We
also discuss the software management, code peer review, and continuous
integration testing required for a project of this complexity.
Comment: EASC 2014 conference proceedings
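Neighbor searching, one of the algorithms revisited above, classically uses cell lists: bin particles into cells at least as large as the cutoff, then test only pairs in adjacent cells. The sketch below illustrates that textbook idea in Python; it is not GROMACS code, which uses a more elaborate cluster-based pair-search scheme in optimized C/SIMD kernels:

```python
from collections import defaultdict
from itertools import product

import numpy as np

def cell_list_neighbors(pos, box, cutoff):
    """Cell-list neighbor search in a cubic periodic box: O(N) pair
    tests instead of O(N^2). Assumes at least 3 cells per dimension."""
    ncell = max(1, int(box // cutoff))
    size = box / ncell
    cells = defaultdict(list)
    for i, p in enumerate(pos):
        cells[tuple((p // size).astype(int) % ncell)].append(i)

    pairs = set()
    for c, members in cells.items():
        for d in product((-1, 0, 1), repeat=3):   # 27 neighbor cells
            nb = tuple((c[k] + d[k]) % ncell for k in range(3))
            for i in members:
                for j in cells.get(nb, ()):
                    if i < j:
                        r = pos[i] - pos[j]
                        r -= box * np.round(r / box)  # minimum image
                        if (r @ r) < cutoff * cutoff:
                            pairs.add((i, j))
    return pairs

rng = np.random.default_rng(0)
pos = rng.uniform(0, 10.0, size=(200, 3))
pairs = cell_list_neighbors(pos, box=10.0, cutoff=1.2)
print(len(pairs))
```

Cluster-based schemes group particles into small fixed-size clusters instead of individual cells, which maps the inner pair loop onto SIMD lanes and GPU threads far more efficiently.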