Higher-order Clustering and Pooling for Graph Neural Networks
Graph Neural Networks achieve state-of-the-art performance on a plethora of graph classification tasks, largely thanks to pooling operators, which hierarchically aggregate learned node embeddings into a final graph representation. However, pooling operators have recently been called into question by work reporting on-par performance with random pooling, and they completely ignore higher-order connectivity patterns. To tackle this issue, we propose HoscPool, a clustering-based graph pooling operator that captures higher-order information hierarchically, leading to richer graph representations. Specifically, we learn a probabilistic cluster assignment matrix end-to-end by minimising relaxed formulations of motif spectral clustering in our objective function, and we then extend it to a pooling operator. We evaluate HoscPool on graph classification tasks, and its clustering component on graphs with ground-truth community structure, achieving the best performance. Lastly, we provide a deep empirical analysis of pooling operators' inner functioning.
Comment: CIKM 202
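As context (this is not the authors' exact method, and the HoscPool-specific motif-spectral objective is omitted): clustering-based pooling operators of this family coarsen a graph with a learned soft cluster assignment matrix S, pooling node features as SᵀX and the adjacency as SᵀAS. A minimal NumPy sketch of that generic pooling step:

```python
import numpy as np

def soft_cluster_pool(X, A, S):
    """Pool node features X (n x d) and adjacency A (n x n) with a
    soft cluster assignment S (n x k), whose rows are probability
    distributions over k clusters."""
    Xp = S.T @ X       # (k x d) pooled cluster features
    Ap = S.T @ A @ S   # (k x k) coarsened adjacency
    return Xp, Ap

rng = np.random.default_rng(0)
n, d, k = 6, 4, 2
X = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1)
A = A + A.T                                   # symmetric, no self-loops
logits = rng.normal(size=(n, k))
S = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # row softmax

Xp, Ap = soft_cluster_pool(X, A, S)
```

In an actual model, S would be produced by a neural network and trained end-to-end against a clustering objective (in HoscPool's case, relaxed motif spectral clustering) rather than sampled at random.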
Violation of the fluctuation-dissipation theorem in glassy systems: basic notions and the numerical evidence
This review reports on research done over the past years on violations of the fluctuation-dissipation theorem (FDT) in glassy systems. It focuses on the existence of a quasi-fluctuation-dissipation theorem (QFDT) in glassy systems and on the supporting evidence currently available from numerical simulation studies. It covers a broad range of non-stationary aging and stationary driven
systems such as structural-glasses, spin-glasses, coarsening systems,
ferromagnetic models at criticality, trap models, models with entropy barriers,
kinetically constrained models, sheared systems and granular media. The review
is divided into four main parts: 1) An introductory section explaining basic
notions related to the existence of the FDT in equilibrium and its possible
extension to the glassy regime (QFDT), 2) A description of the basic analytical
tools and results derived in the framework of some exactly solvable models, 3)
A detailed report of the current evidence in favour of the QFDT and 4) A brief
digression on the experimental evidence in its favour. This review is intended
for inexpert readers who want to learn about the basic notions and concepts
related to the existence of the QFDT as well as for the more expert readers who
may be interested in more specific results.
Comment: 120 pages, 37 figures. Topical review paper. Several typos and misprints corrected, new references included and others updated. To be published in J. Phys. A (Math. Gen.
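For reference, the equilibrium FDT relates the linear response R(t,t') to the correlation C(t,t'), and the QFDT discussed in the review generalizes it through a fluctuation-dissipation ratio X(t,t') (equivalently, an effective temperature). A standard way of writing both, for t > t', is:

```latex
% Equilibrium fluctuation-dissipation theorem (t > t'):
R(t,t') = \frac{1}{T}\,\frac{\partial C(t,t')}{\partial t'}

% Quasi-FDT in the glassy regime, with fluctuation-dissipation ratio X:
R(t,t') = \frac{X(t,t')}{T}\,\frac{\partial C(t,t')}{\partial t'},
\qquad
T_{\mathrm{eff}}(t,t') = \frac{T}{X(t,t')}
```

Equilibrium corresponds to X = 1; a nontrivial X < 1 in the aging regime is the signature of the glassy violations surveyed here.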
A Sensitivity Study of the Sub-Volume and Resolution on the Prediction of Petrophysical Properties
Evolving granular systems
Advisor: Fernando Antonio Campos Gomide. Doctoral thesis, Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação.
Abstract: In recent years there has been increasing interest in computational modeling approaches to deal with real-world data streams. Methods and algorithms have been proposed to uncover meaningful knowledge from very large (often unbounded) data sets with, in principle, no apparent value. This thesis introduces a framework for evolving granular modeling of uncertain data streams. Evolving granular systems comprise an array of online modeling approaches inspired by the way humans deal with complexity. These systems explore the information flow in dynamic environments and derive from it models that can be linguistically understood. In particular, information granulation is a natural technique to dispense with unnecessary details and emphasize the transparency, interpretability and scalability of information systems. Uncertain (granular) data arise from imprecise perception or description of the value of a variable. Broadly stated, various factors can affect one's choice of data representation such that the representing object conveys the meaning of the concept it is being used to represent.
Of particular concern to this work are numerical, interval, and fuzzy types of granular data, and interval, fuzzy, and neurofuzzy modeling frameworks. Learning in evolving granular systems is based on incremental algorithms that build model structure from scratch on a per-sample basis and adapt model parameters whenever necessary. This learning paradigm is important because it avoids redesigning and retraining models whenever the environment changes. Application examples in classification, function approximation, time-series prediction and control using real and synthetic data illustrate the usefulness of the proposed granular approaches and framework. The behavior of nonstationary data streams with gradual and abrupt regime shifts is also analyzed within the paradigm of evolving granular computing. We shed light on the role of interval, fuzzy, and neurofuzzy computing in processing uncertain data and providing high-quality approximate solutions and rule summaries of input-output data sets. The approaches and framework introduced constitute a natural extension of evolving intelligent systems over numeric data streams to evolving granular systems over granular data streams.
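The incremental, from-scratch learning idea described above can be illustrated with a deliberately minimal one-dimensional sketch (the names `IntervalGranule`, `learn_stream` and the parameter `rho` are hypothetical, not from the thesis): each incoming sample either expands a compatible granule of the same class or spawns a new one, in a single pass and with no retraining.

```python
class IntervalGranule:
    """A hypothetical 1-D granule: an interval [lo, hi] with a class label."""
    def __init__(self, x, label):
        self.lo = self.hi = x
        self.label = label

    def can_absorb(self, x, rho):
        # rho bounds the half-width the granule may reach after expansion
        return max(self.hi, x) - min(self.lo, x) <= 2 * rho

    def expand(self, x):
        self.lo = min(self.lo, x)
        self.hi = max(self.hi, x)

def learn_stream(stream, rho=1.0):
    """One-pass incremental learning: expand a compatible granule of the
    same class, otherwise create a new granule for the sample."""
    granules = []
    for x, label in stream:
        for g in granules:
            if g.label == label and g.can_absorb(x, rho):
                g.expand(x)
                break
        else:
            granules.append(IntervalGranule(x, label))
    return granules

gs = learn_stream([(0.1, 'a'), (0.4, 'a'), (5.0, 'b'), (0.6, 'a')], rho=0.5)
```

The thesis's interval, fuzzy and neurofuzzy models are far richer (multidimensional granules, membership functions, rule bases), but they share this structural pattern of growth and adaptation driven sample by sample.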
Competition and cooperation: aspects of dynamics in sandpiles
In this article, we review some of our approaches to granular dynamics, now
well known to consist of both fast and slow relaxational processes. In the
first case, grains typically compete with each other, while in the second, they
cooperate. A typical result of {\it cooperation} is the formation of stable
bridges, signatures of spatiotemporal inhomogeneities; we review their
geometrical characteristics and compare theoretical results with those of
independent simulations. {\it Cooperative} excitations due to local density
fluctuations are also responsible for relaxation at the angle of repose; the
{\it competition} between these fluctuations and external driving forces can,
on the other hand, result in a (rare) collapse of the sandpile to the
horizontal. Both these features are present in a theory reviewed here. An arena
where the effects of cooperation versus competition are felt most keenly is
granular compaction; we review here a random graph model, where three-spin
interactions are used to model compaction under tapping. The compaction curve
shows distinct regions where 'fast' and 'slow' dynamics apply, separated by
what we have called the {\it single-particle relaxation threshold}. In the
final section of this paper, we explore the effect of shape -- jagged vs.
regular -- on the compaction of packings near their jamming limit. One of our
major results is an entropic landscape that, while microscopically rough,
manifests {\it Edwards' flatness} at a macroscopic level. Another major result
is that of surface intermittency under low-intensity shaking.
Comment: 36 pages, 23 figures, minor correction
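The random-graph three-spin model itself is not reproduced here, but the 'slow' branch of compaction curves of this kind is often described by the classic inverse-logarithmic relaxation law of Knight et al.; a small sketch evaluating that empirical form (parameter values are illustrative, not fitted):

```python
import math

def density(t, rho_inf=0.64, d_rho=0.09, B=1.0, tau=10.0):
    """Inverse-logarithmic relaxation of packing density under tapping
    (empirical Knight et al. form; all parameter values illustrative)."""
    return rho_inf - d_rho / (1.0 + B * math.log(1.0 + t / tau))

# density grows monotonically with tap number and saturates at rho_inf
rhos = [density(t) for t in (0, 10, 100, 1000, 10000)]
```

The slow logarithmic approach to the asymptotic density is the regime below the single-particle relaxation threshold discussed in the review, where cooperative rearrangements dominate.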
Modeling, Characterizing and Reconstructing Mesoscale Microstructural Evolution in Particulate Processing and Solid-State Sintering
Abstract: In materials science, microstructure plays a key role in determining properties, which in turn determine the utility of the material. However, effectively measuring microstructure evolution in real time remains a challenge. To date, a wide range of advanced experimental techniques have been developed and applied to characterize material microstructure and structural evolution on different length and time scales. Most of these methods can only resolve 2D structural features within a narrow range of length scales, and only for a single snapshot or a series of snapshots. The currently available 3D microstructure characterization techniques are usually destructive and require slicing and polishing the sample each time a picture is taken. Simulation methods, on the other hand, are cheap, sample-free and versatile, without the need to contend with physical limitations such as extreme temperature or pressure, which are prominent issues for experimental methods. Yet the majority of simulation methods are limited to specific circumstances: for example, first-principles computation can only handle several thousand atoms, molecular dynamics can only efficiently simulate a few seconds of evolution of a system with several million particles, and the finite element method can only be used in continuous media. Such limitations make these individual methods far from sufficient for simulating, with experimental-level accuracy, the macroscopic processes that a material sample undergoes. Therefore, it is highly desirable to develop a framework that integrates different simulation schemes across scales to model complicated microstructure evolution and the corresponding properties. Guided by this objective, we have worked towards incorporating a collection of simulation methods, including the finite element method (FEM), cellular automata (CA), kinetic Monte Carlo (kMC), stochastic reconstruction methods and the Discrete Element Method (DEM), into an integrated computational material engineering platform (ICMEP), which enables us to effectively model microstructure evolution and use the simulated microstructure for subsequent performance analysis. In this thesis, we introduce some cases of building coupled modeling schemes and present preliminary results in solid-state sintering. For example, we use a coupled DEM and kinetic Monte Carlo method to simulate solid-state sintering, and a coupled FEM and cellular automata method to model microstructure evolution during selective laser sintering of a titanium alloy. Current results indicate that joining models from different length and time scales is fruitful for understanding and describing the microstructure evolution of a macroscopic physical process from various perspectives.
Doctoral Dissertation, Materials Science and Engineering, 201
Non-local energetics of random heterogeneous lattices
In this paper, we study the mechanics of statistically non-uniform two-phase
elastic discrete structures. In particular, following the methodology proposed
in (Luciano and Willis, Journal of the Mechanics and Physics of Solids 53,
1505-1522, 2005), energetic bounds and estimates of the Hashin-Shtrikman-Willis
type are developed for discrete systems with a heterogeneity distribution
quantified by second-order spatial statistics. As illustrated by three
numerical case studies, the resulting expressions for the ensemble average of
the potential energy are fully explicit, computationally feasible and free of
adjustable parameters. Moreover, the comparison with reference Monte-Carlo
simulations confirms a notable improvement in accuracy with respect to
approaches based solely on the first-order statistics.
Comment: 32 pages, 8 figures
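The second-order spatial statistics mentioned above can be made concrete with the two-point probability function S2(r): the probability that two points separated by r both lie in a given phase. A minimal estimator on a 1-D periodic binary field (the function name and the example field are illustrative, not from the paper):

```python
import numpy as np

def two_point(phase, r):
    """S2(r): probability that two sites a distance r apart (periodic
    boundary) both belong to phase 1, estimated on a 1-D binary field."""
    phase = np.asarray(phase, dtype=float)
    return float(np.mean(phase * np.roll(phase, r)))

field = np.array([1, 1, 0, 0, 1, 0, 1, 1])
s0 = two_point(field, 0)   # S2(0) equals the volume fraction of phase 1
s1 = two_point(field, 1)
```

Hashin-Shtrikman-Willis-type bounds take exactly this kind of second-order information about the two-phase microstructure as input, which is why the resulting energy estimates are free of adjustable parameters.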
Statistical Physics of Vehicular Traffic and Some Related Systems
In the so-called "microscopic" models of vehicular traffic, attention is paid
explicitly to each individual vehicle, each of which is represented by a
"particle"; the nature of the "interactions" among these particles is
determined by the way the vehicles influence each other's movement. Therefore,
vehicular traffic, modeled as a system of interacting "particles" driven far
from equilibrium, offers the possibility to study various fundamental aspects
of truly nonequilibrium systems which are of current interest in statistical
physics. Analytical as well as numerical techniques of statistical physics are
being used to study these models to understand the rich variety of physical
phenomena exhibited by vehicular traffic. Some of these phenomena, observed in
vehicular traffic under different circumstances, include transitions from one
dynamical phase to another, criticality and self-organized criticality,
metastability and hysteresis, phase-segregation, etc. In this critical review,
written from the perspective of statistical physics, we explain the guiding
principles behind all the main theoretical approaches, but present detailed discussions mainly on the results obtained from the so-called "particle-hopping" models, particularly emphasizing those formulated in recent years using the language of cellular automata.
Comment: 170 pages, Latex, figures included
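The best-known particle-hopping cellular automaton of this kind is the Nagel-Schreckenberg model, whose update rule (accelerate, brake to the headway, randomize, move) can be sketched compactly; this is a generic illustration of the model class, not code from the review:

```python
import numpy as np

def nasch_step(pos, vel, L, vmax=5, p=0.3, rng=None):
    """One parallel update of the Nagel-Schreckenberg cellular automaton
    on a ring of L cells: accelerate, brake to the headway, randomly
    slow down with probability p, then move."""
    if rng is None:
        rng = np.random.default_rng()
    order = np.argsort(pos)
    pos, vel = pos[order], vel[order]
    gap = (np.roll(pos, -1) - pos - 1) % L               # empty cells ahead
    vel = np.minimum(vel + 1, vmax)                      # 1) acceleration
    vel = np.minimum(vel, gap)                           # 2) braking
    slow = (rng.random(len(vel)) < p) & (vel > 0)
    vel = np.where(slow, vel - 1, vel)                   # 3) randomization
    pos = (pos + vel) % L                                # 4) movement
    return pos, vel

rng = np.random.default_rng(1)
L, N = 50, 10
pos = np.sort(rng.choice(L, size=N, replace=False))      # distinct cells
vel = np.zeros(N, dtype=int)
for _ in range(100):
    pos, vel = nasch_step(pos, vel, L, rng=rng)
```

Despite these four simple rules, the model already reproduces spontaneous jam formation, which is one reason particle-hopping CA models feature so prominently in the statistical-physics literature on traffic.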