2,834 research outputs found
A Parameterized Centrality Metric for Network Analysis
A variety of metrics have been proposed to measure the relative importance of
nodes in a network. One of these, alpha-centrality [Bonacich, 2001], measures
the number of attenuated paths that exist between nodes. We introduce a
normalized version of this metric and use it to study network structure, in
particular to rank nodes and find the community structure of the network.
Specifically, we extend the modularity-maximization method [Newman and Girvan,
2004] for community detection to use this metric as the measure of node
connectivity. Normalized alpha-centrality is a powerful tool for network
analysis, since it contains a tunable parameter that sets the length scale of
interactions. Studying how rankings and discovered communities change as this
parameter is varied allows us to identify locally and globally important
nodes and structures. We apply the proposed method to several benchmark
networks and show that it leads to better insight into network structure than
alternative methods. Comment: 11 pages, submitted to Physical Review
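As a concrete illustration of the attenuated-path idea above, here is a minimal sketch of a normalized alpha-centrality in NumPy. It uses the standard closed form sum_{k>=1} (alpha*A)^k = (I - alpha*A)^{-1} - I and a simple sum-to-one normalization; the exact normalization adopted in the paper may differ.

```python
import numpy as np

def normalized_alpha_centrality(A, alpha):
    """Attenuated-path centrality: node i's score counts paths starting
    at i, with a path of length k weighted by alpha**k. The series
    converges for alpha < 1 / (largest eigenvalue of A)."""
    n = A.shape[0]
    # sum_{k>=1} (alpha*A)^k = (I - alpha*A)^{-1} - I
    P = np.linalg.inv(np.eye(n) - alpha * A) - np.eye(n)
    c = P.sum(axis=1)
    return c / c.sum()  # normalize so the scores sum to 1

# Toy example: a star graph with hub 0 and leaves 1..3.
A = np.zeros((4, 4))
A[0, 1:] = A[1:, 0] = 1.0
scores = normalized_alpha_centrality(A, alpha=0.2)
```

As expected, the hub accumulates the most attenuated paths and receives the highest score; increasing alpha lengthens the interaction scale and changes how quickly peripheral nodes catch up.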
Intelligent Self-Repairable Web Wrappers
The amount of information available on the Web grows at an incredibly high rate. Systems and procedures devised to extract these data from Web sources already exist, and different approaches and techniques have been investigated in recent years. On the one hand, reliable solutions should provide robust Web data mining algorithms that can automatically cope with possible malfunctioning or failures. On the other hand, the literature lacks solutions for the maintenance of these systems. Procedures that extract Web data may be tightly coupled to the structure of the data source itself; thus, malfunctioning or the acquisition of corrupted data could be caused, for example, by structural modifications to data sources introduced by their owners. Nowadays, verification of data integrity and maintenance are mostly managed manually, in order to ensure that these systems work correctly and reliably. In this paper we propose a novel approach to creating procedures that extract data from Web sources -- the so-called Web wrappers -- which can cope with malfunctioning caused by modifications of the structure of the data source, and can automatically repair themselves.
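The paper's actual repair machinery is not described in the abstract; purely as a toy illustration of the idea (a wrapper that survives a structural change in the source page by falling back to, and promoting, an alternative extraction rule), one might sketch it like this, with all patterns and page contents invented:

```python
import re

def make_wrapper(patterns):
    """Toy self-repairing wrapper: an ordered list of regex extraction
    rules. If the primary rule stops matching (e.g. after a redesign of
    the source page), the first fallback that still works is promoted
    to primary, "repairing" the wrapper for subsequent calls."""
    def extract(html):
        for i, pattern in enumerate(patterns):
            match = re.search(pattern, html, re.S)
            if match:
                if i > 0:  # self-repair: make the working rule primary
                    patterns.insert(0, patterns.pop(i))
                return match.group(1)
        return None  # no rule matched: manual maintenance is needed
    return extract

wrapper = make_wrapper([
    r'<span class="price">(.*?)</span>',  # original page structure
    r'<div id="cost">(.*?)</div>',        # structure after a redesign
])
old_page = '<html><span class="price">42.00</span></html>'
new_page = '<html><div id="cost">42.00</div></html>'
```

A production wrapper would of course use a proper HTML parser and structural similarity rather than regexes; the sketch only shows the fallback-and-promote control flow.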
Towards a Formal Verification Methodology for Collective Robotic Systems
We introduce a UML-based notation for graphically modeling
systems' security aspects in a simple and intuitive
way, and a model-driven process that transforms graphical
specifications of access control policies into XACML. These
XACML policies are then translated into FACPL, a policy
language with a formal semantics, and the resulting policies
are evaluated by means of a Java-based software tool.
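FACPL's formal semantics is beyond the scope of this listing, but the flavor of XACML-style policy evaluation can be conveyed by its standard permit-overrides rule-combining algorithm: any Permit wins, otherwise any Deny wins, otherwise the policy is not applicable. A minimal sketch (ignoring XACML's Indeterminate handling):

```python
def permit_overrides(decisions):
    """Simplified XACML permit-overrides combining algorithm:
    a single Permit wins; failing that, a single Deny wins;
    with no applicable rule at all, the result is NotApplicable."""
    if "Permit" in decisions:
        return "Permit"
    if "Deny" in decisions:
        return "Deny"
    return "NotApplicable"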
Low prevalence, quasi-stationarity and power-law distribution in a model of spreading
Understanding how contagions (information, infections, etc.) spread on
complex networks is important from both a practical and a theoretical point
of view. Considerable work has been done in this regard over the past decade
or so. However, most models are limited in their scope and as a result only
capture general features of spreading phenomena. Here, we propose and study a
model of spreading which takes into account the strength or quality of
contagions as well as the local (probabilistic) dynamics occurring at various
nodes. Transmission occurs only after the quality-based fitness of the
contagion has been evaluated by the local agent. The model exhibits
quality-dependent exponential time scales at early times leading to a slowly
evolving quasi-stationary state. Low prevalence is seen for a wide range of
contagion quality for arbitrarily large networks. We also investigate the
activity of nodes and find a power-law distribution with a robust exponent
independent of network topology. Our results are consistent with recent
empirical observations. Comment: 7 pages, 8 figures. (Submitted
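The precise local dynamics of the model are not given in the abstract; as a rough sketch of quality-gated transmission (a neighbor adopts the contagion only if its local random fitness evaluation accepts, here with acceptance probability equal to the contagion quality q), one could simulate:

```python
import random

def quality_gated_spread(adj, seed, quality, steps, rng=None):
    """Each step, every adopter offers the contagion to its neighbors;
    a neighbor adopts only if its local (random) fitness evaluation
    accepts, which happens with probability `quality`."""
    rng = rng or random.Random(0)
    infected = {seed}
    for _ in range(steps):
        adopters = set()
        for u in infected:
            for v in adj[u]:
                if v not in infected and rng.random() < quality:
                    adopters.add(v)
        infected |= adopters
    return infected

# Toy network: a 5-node ring.
ring = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
```

Low-quality contagions stall near the seed while high-quality ones saturate the network, which is the qualitative behavior the abstract describes; the quasi-stationary regime and activity distributions require the paper's full model.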
On Non-Abelian Symplectic Cutting
We discuss symplectic cutting for Hamiltonian actions of non-Abelian compact
groups. By using a degeneration based on the Vinberg monoid we give, in good
cases, a global quotient description of a surgery construction introduced by
Woodward and Meinrenken, and show it can be interpreted in algebro-geometric
terms. A key ingredient is the `universal cut' of the cotangent bundle of the
group itself, which is identified with a moduli space of framed bundles on
chains of projective lines recently introduced by the authors.Comment: Various edits made, to appear in Transformation Groups. 28 pages, 8
figure
Latent Space Model for Multi-Modal Social Data
With the emergence of social networking services, researchers enjoy the
increasing availability of large-scale heterogeneous datasets capturing online
user interactions and behaviors. Traditional analysis of techno-social systems
data has focused mainly on describing either the dynamics of social
interactions, or the attributes and behaviors of the users. However,
overwhelming empirical evidence suggests that the two dimensions affect one
another, and therefore they should be jointly modeled and analyzed in a
multi-modal framework. The benefits of such an approach include the ability to
build better predictive models, leveraging social network information as well
as user behavioral signals. To this end, here we propose the Constrained
Latent Space Model (CLSM), a generalized framework that combines Mixed
Membership Stochastic Blockmodels (MMSB) and Latent Dirichlet Allocation (LDA)
incorporating a constraint that forces the latent space to concurrently
describe the multiple data modalities. We derive an efficient inference
algorithm based on Variational Expectation Maximization that has a
computational cost linear in the size of the network, thus making it feasible
to analyze massive social datasets. We validate the proposed framework on two
problems: prediction of social interactions from user attributes and behaviors,
and behavior prediction exploiting network information. We perform experiments
with a variety of multi-modal social systems, spanning location-based social
networks (Gowalla), social media services (Instagram, Orkut), e-commerce and
review sites (Amazon, Ciao), and finally citation networks (Cora). The results
indicate significant improvement in prediction accuracy over state-of-the-art
methods, and demonstrate the flexibility of the proposed approach for
addressing a variety of different learning problems commonly occurring with
multi-modal social data. Comment: 12 pages, 7 figures, 2 tables
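The variational EM inference itself is beyond an abstract, but the constraint at the heart of CLSM (a single latent membership vector per node driving both MMSB-style links and LDA-style content) can be illustrated with a toy generative sketch; all sizes and priors below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_shared_latent_space(n_nodes=10, n_topics=3, vocab=20, doc_len=15):
    """Toy generative sketch: one membership vector theta[i] per node
    generates BOTH its links (MMSB-style Bernoulli on theta_i B theta_j)
    and its content (LDA-style draws from a topic-word mixture)."""
    theta = rng.dirichlet(np.ones(n_topics), size=n_nodes)  # shared latent space
    block = rng.beta(1.0, 5.0, size=(n_topics, n_topics))   # block link rates
    topic_word = rng.dirichlet(np.ones(vocab), size=n_topics)
    link_prob = theta @ block @ theta.T
    links = rng.random((n_nodes, n_nodes)) < link_prob
    docs = [rng.choice(vocab, size=doc_len, p=theta[i] @ topic_word)
            for i in range(n_nodes)]
    return theta, links, docs

theta, links, docs = sample_shared_latent_space()
```

Because theta appears in both likelihood terms, inference over it is informed jointly by network structure and behavioral signals, which is the coupling the abstract argues for.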
High-Dimensional Menger-Type Curvatures-Part II: d-Separation and a Menagerie of Curvatures
This is the second of two papers wherein we estimate multiscale least squares
approximations of certain measures by Menger-type curvatures. More
specifically, we study an arbitrary d-regular measure on a real separable
Hilbert space. The main result of the paper bounds the least squares error of
approximation at any ball by an average of the discrete Menger-type curvature
over certain simplices in the ball. A consequent result bounds the
Jones-type flatness by an integral of the discrete curvature over all
simplices. The preceding paper provided the opposite inequalities. Furthermore,
we demonstrate some other discrete curvatures for characterizing uniform
rectifiability and additional continuous curvatures for characterizing special
instances of the (p, q)-geometric property. We also show that a curvature
suggested by Leger (Annals of Math, 149(3), p. 831-869, 1999) does not fit
within our framework. Comment: 32 pages, no figures
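For orientation, the classical three-point Menger curvature that the paper's d-dimensional curvatures generalize is the reciprocal circumradius of the triangle through three points:

```latex
c(x_1, x_2, x_3) \;=\; \frac{1}{R(x_1, x_2, x_3)}
\;=\; \frac{4 \operatorname{area}\!\left(\Delta(x_1, x_2, x_3)\right)}
           {\lvert x_1 - x_2 \rvert \, \lvert x_2 - x_3 \rvert \, \lvert x_1 - x_3 \rvert},
```

where R is the circumradius of the triangle. The higher-dimensional variants studied here replace the triangle by a (d+1)-simplex and the circumradius by analogous simplex quantities; the precise definitions are those of the paper and its predecessor.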
Long-term outcomes of fractional flow reserve-guided vs. angiography-guided percutaneous coronary intervention in contemporary practice
Aims Fractional flow reserve (FFR) is the reference standard for the assessment of the functional significance of coronary artery stenoses, but is underutilized in daily clinical practice. We aimed to study long-term outcomes of FFR-guided percutaneous coronary intervention (PCI) in general clinical practice. Methods and results In this retrospective study, consecutive patients (n = 7358), referred for PCI at the Mayo Clinic between October 2002 and December 2009, were divided into two groups: those undergoing PCI without (PCI-only, n = 6268) or with FFR measurements (FFR-guided, n = 1090). The latter group was further classified as the FFR-Perform group (n = 369) if followed by PCI, and the FFR-Defer group (n = 721) if PCI was deferred. Clinical events were compared during a median follow-up of 50.9 months. The Kaplan-Meier fraction of major adverse cardiac events at 7 years was 57.0% in the PCI-only vs. 50.0% in the FFR-guided group (P = 0.016). Patients with FFR-guided interventions had a non-significantly lower rate of death or myocardial infarction compared with those with angiography-guided interventions [hazard ratio (HR): 0.85, 95% CI: 0.71-1.01, P = 0.06]; the FFR-guided deferred-PCI strategy was independently associated with a reduced rate of myocardial infarction (HR: 0.46, 95% CI: 0.26-0.82, P = 0.008). After excluding patients with an FFR of 0.75-0.80 in whom PCI was deferred, the use of FFR was significantly associated with a reduced rate of death or myocardial infarction (HR: 0.80, 95% CI: 0.66-0.96, P = 0.02). Conclusion In contemporary practice, an FFR-guided treatment strategy is associated with a favourable long-term outcome. The current study supports the use of FFR for decision-making in patients undergoing cardiac catheterization.
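The study's three-way grouping follows directly from its design and can be stated as a trivial classification rule (purely illustrative; the study classified patients retrospectively from procedural records, not with code):

```python
def classify_strategy(had_ffr_measurement, pci_performed):
    """Study grouping: PCI without FFR measurement (PCI-only),
    FFR followed by PCI (FFR-Perform), FFR with PCI deferred
    (FFR-Defer)."""
    if not had_ffr_measurement:
        return "PCI-only"
    return "FFR-Perform" if pci_performed else "FFR-Defer"
```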
A randomised controlled trial of a psychoeducational intervention for women at increased risk of breast cancer
This study aimed to compare the impact of two versions of a psychoeducational written intervention on cancer worry and objective knowledge of breast cancer risk-related topics in women who had been living with an increased risk of familial breast cancer for several years. Participants were randomised to three conditions: scientific and psychosocial information pack (Group 1), scientific information pack only (Group 2) or standard care control (Group 3). They completed postal questionnaires at baseline (n = 163) and 4 weeks (n = 151). As predicted, there was a significant decrease in cancer worry for Group 1, but not Group 2. Objective knowledge significantly improved for both Group 1 and Group 2 as expected, but not Group 3. However, there was an unpredicted decline in cancer worry for Group 3. This study supports the value of a scientific and psychosocial information pack in providing up-to-date information related to familial risk of breast cancer for long-term attendees of a familial breast cancer clinic. Further research is warranted to determine how the information pack could be incorporated into the existing clinical service, thus providing these women with the type of ongoing psychosocial support that many familial breast cancer clinics are currently lacking
Zonotopes and four-dimensional superconformal field theories
The a-maximization technique proposed by Intriligator and Wecht allows us to
determine the exact R-charges and scaling dimensions of the chiral operators of
four-dimensional superconformal field theories. The problem of the existence
and uniqueness of the solution, however, has not been addressed in a general
setting. In this paper, it is shown that the a-function always has a unique
critical point, which is also a global maximum, for a large class of quiver
gauge theories specified by toric diagrams. Our proof is based on the
observation that the a-function is given by the volume of a three-dimensional
polytope called a "zonotope", and the uniqueness essentially follows from the
Brunn-Minkowski inequality for the volumes of convex bodies. We also show a
universal upper bound for the exact R-charges, and the monotonicity of the
a-function, in the sense that the a-function decreases whenever the toric
diagram shrinks. The relationship
between a-maximization and volume-minimization is also discussed. Comment: 29 pages, 15 figures, reference added, typos corrected, version
published in JHEP
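For reference, the central charge maximized in the Intriligator-Wecht procedure (with the trial R-charge R, traces taken over the fermions) and the Brunn-Minkowski inequality invoked for uniqueness are, schematically:

```latex
a(R) \;=\; \frac{3}{32}\left(3 \operatorname{tr} R^{3} - \operatorname{tr} R\right),
\qquad
\operatorname{vol}(K + L)^{1/n} \;\geq\; \operatorname{vol}(K)^{1/n} + \operatorname{vol}(L)^{1/n},
```

where K and L are convex bodies in n dimensions. Equality in Brunn-Minkowski holds only for homothetic bodies, which is the kind of rigidity that underlies a uniqueness argument for the critical point; the precise statement is the one in the paper.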