Adversarial Sets for Regularising Neural Link Predictors
In adversarial training, a set of models learn together by pursuing competing
goals, usually defined on single data instances. However, in relational
learning and other non-i.i.d. domains, goals can also be defined over sets of
instances. For example, a link predictor for the is-a relation needs to be
consistent with the transitivity property: if is-a(x_1, x_2) and is-a(x_2, x_3)
hold, is-a(x_1, x_3) needs to hold as well. Here we use such assumptions for
deriving an inconsistency loss, measuring the degree to which the model
violates the assumptions on an adversarially-generated set of examples. The
training objective is defined as a minimax problem, where an adversary finds
the most offending adversarial examples by maximising the inconsistency loss,
and the model is trained by jointly minimising a supervised loss and the
inconsistency loss on the adversarial examples. This yields the first method
that can use function-free Horn clauses (as in Datalog) to regularise any
neural link predictor, with complexity independent of the domain size. We show
that for several link prediction models, the optimisation problem faced by the
adversary has efficient closed-form solutions. Experiments on link prediction
benchmarks indicate that given suitable prior knowledge, our method can
significantly improve neural link predictors on all relevant metrics.
Comment: Proceedings of the 33rd Conference on Uncertainty in Artificial
Intelligence (UAI), 201
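The clause-to-loss construction can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: the bilinear scorer, embedding dimensionality, and random adversarial search are placeholder assumptions (the paper derives efficient closed-form adversaries for several models). It measures how much a transitivity clause is violated on a candidate set of entity representations and picks the most offending one.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 10

# Hypothetical bilinear (DistMult-style) scorer: score(s, o) = <s, r, o>.
r_isa = rng.normal(size=dim)          # embedding of the is-a relation

def score(s, o):
    return float(np.sum(s * r_isa * o))

def inconsistency(x1, x2, x3):
    # Degree to which transitivity is violated on one candidate triple of
    # entity embeddings: if is-a(x1,x2) and is-a(x2,x3) score highly,
    # is-a(x1,x3) should score at least as highly.
    body = min(score(x1, x2), score(x2, x3))   # Goedel t-norm for the body
    head = score(x1, x3)
    return max(0.0, body - head)               # hinge: positive iff violated

# Adversary: search over candidate embedding sets for the most offending one.
candidates = rng.normal(size=(100, 3, dim))
losses = [inconsistency(c[0], c[1], c[2]) for c in candidates]
worst = candidates[int(np.argmax(losses))]     # adversarial examples to train on
```

The model would then be trained to jointly minimise its supervised loss plus `inconsistency` evaluated on `worst`, shrinking the violation the adversary found.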
A new look at microlensing limits on dark matter in the Galactic halo
The motivation for this paper is to review the limits set on the MACHO
content of the Galactic halo by microlensing experiments in the direction of
the Large Magellanic Cloud. This has been prompted by recent measurements of
the Galactic rotation curve, which suggest that the limits have been biased by
the assumption of an over-massive halo. The paper first discusses the security
of the detection efficiency calculations which are central to deriving the
MACHO content of the Galactic halo. It then sets out to compare the rotation
curves from various halo models with recent observations, with a view to
establishing what limits can be put on an all-MACHO halo. The main thrust of
the paper is to investigate whether lighter halo models which are consistent
with microlensing by an all-MACHO halo are also consistent with recent measures
of the Galactic rotation curve. In this case the population of bodies
discovered by the MACHO collaboration would make up the entire dark matter
content of the Galactic halo. The main result of this paper is that it is easy
to find low mass halo models consistent with the observed Galactic rotation
curve, which also imply an optical depth to microlensing similar to that found
by the MACHO collaboration. This means that all-MACHO halos cannot be ruled out
on the basis of their observations. In conclusion, limits placed on the MACHO
content of the Galactic halo from microlensing surveys in the Magellanic Clouds
are inconsistent and model dependent, and do not provide a secure basis for
rejecting an all-MACHO halo.
Comment: 8 pages, 4 figures, accepted for publication in A&
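The quantity at stake, the optical depth to microlensing, is a one-dimensional integral of the lens density along the line of sight. The sketch below evaluates it for a cored isothermal halo toward the LMC; every parameter value is an illustrative textbook number (local density, core radius, solar Galactocentric distance), not one of the lighter halo models fitted in the paper.

```python
import numpy as np

# Rough microlensing optical depth toward the LMC for a cored isothermal halo:
#   tau = (4*pi*G/c^2) * Integral_0^L rho(D) * D*(L - D)/L dD
# All parameter values are illustrative textbook numbers, not the paper's fits.
G_over_c2 = 4.301e-3 / 299792.458**2         # pc / Msun

rho0 = 0.0079                                # local halo density (Msun / pc^3)
a, r0, L = 5000.0, 8500.0, 50000.0           # core radius, R_sun, LMC distance (pc)
l, b = np.radians(280.5), np.radians(-32.9)  # Galactic coordinates of the LMC

D = np.linspace(0.0, L, 10001)               # distance along the line of sight (pc)
x = r0 - D * np.cos(b) * np.cos(l)           # Galactocentric coordinates of the ray
y = D * np.cos(b) * np.sin(l)
z = D * np.sin(b)
rho = rho0 * (r0**2 + a**2) / (x**2 + y**2 + z**2 + a**2)  # cored isothermal sphere

dD = D[1] - D[0]
tau = 4.0 * np.pi * G_over_c2 * np.sum(rho * D * (L - D) / L) * dD
# For these fiducial values tau lands near 5e-7, the scale reported by the
# MACHO collaboration; a lighter halo lowers rho(D) and hence tau.
```

This is why the halo model matters so much: the limits scale directly with the assumed density run rho(D), so an over-massive halo inflates the predicted tau and makes the observed event rate look like a tighter constraint than it is.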
Mass conserved elementary kinetics is sufficient for the existence of a non-equilibrium steady state concentration
Living systems are forced away from thermodynamic equilibrium by exchange of
mass and energy with their environment. In order to model a biochemical
reaction network in a non-equilibrium state one requires a mathematical
formulation to mimic this forcing. We provide a general formulation to force an
arbitrarily large kinetic model in a manner that is still consistent with the
existence of a non-equilibrium steady state. We can guarantee the existence of
a non-equilibrium steady state assuming only two conditions: that every
reaction is mass balanced and that continuous kinetic reaction rate laws never
lead to a negative molecule concentration. These conditions can be verified in
polynomial time and are flexible enough to permit one to force a system away
from equilibrium. In an expository biochemical example we show how a
reversible, mass balanced perpetual reaction, with thermodynamically infeasible
kinetic parameters, can be used to perpetually force a kinetic model of
anaerobic glycolysis in a manner consistent with the existence of a steady
state. Easily testable existence conditions are foundational for efforts to
reliably compute non-equilibrium steady states in genome-scale biochemical
kinetic models.
Comment: 11 pages, 2 figures (v2 is now placed in proper context of the
excellent 1962 paper by James Wei entitled "Axiomatic treatment of chemical
reaction systems". In addition, section 4, on "Utility of steady state
existence theorem", has been expanded.)
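The first condition, that every reaction is mass balanced, reduces to a linear-algebra check on the stoichiometric matrix, which is one way to see why it is verifiable in polynomial time. A minimal sketch with a toy network and illustrative molecular masses (not taken from the paper's glycolysis example):

```python
import numpy as np

# Stoichiometric matrix S (rows: species, columns: reactions) for a toy network:
#   R1: A + B -> C      R2: C -> A + B  (the reverse, written as its own column)
species = ["A", "B", "C"]
S = np.array([
    [-1.0,  1.0],   # A
    [-1.0,  1.0],   # B
    [ 1.0, -1.0],   # C
])

# Every reaction is mass balanced iff the molecular masses m satisfy S^T m = 0:
# mass consumed on the left of each reaction equals mass produced on the right.
m = np.array([18.0, 44.0, 62.0])   # illustrative masses with m_C = m_A + m_B

def is_mass_balanced(S, m):
    # One matrix-vector product: time polynomial in the size of the network.
    return np.allclose(S.T @ m, 0.0)
```

A reaction that creates mass out of nothing, say a net `A -> 2A` column, fails the check for any positive mass vector, which is exactly the kind of thermodynamically impossible model the condition excludes.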
The Nature of the Warm/Hot Intergalactic Medium I. Numerical Methods, Convergence, and OVI Absorption
We perform a series of cosmological simulations using Enzo, an Eulerian
adaptive-mesh refinement, N-body + hydrodynamical code, applied to study the
warm/hot intergalactic medium. The WHIM may be an important component of the
baryons missing observationally at low redshift. We investigate the dependence
of the global star formation rate and mass fraction in various baryonic phases
on spatial resolution and methods of incorporating stellar feedback. Although
both resolution and feedback significantly affect the total mass in the WHIM,
all of our simulations find that the WHIM fraction peaks at z ~ 0.5, declining
to 35-40% at z = 0. We construct samples of synthetic OVI absorption lines from
our highest-resolution simulations, using several models of oxygen ionization
balance. Models that include both collisional ionization and photoionization
provide excellent fits to the observed number density of absorbers per unit
redshift over the full range of column densities (10^13 cm^-2 <= N_OVI <= 10^15
cm^-2). Models that include only collisional ionization provide better fits for
high column density absorbers (N_OVI > 10^14 cm^-2). The distribution of OVI in
density and temperature exhibits two populations: one at T ~ 10^5.5 K
(collisionally ionized, 55% of total OVI) and one at T ~ 10^4.5 K
(photoionized, 37%) with the remainder located in dense gas near galaxies.
While not a perfect tracer of hot gas, OVI provides an important tool for a
WHIM baryon census.
Comment: 22 pages, 21 figures, emulateapj, accepted for publication in Ap
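As a rough illustration of how simulated column densities map onto observable absorbers, the linear (unsaturated) regime of the curve of growth ties N_OVI to an equivalent width. This is the standard first-order relation N = 1.13e20 W / (f lambda^2), with W and lambda in Angstroms, not the synthetic-spectrum machinery used in the paper.

```python
# Linear-regime curve of growth for the OVI 1031.93 A line: a common
# first-order check when relating simulated column densities to absorbers.
f_1032 = 0.133          # oscillator strength of the OVI 1031.93 A transition
lam = 1031.93           # rest wavelength in Angstroms

def equivalent_width(N_ovi):
    """Equivalent width (Angstroms) of an unsaturated OVI 1032 absorber,
    given a column density N_ovi in cm^-2."""
    return N_ovi * f_1032 * lam**2 / 1.13e20

# A column of 1e14 cm^-2 (mid-range in the sample above) gives W ~ 0.13 A;
# above roughly 10^14.5 cm^-2 the line saturates and this estimate breaks down.
```

The linearity also shows why the absorber counts per unit redshift are usually binned directly in N_OVI: in this regime doubling the column simply doubles the equivalent width.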
Hepatitis C, mental health and equity of access to antiviral therapy: a systematic narrative review
Introduction. Access to hepatitis C (hereafter HCV) antiviral therapy has commonly excluded populations with mental health and substance use disorders because they were considered as having contraindications to treatment, particularly due to the neuropsychiatric effects of interferon that can occur in some patients. In this review we examined access to HCV interferon antiviral therapy by populations with mental health and substance use problems to identify the evidence and reasons for exclusion.
Methods. We searched the following major electronic databases for relevant articles: PsycINFO, Medline, CINAHL, Scopus, Google Scholar. The inclusion criteria comprised studies of adults aged 18 years and older, peer-reviewed articles, a date range of 2002-2012 (to include articles since the introduction of pegylated interferon with ribavirin), and English language. The exclusion criteria removed articles about HCV populations with medical co-morbidities, such as hepatitis B (hereafter HBV) and human immunodeficiency virus (hereafter HIV), because the clinical treatment, pathways and psychosocial morbidity differ from populations with only HCV. We identified 182 articles, and of these 13 met the eligibility criteria. Using a systematic narrative review approach we identified major themes in the literature.
Results: Three main themes were identified: (1) pre-treatment and preparation for antiviral therapy, (2) adherence and treatment completion, and (3) clinical outcomes. Each of these themes was critically discussed in terms of access by patients with mental health and substance use co-morbidities; the current research evidence clearly demonstrates that people with HCV, mental health and substance use co-morbidities have similar clinical outcomes to those without these co-morbidities.
Conclusions: While research evidence is largely supportive of increased access to interferon by people with HCV, mental health and substance use co-morbidities, substantial further work is required to translate that evidence into clinical practice. Further, we conclude that the appropriateness of the tertiary health service model of care for interferon management should be reconsidered, and the potential for increased HCV care in primary health care settings explored.
Resource optimization-based software risk reduction model for large-scale application development
Software risks are a common phenomenon in the software development lifecycle, and risks grow into larger problems if they are not dealt with on time. Software risk management is a strategy that focuses on the identification, management, and mitigation of risk factors in the software development lifecycle; the management itself depends on the nature, size, and skill of the project under consideration. This paper proposes a model that identifies and deals with risk factors by introducing different observatory and participatory project factors. It is assumed that most of the risk factors can be dealt with through effective business processing, which in turn handles the orientation of risks and the elimination or reduction of those risk factors that emerge over time. The model proposes different combinations of resource allocation that can help conclude a software project with an extended degree of acceptability. The resulting Risk Reduction Model effectively handles application development risks and can synchronize its working with medium to large-scale software projects, so that software failures, and their negative effect on the software development environment, are reduced. © 2021 by the authors. Licensee MDPI, Basel, Switzerland.
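As a sketch of what exposure-driven resource allocation can look like, the snippet below scores risks by the usual exposure = probability x impact rule and splits a mitigation budget proportionally. The risk names, scores, and proportional rule are illustrative assumptions, not the paper's actual observatory and participatory factors.

```python
# Minimal sketch of risk-exposure-based resource allocation. The factor names
# and numbers are hypothetical placeholders, not values from the paper.
risks = [
    {"name": "scope creep",         "probability": 0.6, "impact": 8},
    {"name": "staff turnover",      "probability": 0.3, "impact": 9},
    {"name": "integration failure", "probability": 0.5, "impact": 6},
]

budget_units = 10  # mitigation resources available to distribute

for r in risks:
    # Classic risk exposure: likelihood of the risk times its cost if realised.
    r["exposure"] = r["probability"] * r["impact"]

total = sum(r["exposure"] for r in risks)
for r in risks:
    # Allocate mitigation resources in proportion to each risk's exposure.
    r["allocation"] = round(budget_units * r["exposure"] / total, 2)
```

The point of the proportional rule is simply that the highest-exposure risk ("scope creep" here) absorbs the largest share of the budget; a real model would also re-score exposures as mitigations take effect over time.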
Systemization of Pluggable Transports for Censorship Resistance
An increasing number of countries implement Internet censorship at different
scales and for a variety of reasons. In particular, the link between the
censored client and entry point to the uncensored network is a frequent target
of censorship due to the ease with which a nation-state censor can control it.
A number of censorship resistance systems have been developed thus far to help
circumvent blocking on this link, which we refer to as link circumvention
systems (LCs). The variety and profusion of attack vectors available to a
censor have led to an arms race and a rapid pace of evolution in LCs. Despite
their inherent complexity and the breadth of work in this area,
there is no systematic way to evaluate link circumvention systems and compare
them against each other. In this paper, we (i) sketch an attack model to
comprehensively explore a censor's capabilities, (ii) present an abstract model
of an LC, a system that helps a censored client communicate with a server over
the Internet while resisting censorship, (iii) describe an evaluation stack
that underscores a layered approach to evaluate LCs, and (iv) systemize and
evaluate existing censorship resistance systems that provide link
circumvention. We highlight open challenges in the evaluation and development
of LCs and discuss possible mitigations.
Comment: Content from this paper was published in Proceedings on Privacy
Enhancing Technologies (PoPETS), Volume 2016, Issue 4 (July 2016) as "SoK:
Making Sense of Censorship Resistance Systems" by Sheharbano Khattak, Tariq
Elahi, Laurent Simon, Colleen M. Swanson, Steven J. Murdoch and Ian Goldberg
(DOI 10.1515/popets-2016-0028)
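The systemization idea, scoring each LC against an explicit set of censor capabilities, can be caricatured as a coverage matrix. Everything below (attack names, system names, boolean resistances) is a hypothetical placeholder, not the paper's attack model, taxonomy, or evaluation results.

```python
# Toy coverage matrix: which modelled censor attacks does each link
# circumvention system (LC) resist? All names and values are illustrative.
ATTACKS = ["ip_blocking", "dpi_fingerprinting", "active_probing"]

systems = {
    "domain_fronting_lc": {"ip_blocking": True, "dpi_fingerprinting": True,
                           "active_probing": False},
    "obfuscated_proxy_lc": {"ip_blocking": False, "dpi_fingerprinting": True,
                            "active_probing": True},
}

def coverage(resists):
    """Fraction of the modelled censor attacks a system resists."""
    return sum(resists[a] for a in ATTACKS) / len(ATTACKS)

scores = {name: coverage(r) for name, r in systems.items()}
```

A single scalar obviously flattens the layered evaluation stack the paper argues for; the value of making the matrix explicit is that two systems with the same score can still be distinguished by *which* attacks they fail against.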