Thermohaline circulation stability: a box model study - Part II: coupled atmosphere-ocean model
A thorough analysis of the stability of a coupled version of an
inter-hemispheric 3-box model of Thermohaline Circulation (THC) is presented.
This study follows a similarly structured analysis on an uncoupled version of
the same model presented in Part I. We study how the strength of THC changes
when the system undergoes forcings representing global warming conditions. Each
perturbation to the initial equilibrium is characterized by the total radiative
forcing realized, by the rate of increase, and by the North-South asymmetry.
The choice of suitably defined metrics allows us to determine the boundary
dividing the set of radiative forcing scenarios that lead the system to
equilibria characterized by a THC pattern similar to the present one, from
those that drive the system to equilibria where the THC is reversed. We also
consider different choices for the atmospheric transport parameterizations and
for the ratio between the high latitude to tropical radiative forcing. We
generally find that fast forcings are more effective than slow ones at
disrupting the present THC pattern, that forcings stronger in the northern
box are also more effective at destabilizing the system, and that very slow
forcings do not destabilize the system, whatever their asymmetry, unless the
radiative forcings are very asymmetric and the atmospheric transport is a
relatively weak function of the meridional temperature gradient. The changes in
the strength of the THC are primarily forced by changes in the latent heat
transport in the hemisphere, because of its sensitivity to temperature that
arises from the Clausius-Clapeyron relation.
Comment: 34 pages, 10 figures
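The Clausius-Clapeyron sensitivity invoked in the last sentence can be stated compactly; in standard notation (saturation vapour pressure $e_s$, latent heat of vaporization $L_v$, water-vapour gas constant $R_v$):

```latex
\frac{d e_s}{d T} = \frac{L_v\, e_s}{R_v T^2}
\quad\Longrightarrow\quad
e_s(T) \approx e_s(T_0)\,
\exp\!\left[\frac{L_v}{R_v}\left(\frac{1}{T_0}-\frac{1}{T}\right)\right]
```

Near surface temperatures this amounts to roughly a 7% increase in saturation vapour pressure per kelvin, which is why the latent heat (moisture) transport responds so strongly to warming.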
FaaSdom: A Benchmark Suite for Serverless Computing
Serverless computing has become a major trend among cloud providers. With
serverless computing, developers fully delegate the task of managing the
servers, dynamically allocating the required resources, as well as handling
availability and fault-tolerance matters to the cloud provider. In doing so,
developers can solely focus on the application logic of their software, which
is then deployed and completely managed in the cloud. Despite its increasing
popularity, not much is known regarding the actual system performance
achievable on the currently available serverless platforms. Specifically, it is
cumbersome to benchmark such systems in a language- or runtime-independent
manner. Instead, one must resort to a full application deployment in order to
make informed decisions on the most convenient solution along several
dimensions, including performance and economic costs. FaaSdom is a modular
architecture and proof-of-concept implementation of a benchmark suite for
serverless computing platforms. It currently supports the mainstream
serverless cloud providers (i.e., AWS, Azure, Google, IBM), a large set of
benchmark tests and a variety of implementation languages. The suite fully
automates the deployment, execution and clean-up of such tests, providing
insights (including historical) on the performance observed by serverless
applications. FaaSdom also integrates a model to estimate budget costs for
deployments across the supported providers. FaaSdom is open-source and
available at https://github.com/bschitter/benchmark-suite-serverless-computing.
Comment: ACM DEBS'2
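The cost-estimation component mentioned above can be illustrated with a minimal sketch. The pricing constants below are illustrative values patterned on a typical pay-per-use scheme (a flat per-request fee plus a GB-second compute fee), not FaaSdom's actual model or any provider's real tariff:

```python
def estimate_cost(invocations: int, avg_duration_ms: float, memory_mb: int,
                  price_per_request: float = 0.20 / 1_000_000,
                  price_per_gb_second: float = 0.0000166667) -> float:
    """Estimate the bill for a serverless function.

    Billing follows the common pay-per-use pattern: a flat fee per
    invocation plus a fee proportional to memory * execution time.
    The default prices are illustrative placeholders.
    """
    gb_seconds = invocations * (avg_duration_ms / 1000.0) * (memory_mb / 1024.0)
    return invocations * price_per_request + gb_seconds * price_per_gb_second

# 10 million invocations, 120 ms average runtime, 256 MB of memory
print(round(estimate_cost(10_000_000, 120, 256), 2))  # prints 7.0
```

Comparing such estimates across providers (each with its own constants) is what turns raw benchmark numbers into a deployment decision.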
Neural Network Approach to the Simulation of Entangled States with One Bit of Communication
Bell's theorem states that Local Hidden Variables (LHVs) cannot fully explain
the statistics of measurements on some entangled quantum states. It is natural
to ask how much supplementary classical communication would be needed to
simulate them. We study two long-standing open questions in this field with
neural network simulations and other tools. First, we present evidence that all
projective measurements on partially entangled pure two-qubit states require
only one bit of communication. We quantify the statistical distance between the
exact quantum behaviour and the output of the trained network, or of a
semianalytical model inspired by it. Second, while it is known on general
grounds (and obvious) that one bit of communication cannot reproduce all
bipartite quantum correlations, explicit examples have proved evasive. Our
search failed to find one for several bipartite Bell scenarios with up to 5
inputs and 4 outputs, highlighting the power of one bit of communication in
reproducing quantum correlations.
Comment: 11 pages, 7 figures, 4 tables
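The "statistical distance" quantified above is, in comparisons of this kind, typically a total-variation-style distance between the target and model output distributions, averaged over measurement settings. A minimal sketch (the distributions below are made-up placeholders, not the paper's data):

```python
def total_variation(p: dict, q: dict) -> float:
    """Total variation distance between two distributions over the same outcomes."""
    outcomes = set(p) | set(q)
    return 0.5 * sum(abs(p.get(o, 0.0) - q.get(o, 0.0)) for o in outcomes)

def avg_distance(target: dict, model: dict) -> float:
    """Average TV distance over measurement settings.

    `target` and `model` map each setting (x, y) to a distribution
    over outcome pairs (a, b).
    """
    return sum(total_variation(target[s], model[s]) for s in target) / len(target)

# Toy example: one setting, model slightly off the target correlations
target = {(0, 0): {(0, 0): 0.5, (1, 1): 0.5}}
model = {(0, 0): {(0, 0): 0.45, (1, 1): 0.45, (0, 1): 0.1}}
print(round(avg_distance(target, model), 6))  # prints 0.1
```

A trained network that simulates the state well drives this average distance toward zero across all settings simultaneously.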
Blind Inpainting of Large-scale Masks of Thin Structures with Adversarial and Reinforcement Learning
An Omnidirectional Approach to Touch-based Continuous Authentication
This paper focuses on how touch interactions on smartphones can provide a
continuous user authentication service through behaviour captured by a
touchscreen. While efforts are made to advance touch-based behavioural
authentication, researchers often focus on gathering data, tuning classifiers,
and enhancing performance by evaluating touch interactions in a sequence rather
than independently. However, such systems only work when provided with data
representing distinct behavioural traits. The typical approach separates
behaviour into touch directions and creates multiple user profiles. This work
presents an omnidirectional approach which outperforms the traditional method
regardless of touch direction, provided optimal behavioural features and a
balanced training set are used. Thus, we evaluate five behavioural feature sets
using the conventional approach against our direction-agnostic method while
testing several classifiers, including an Extra-Tree and Gradient Boosting
Classifier, which is often overlooked. Results show that, in comparison with
the traditional method, an Extra-Trees classifier combined with the proposed
approach is superior when combining strokes. However, the performance depends
on the applied feature
set. We find that the TouchAlytics feature set outperforms the others under
our approach when three or more strokes are combined. Finally, we highlight
the importance of reporting the mean area under the curve and equal error rate
both for single-stroke performance and, separately, for varying sequences of
strokes.
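The two metrics the abstract urges reporting can be computed directly from genuine and impostor score lists. A minimal pure-Python sketch (the similarity scores below are invented for illustration):

```python
def roc_points(genuine, impostor):
    """Yield (false-accept rate, false-reject rate) for every threshold."""
    thresholds = sorted(set(genuine) | set(impostor))
    for t in thresholds:
        far = sum(s >= t for s in impostor) / len(impostor)  # impostors accepted
        frr = sum(s < t for s in genuine) / len(genuine)     # genuines rejected
        yield far, frr

def equal_error_rate(genuine, impostor):
    """Approximate EER: the operating point where FAR and FRR are closest."""
    far, frr = min(roc_points(genuine, impostor), key=lambda p: abs(p[0] - p[1]))
    return (far + frr) / 2

def auc(genuine, impostor):
    """Area under the ROC curve via the rank statistic:
    the probability that a random genuine score beats a random impostor score."""
    wins = sum((g > i) + 0.5 * (g == i) for g in genuine for i in impostor)
    return wins / (len(genuine) * len(impostor))

# Invented similarity scores: higher = more likely the genuine user
genuine = [0.9, 0.8, 0.75, 0.7, 0.6]
impostor = [0.5, 0.45, 0.4, 0.3, 0.65]
print(equal_error_rate(genuine, impostor))  # prints 0.2
print(auc(genuine, impostor))               # prints 0.96
```

Reporting these per single stroke and per stroke-sequence length separately, as the abstract recommends, avoids conflating the two regimes.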
Factors associated with spontaneous clearance of chronic hepatitis C virus infection
Background & Aims:
Spontaneous clearance of chronic hepatitis C virus (HCV) infection (CHC) is rare. We conducted a retrospective case-control study to identify rates and factors associated with spontaneous clearance of CHC.
Methods:
We defined cases as individuals who spontaneously resolved CHC, and controls as individuals who remained chronically infected. We used data obtained on HCV testing between 1994 and 2013 in the West of Scotland to infer case/control status. Specifically, untreated patients with ⩾2 sequential samples positive for HCV RNA ⩾6 months apart followed by ⩾1 negative test, and those with ⩾2 positive samples ⩾6 months apart with no subsequent negative samples were identified. Control patients were randomly selected from the second group (four controls per case). Case notes were reviewed and patient characteristics obtained.
Results:
25,113 samples were positive for HCV RNA, relating to 10,318 patients. 50 cases of late spontaneous clearance were identified, contributing 241 person-years follow-up. 2,518 untreated, chronically infected controls were identified, contributing 13,766 person-years follow-up, from whom 200 controls were randomly selected. The incidence rate of spontaneous clearance was 0.36/100 person-years follow-up, occurring after a median 50 months’ infection. Spontaneous clearance was positively associated with female gender, younger age at infection, lower HCV RNA load and co-infection with hepatitis B virus. It was negatively associated with current intravenous drug use.
Conclusions:
Spontaneous clearance of CHC occurs infrequently but is associated with identifiable host and viral factors. More frequent HCV RNA monitoring may be appropriate in selected patient groups.
Lay summary:
Clearance of hepatitis C virus infection without treatment occurs rarely once chronic infection has been established. We interrogated a large Scottish patient cohort and found that it was more common in females, patients infected at a younger age or with lower levels of HCV in the blood, and patients co-infected with hepatitis B virus. Patients who injected drugs were less likely to spontaneously clear chronic infection
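The incidence rate reported in the Results can be reproduced from the cohort figures given there (50 clearance events over the combined 241 + 13,766 person-years of follow-up):

```python
cases = 50
person_years = 241 + 13_766  # clearers + chronically infected controls
rate_per_100py = cases / person_years * 100
print(round(rate_per_100py, 2))  # prints 0.36
```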
Maxwell's Demon walks into Wall Street: Stochastic Thermodynamics meets Expected Utility Theory
The interplay between thermodynamics and information theory has a long
history, but its quantitative manifestations are still being explored. We
import tools from expected utility theory from economics into stochastic
thermodynamics. We prove that, in a process obeying Crooks' fluctuation
relations, every R\'enyi divergence of order $\alpha$ between the forward
process and its reverse has the operational meaning of the ``certainty
equivalent'' of dissipated work (or, more generally, of entropy production)
for a player with risk aversion $r = \alpha - 1$. The two known cases
$\alpha \to 1$ and $\alpha \to \infty$ are recovered and receive the new
interpretation of being associated with a risk-neutral and an extreme
risk-averse player respectively. Among the new results, the condition
$\alpha < 1$ describes the behavior of a risk-seeking player willing to bet
on transient violations of the second law. Our
approach further leads to a generalized Jarzynski equality, and generalizes to
a broader class of statistical divergences.
Comment: 5 pages, 1 figure
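A sketch of the identity behind this claim, in standard notation ($\sigma$ the entropy production, $P_F$ and $P_R$ the forward and reverse distributions): Crooks' relation $P_F(\sigma) = e^{\sigma}\, P_R(-\sigma)$ gives, for the R\'enyi divergence of order $\alpha$,

```latex
D_\alpha\!\left(P_F \,\middle\|\, \tilde P_R\right)
= \frac{1}{\alpha-1}\,
  \ln \sum_{\sigma} P_F(\sigma)^{\alpha}\,
      \bigl(P_F(\sigma)\, e^{-\sigma}\bigr)^{1-\alpha}
= \frac{1}{\alpha-1}\,
  \ln \left\langle e^{(\alpha-1)\sigma} \right\rangle_F ,
```

where $\tilde P_R(\sigma) = P_R(-\sigma)$. The right-hand side is the certainty equivalent of $\sigma$ for an exponential-utility player with risk parameter $\alpha - 1$: it reduces to the mean entropy production $\langle\sigma\rangle$ as $\alpha \to 1$ and to the worst-case (maximal) entropy production as $\alpha \to \infty$.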