548 research outputs found
An EF2X Allocation Protocol for Restricted Additive Valuations
We study the problem of fairly allocating a set of indivisible goods to a set of agents. The envy-freeness up to any good (EFX) criterion -- which requires that no agent prefers the bundle of another agent after removal of any single good -- is known to be a remarkable analogue of envy-freeness when the resource is a set of indivisible goods. In this paper, we investigate the EFX notion for restricted additive valuations, that is, every good has some non-negative value, and every agent is interested in only some of the goods. We introduce a natural relaxation of EFX, called EFkX, which requires that no agent envies another agent after removal of any k goods. Our main contribution is an algorithm that finds a complete (i.e., no good is discarded) EF2X allocation for restricted additive valuations. In our algorithm we devise new concepts, namely "configuration" and "envy-elimination", that might be of independent interest. We also use our new tools to find an EFX allocation for restricted additive valuations that discards at most goods. This improves the state of the art for restricted additive valuations by a factor of .
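Under additive valuations, the EFkX condition described above can be verified directly. The sketch below is a checker only, not the paper's allocation algorithm, and all names are illustrative; the key observation it encodes is that the binding case of "removal of any k goods" is removing the k goods the envious agent values least.

```python
# Hedged sketch: checking the EFkX condition for an allocation under
# additive valuations. This is NOT the paper's algorithm; names and
# structure are illustrative only.

def is_efkx(valuations, allocation, k):
    """valuations[i][g]: agent i's non-negative value for good g.
    allocation[i]: list of goods held by agent i.
    EFkX: no agent i envies agent j's bundle after removal of ANY k
    goods from it; under additive valuations the worst case keeps the
    bundle most valuable, i.e. removes the k goods i values least."""
    n = len(allocation)
    for i in range(n):
        vi_own = sum(valuations[i][g] for g in allocation[i])
        for j in range(n):
            if i == j:
                continue
            vals = sorted(valuations[i][g] for g in allocation[j])
            removed = sum(vals[:k])  # k least-valued goods (all, if fewer than k)
            if vi_own < sum(vals) - removed:
                return False
    return True
```

EFX is the case k = 1 and EF2X the case k = 2, so an allocation failing `is_efkx(..., 1)` can still satisfy `is_efkx(..., 2)`.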
Ratio-Balanced Maximum Flows
When a loan is approved for a person or company, the bank is subject to \emph{credit risk}: the risk that the borrower defaults. To mitigate this risk, a bank will require some form of \emph{security}, which will be collected if the borrower defaults. Accounts can be secured by several securities and a security can be used for several accounts. The goal is to fractionally assign the securities to the accounts so as to balance the risk. This situation can be modelled by a bipartite graph. We have a set of securities and a set of accounts. Each security $s$ has a \emph{value} $v_s$ and each account $a$ has an \emph{exposure} $e_a$. If a security $s$ can be used to secure an account $a$, we have an edge from $s$ to $a$. Let $f_{sa}$ be the part of security $s$'s value used to secure account $a$. We are searching for a maximum flow that sends at most $v_s$ units out of node $s$ and at most $e_a$ units into node $a$. Then $e_a - \sum_s f_{sa}$ is the unsecured part of account $a$. We are searching for the maximum flow that minimizes
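The flow model in the abstract can be sketched as a standard source/sink network: a source feeds each security with capacity equal to its value, each account drains to a sink with capacity equal to its exposure, and eligibility edges carry unbounded capacity. The code below is a hedged illustration of this feasibility structure only (a plain Edmonds-Karp max flow); it does not implement the paper's ratio-balancing objective, and all names are illustrative.

```python
# Hedged sketch of the bipartite security/account flow model only;
# the paper's ratio-balancing objective is NOT implemented here.
from collections import deque


def build_network(values, exposures, edges):
    """Source "S" -> security s (capacity v_s), account a -> sink "T"
    (capacity e_a), and unbounded security-account edges."""
    cap = {"S": {}, "T": {}}
    for s, v in values.items():
        cap["S"][s] = v
        cap.setdefault(s, {})
    for a, e in exposures.items():
        cap.setdefault(a, {})
        cap[a]["T"] = e
    for s, a in edges:
        cap[s][a] = float("inf")
    return cap


def max_flow(capacity, source, sink):
    """Edmonds-Karp on an adjacency-dict residual graph."""
    flow = 0
    while True:
        # BFS for a shortest augmenting path
        parent = {source: None}
        q = deque([source])
        while q and sink not in parent:
            u = q.popleft()
            for v, c in capacity[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if sink not in parent:
            return flow
        # recover the path and its bottleneck capacity
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(capacity[u][v] for u, v in path)
        for u, v in path:  # push flow, update residual edges
            capacity[u][v] -= aug
            capacity[v].setdefault(u, 0)
            capacity[v][u] += aug
        flow += aug
```

The value of the maximum flow is the total secured amount; the gap between an account's exposure and its inflow is its unsecured part, which the paper's objective then balances.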
Braided Cyclic Cocycles and Non-Associative Geometry
We use monoidal category methods to study the noncommutative geometry of
nonassociative algebras obtained by a Drinfeld-type cochain twist. These are
the so-called quasialgebras and include the octonions as braided-commutative
but nonassociative coordinate rings, as well as quasialgebra versions
$\mathbb{C}_q(G)$ of the standard q-deformation quantum groups. We introduce the
notion of ribbon algebras in the category, which are algebras equipped with a
suitable generalised automorphism, and obtain the required
generalisation of cyclic cohomology. We show that this \emph{braided cyclic
cohomology} is invariant under a cochain twist. We also extend to our
generalisation the relation between cyclic cohomology and differential calculus
on the ribbon quasialgebra. The paper includes differential calculus and cyclic
cocycles on the octonions as a finite nonassociative geometry, as well as the
algebraic noncommutative torus as an associative example. Comment: 36 pages LaTeX, 9 figures
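For orientation, the classical (unbraided) notion the paper generalises can be recalled in standard form; the braided version modifies it by the ribbon automorphism and the braiding of the monoidal category, which the formulas below do not capture.

```latex
% Classical (unbraided) cyclic cohomology, stated for orientation only.
% A cyclic n-cocycle on an associative algebra A is a multilinear
% functional \varphi satisfying cyclicity
\varphi(a_1, \dots, a_n, a_0) = (-1)^n \, \varphi(a_0, a_1, \dots, a_n)
% together with the Hochschild cocycle condition b\varphi = 0, where
(b\varphi)(a_0, \dots, a_{n+1})
  = \sum_{i=0}^{n} (-1)^i \, \varphi(a_0, \dots, a_i a_{i+1}, \dots, a_{n+1})
  + (-1)^{n+1} \, \varphi(a_{n+1} a_0, a_1, \dots, a_n)
```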
Using a finite element model to investigate second metatarsal stress during running
This is the author accepted manuscript. The final version is available from the publisher via the link in this record. Second metatarsal (2MT) stress fracture is a common and burdensome injury amongst runners; however, understanding of the risk factors leading to injury is limited. Finite Element (FE) modelling represents a viable biorealistic alternative to invasive studies and simple beam theory models. This study shows the design and validation of a simple subject-specific FE model of the 2MT incorporating geometrically accurate soft tissue and loading. Results show a good comparison with both recent models and bone staple strain gauge data. Engineering and Physical Sciences Research Council (EPSRC)
Developing a Finite Element Model to Investigate Second Metatarsal Stress During Running
This is the author accepted manuscript. The final version is available from UKACM via the link in this record. Second metatarsal (2MT) stress fracture is a common and burdensome injury amongst runners; however, understanding of the risk factors leading to injury is limited. Finite Element (FE) modelling represents a viable biorealistic alternative to invasive studies and simple beam theory models. This study shows the design and validation of a simple subject-specific FE model of the 2MT incorporating geometrically accurate soft tissue and loading. Results show a good comparison with both recent models and bone staple strain gauge data. Engineering and Physical Sciences Research Council (EPSRC)
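The "simple beam theory models" that both abstracts cite as the baseline the FE model improves on amount to the textbook bending-stress relation. The sketch below is a generic illustration of that baseline with hypothetical geometry, not the study's model or its data.

```python
import math

# Hedged illustration of the "simple beam theory" baseline: peak
# bending stress of an idealised hollow-cylinder bone cross-section.
# All numbers in the usage example are hypothetical.


def bending_stress(moment_nm, outer_d_m, inner_d_m):
    """sigma = M * c / I for a hollow circular section (stress in Pa).
    moment_nm: applied bending moment in N*m.
    outer_d_m / inner_d_m: outer and inner diameters in metres."""
    c = outer_d_m / 2.0  # distance from neutral axis to outer fibre
    i = math.pi * (outer_d_m**4 - inner_d_m**4) / 64.0  # second moment of area
    return moment_nm * c / i
```

For a solid section (inner diameter zero) this reduces to the familiar sigma = 32 M / (pi d^3); the FE approach replaces this idealised geometry with subject-specific bone and soft-tissue shapes.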
Patient-Led Research Collaborative: embedding patients in the Long COVID narrative
A large subset of patients with coronavirus disease 2019 (COVID-19) are experiencing symptoms well beyond the claimed 2-week recovery period for mild cases. These long-term sequelae have come to be known as Long COVID. Originating out of a dedicated online support group, a team of patients formed the Patient-Led Research Collaborative and conducted the first research on the Long COVID experience and symptoms. This article discusses the history and value of patient-centric and patient-led research; the formation of the Patient-Led Research Collaborative as well as its key findings to date; and calls for the following: acknowledgement of Long COVID as an illness; an accurate estimate of the prevalence of Long COVID; publicly available basic symptom management; care and research that are not limited to those with positive polymerase chain reaction and antibody tests; and aggressive research and investigation into the pathophysiology of symptoms.
The effect of homozygous deletion of the BBOX1 and Fibin genes on carnitine level and acyl carnitine profile.
BACKGROUND: Carnitine is a key molecule in energy metabolism that helps transport activated fatty acids into the mitochondria. Its homeostasis is achieved through oral intake, renal reabsorption and de novo biosynthesis. Unlike dietary intake and renal reabsorption, the importance of de novo biosynthesis pathway in carnitine homeostasis remains unclear, due to lack of animal models and description of a single patient defective in this pathway.
CASE PRESENTATION: We identified by array comparative genomic hybridization a 42-month-old girl homozygous for a 221 kb interstitial deletion at 11p14.2 that overlaps the genes encoding Fibin and gamma-butyrobetaine, 2-oxoglutarate dioxygenase 1 (BBOX1), an enzyme essential for the de novo biosynthesis of carnitine. She presented with microcephaly, speech delay, growth retardation and minor facial anomalies. The levels of almost all evaluated metabolites were normal. Her serum level of free carnitine was at the lower limit of the reference range, while her acylcarnitine to free carnitine ratio was normal.
CONCLUSIONS: We present an individual with a completely defective de novo carnitine biosynthesis. This condition results in a mildly decreased free carnitine level, but not in the clinical manifestations characteristic of carnitine deficiency disorders, suggesting that dietary carnitine intake and renal reabsorption are sufficient to maintain carnitine homeostasis. Our results also demonstrate that haploinsufficiency of BBOX1 and/or Fibin is not associated with Primrose syndrome, as previously suggested.
Nash Social Welfare for 2-value Instances
We study the problem of allocating a set of indivisible goods among agents with 2-value additive valuations. Our goal is to find an allocation with maximum Nash social welfare, i.e., the geometric mean of the valuations of the agents. We give a polynomial-time algorithm to find a Nash social welfare maximizing allocation when the valuation functions are integrally 2-valued, i.e., each agent has a value either or for each good, for some positive integer. We then extend our algorithm to find a better approximation factor for general 2-value instances.
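The objective itself is easy to state in code. The sketch below is a brute-force search over all allocations of a tiny instance, shown only to make the geometric-mean objective concrete; it is emphatically NOT the paper's polynomial-time algorithm, and scales exponentially in the number of goods.

```python
from itertools import product
import math

# Hedged brute-force sketch of Nash social welfare maximization for a
# tiny instance (NOT the paper's polynomial-time algorithm).


def max_nsw(valuations):
    """valuations[i][g]: agent i's additive value for good g.
    Returns (best NSW, assignment), where assignment[g] is the agent
    receiving good g and NSW is the geometric mean of agents' totals."""
    n, m = len(valuations), len(valuations[0])
    best, best_alloc = -1.0, None
    for assignment in product(range(n), repeat=m):  # good g -> agent
        totals = [0.0] * n
        for g, i in enumerate(assignment):
            totals[i] += valuations[i][g]
        nsw = math.prod(totals) ** (1.0 / n)  # geometric mean
        if nsw > best:
            best, best_alloc = nsw, assignment
    return best, best_alloc
```

Because the objective is a product, any agent left with value zero drives the NSW to zero, which is why NSW maximization inherently balances the allocation.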
What can(not) be measured with ton-scale dark matter direct detection experiments
Direct searches for dark matter have prompted in recent years a great deal of
excitement within the astroparticle physics community, but the compatibility
between signal claims and null results of different experiments is far from
being a settled issue. In this context, we study here the prospects for
constraining the dark matter parameter space with the next generation of
ton-scale detectors. Using realistic experimental capabilities for a wide range
of targets (including fluorine, sodium, argon, germanium, iodine and xenon),
the role of target complementarity is analysed in detail while including the
impact of astrophysical uncertainties in a self-consistent manner. We show
explicitly that a multi-target signal in future direct detection facilities can
determine the sign of the ratio of scalar couplings, but not its
scale. This implies that the scalar-proton cross-section is left essentially
unconstrained if the assumption is relaxed. Instead, we find that
both the axial-proton cross-section and the ratio of axial couplings
can be measured with fair accuracy if multi-ton instruments using sodium and
iodine eventually come online. Moreover, it turns out that future direct
detection data can easily discriminate between elastic and inelastic
scatterings. Finally, we argue that, with weak assumptions regarding the WIMP
couplings and the astrophysics, only the dark matter mass and the inelastic
parameter (i.e. mass splitting) may be inferred from the recoil spectra --
specifically, we anticipate an accuracy of tens of GeV (tens of keV) in the
measurement of the dark matter mass (inelastic parameter). Comment: 31 pages, 7 figures, 7 tables
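The reason the mass splitting is imprinted on the recoil spectrum follows from standard direct-detection kinematics; the textbook relation below (not quoted from the paper) shows that the minimum WIMP speed required to produce a given recoil energy grows with the inelastic parameter, suppressing low-energy recoils.

```latex
% Standard direct-detection kinematics, stated for orientation:
% minimum WIMP speed to produce nuclear recoil energy E_R on a nucleus
% of mass m_N, with mass splitting \delta (elastic scattering is
% \delta = 0) and WIMP-nucleus reduced mass \mu.
v_{\min}(E_R) = \frac{1}{\sqrt{2 m_N E_R}}
                \left( \frac{m_N E_R}{\mu} + \delta \right),
\qquad
\mu = \frac{m_\chi m_N}{m_\chi + m_N}
```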