Pion-less effective field theory for atomic nuclei and lattice nuclei
We compute the medium-mass nuclei 16O and 40Ca using pionless
effective field theory (EFT) at next-to-leading order (NLO). The low-energy
coefficients of the EFT Hamiltonian are adjusted to experimental data for
nuclei with mass numbers A = 2 and 3, or alternatively to results from
lattice quantum chromodynamics (QCD) at an unphysical pion mass of 806 MeV. The
EFT is implemented through a discrete variable representation in the harmonic
oscillator basis. This approach ensures rapid convergence with respect to the
size of the model space and facilitates the computation of medium-mass nuclei.
At NLO the nuclei 16O and 40Ca are bound with respect to decay into
alpha particles. Binding energies per nucleon are 9-10 MeV and 30-40 MeV at
pion masses of 140 MeV and 806 MeV, respectively. Comment: 26 pages
Association of Intraocular Pressure With Human Immunodeficiency Virus
PURPOSE: Prior studies have shown an association between human immunodeficiency virus (HIV) and reduced intraocular pressure (IOP). The purpose of this study was to determine if patients with HIV on highly active antiretroviral therapy (HAART) had any difference in their IOP compared with patients without HIV or with HIV who were not on HAART. DESIGN: Retrospective cross-sectional study. METHODS: We included 400 patients from our academic eye center between 2000 and 2016. Group 1 (G1) consisted of patients with HIV on HAART (n = 176), Group 2 (G2) consisted of patients with HIV who were not on HAART (n = 48), and Group 3 (G3) consisted of controls without HIV (n = 176). An analysis of variance (ANOVA) was performed to compare mean IOP values. Multivariate linear and logistic regression models were performed to assess factors impacting IOP. Difference in IOP was the primary outcome measured. RESULTS: The mean IOPs in mm Hg were 13.7 +/- 5.1 (G1), 13.1 +/- 3.6 (G2), and 17.3 +/- 3.8 (G3), P < .01. In regression modeling, having a CD4 count […] CONCLUSIONS: Absolute CD4 counts may play a role in IOP fluctuations. This association was found in patients with HIV regardless of whether patients were on HAART.
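The one-way ANOVA used above to compare mean IOP across the three groups reduces to a ratio of between-group to within-group variance. A minimal pure-Python sketch follows; the sample values are invented toy data, not the study's measurements.

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic for a list of samples (one list per group)."""
    k = len(groups)                       # number of groups
    n = sum(len(g) for g in groups)       # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (k - 1 degrees of freedom)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (n - k degrees of freedom)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy IOP-like samples in mm Hg (illustrative only)
g1 = [13.0, 14.2, 13.9, 13.5]
g2 = [13.4, 12.8, 13.1]
g3 = [17.5, 17.0, 17.4, 17.3]
F = one_way_anova_F([g1, g2, g3])
```

A large F, as here, indicates that the group means differ far more than chance within-group scatter would explain, which is then converted to a P value against the F distribution.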
Large-scale exact diagonalizations reveal low-momentum scales of nuclei
Ab initio methods aim to solve the nuclear many-body problem with controlled
approximations. Virtually exact numerical solutions for realistic interactions
can only be obtained for certain special cases such as few-nucleon systems.
Here we extend the reach of exact diagonalization methods to handle very
large model spaces on a single compute node. This allows
us to perform no-core shell model (NCSM) calculations for 6Li in very large
model spaces and to reveal the 4He+d halo structure of this
nucleus. Still, the use of a finite harmonic-oscillator basis implies
truncations in both infrared (IR) and ultraviolet (UV) length scales. These
truncations impose finite-size corrections on observables computed in this
basis. We perform IR extrapolations of energies and radii computed in the NCSM
and with the coupled-cluster method at several fixed UV cutoffs. It is shown
that this strategy enables information gain also from data that is not fully UV
converged. IR extrapolations improve the accuracy of relevant bound-state
observables for a range of UV cutoffs, thus making them profitable tools. We
relate the momentum scale that governs the exponential IR convergence to the
threshold energy for the first open decay channel. Using large-scale NCSM
calculations we numerically verify this small-momentum scale of finite nuclei. Comment: Minor revisions. Accepted for publication in Physical Review
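The exponential IR convergence described above is commonly parameterized as E(L) = E_inf + a * exp(-2 * k_inf * L), where L is the effective box size of the oscillator basis. With the momentum scale k_inf fixed, the fit is linear in (E_inf, a). The sketch below fits synthetic, invented energies; the numbers are not from the paper.

```python
import math

def ir_extrapolate(L_vals, E_vals, k_inf):
    """Fit E(L) = E_inf + a * exp(-2 * k_inf * L) by linear least squares.

    With k_inf fixed, the model is a straight line in x = exp(-2*k_inf*L),
    so ordinary normal equations give the slope a and intercept E_inf.
    """
    x = [math.exp(-2.0 * k_inf * L) for L in L_vals]
    n = len(x)
    sx, sy = sum(x), sum(E_vals)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, E_vals))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    E_inf = (sy - a * sx) / n                      # L -> infinity limit
    return E_inf, a

# Synthetic, noiseless energies in MeV (illustrative values only)
k_inf, E_true, a_true = 0.4, -32.0, 15.0
L_vals = [6.0, 7.0, 8.0, 9.0, 10.0]
E_vals = [E_true + a_true * math.exp(-2.0 * k_inf * L) for L in L_vals]
E_inf, a = ir_extrapolate(L_vals, E_vals, k_inf)
```

On noiseless data the fit recovers the asymptotic energy exactly; on real NCSM output the residuals indicate how well the chosen k_inf describes the IR convergence.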
Dataplane Specialization for High-performance OpenFlow Software Switching
OpenFlow is an amazingly expressive dataplane programming language, but this
expressiveness comes at a severe performance price, as switches must do
excessive packet classification in the fast path. The prevalent OpenFlow
software switch architecture is therefore built on flow caching, but this
imposes intricate limitations on the workloads that can be supported
efficiently and may even open the door to malicious cache overflow attacks. In
this paper we argue that instead of enforcing the same universal flow-cache
semantics on all OpenFlow applications and optimizing for the common case, a
switch should rather automatically specialize its dataplane piecemeal with
respect to the configured workload. We introduce ESwitch, a novel switch
architecture that uses on-the-fly template-based code generation to compile any
OpenFlow pipeline into efficient machine code, which can then be readily used
as the fast path. We present a proof-of-concept prototype and demonstrate on
illustrative use cases that ESwitch yields a simpler architecture, superior
packet processing speed, improved latency and CPU scalability, and predictable
performance. Our prototype can easily scale beyond 100 Gbps on a single Intel
blade even with complex OpenFlow pipelines.
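The idea of compiling a pipeline instead of interpreting it can be illustrated with a toy sketch: each rule's match fields are inlined as comparisons in generated source, so the resulting function contains no generic classification loop. ESwitch itself emits machine code; the Python codegen, field names, and rules below are invented for illustration.

```python
def compile_pipeline(rules):
    """Compile a list of (match_dict, action) rules into one specialized
    function, rather than interpreting the rule table per packet."""
    lines = ["def fast_path(pkt):"]
    for match, action in rules:
        # Inline each match as a chain of field comparisons
        cond = " and ".join(f"pkt[{k!r}] == {v!r}" for k, v in match.items())
        lines.append(f"    if {cond}:")
        lines.append(f"        return {action!r}")
    lines.append("    return 'CONTROLLER'")  # table miss goes to the controller
    namespace = {}
    exec("\n".join(lines), namespace)
    return namespace["fast_path"]

# Toy two-rule pipeline (invented fields and actions, illustration only)
rules = [
    ({"eth_type": 0x0800, "ip_dst": "10.0.0.1"}, "OUTPUT:1"),
    ({"eth_type": 0x0806}, "OUTPUT:2"),
]
fast_path = compile_pipeline(rules)
```

Because the match logic is baked into straight-line code at compile time, the per-packet cost is a few comparisons instead of a table walk, which is the essence of the dataplane specialization argued for above.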
Voltage Stability Analysis of Grid-Connected Wind Farms with FACTS: Static and Dynamic Analysis
Recent analysis of several major blackouts and power-system failures shows that voltage instability has been one of the main causes of these disturbances and network collapses. In this paper, a systematic approach to voltage stability analysis using various techniques is presented for the IEEE 14-bus case study. Static analysis is used to analyze the voltage stability of the system under study, while dynamic analysis is used to evaluate the performance of compensators. The static techniques used are power flow, V-P curve analysis, and Q-V modal analysis. In this study, Flexible Alternating Current Transmission System (FACTS) devices, namely Static Synchronous Compensators (STATCOMs) and Static Var Compensators (SVCs), are used as reactive power compensators, with the aim of keeping the violated voltage magnitudes of the weak buses within the acceptable limits defined in ANSI C84.1. Simulation results confirm that both STATCOMs and SVCs can be used effectively to enhance static voltage stability and increase the network loadability margin. Additionally, the dynamic analysis results show that STATCOMs have superior performance in dynamic voltage stability enhancement compared with SVCs.
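Q-V modal analysis examines the eigenvalues of the reduced Jacobian J_R = J_QV - J_Qth * J_Pth^-1 * J_PV; the operating point is voltage stable when all eigenvalues are positive. A minimal 2x2 sketch with invented Jacobian sub-blocks (not a real power-flow solution) follows.

```python
import math

def inv2(m):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mul2(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sub2(m, n):
    return [[m[i][j] - n[i][j] for j in range(2)] for i in range(2)]

def eig2(m):
    """Closed-form (real) eigenvalues of a 2x2 matrix, smallest first."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

def reduced_jacobian_eigs(J_Pth, J_PV, J_Qth, J_QV):
    """Eigenvalues of J_R = J_QV - J_Qth * J_Pth^-1 * J_PV."""
    J_R = sub2(J_QV, mul2(mul2(J_Qth, inv2(J_Pth)), J_PV))
    return eig2(J_R)

# Toy 2x2 Jacobian sub-blocks (invented numbers, illustration only)
eigs = reduced_jacobian_eigs([[2.0, 0.0], [0.0, 2.0]],   # dP/dtheta
                             [[1.0, 0.0], [0.0, 1.0]],   # dP/dV
                             [[1.0, 0.0], [0.0, 1.0]],   # dQ/dtheta
                             [[3.0, 0.0], [0.0, 4.0]])   # dQ/dV
stable = all(e > 0 for e in eigs)
```

The smallest eigenvalue measures proximity to voltage collapse, which is what compensators such as STATCOMs and SVCs act to improve.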
Module networks revisited: computational assessment and prioritization of model predictions
The solution of high-dimensional inference and prediction problems in
computational biology is almost always a compromise between mathematical theory
and practical constraints such as limited computational resources. As time
progresses, computational power increases but well-established inference
methods often remain locked in their initial suboptimal solution. We revisit
the approach of Segal et al. (2003) to infer regulatory modules and their
condition-specific regulators from gene expression data. In contrast to their
direct optimization-based solution we use a more representative centroid-like
solution extracted from an ensemble of possible statistical models to explain
the data. The ensemble method automatically selects a subset of most
informative genes and builds a quantitatively better model for them. Genes
which cluster together in the majority of models produce functionally more
coherent modules. Regulators which are consistently assigned to a module are
more often supported by literature, but a single model always contains many
regulator assignments not supported by the ensemble. Reliably detecting
condition-specific or combinatorial regulation is particularly hard in a single
optimum but can be achieved using ensemble averaging. Comment: 8 pages REVTeX, 6 figures
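The "genes which cluster together in the majority of models" criterion can be sketched as consensus clustering: count how often each gene pair co-clusters across the ensemble, then take connected components of the majority-pair graph as modules. The tiny ensemble and gene names below are invented for illustration.

```python
from itertools import combinations

def consensus_modules(partitions, threshold=0.5):
    """Group genes that share a cluster in more than `threshold` of the
    ensemble's partitions (each partition maps gene -> cluster label)."""
    genes = sorted(partitions[0])
    n = len(partitions)
    # Count, for every gene pair, how often the two genes co-cluster
    together = {pair: 0 for pair in combinations(genes, 2)}
    for p in partitions:
        for g1, g2 in together:
            if p[g1] == p[g2]:
                together[(g1, g2)] += 1
    # Connected components of the majority-pair graph are the modules
    adj = {g: set() for g in genes}
    for (g1, g2), cnt in together.items():
        if cnt / n > threshold:
            adj[g1].add(g2)
            adj[g2].add(g1)
    modules, seen = [], set()
    for g in genes:
        if g in seen:
            continue
        stack, comp = [g], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        modules.append(sorted(comp))
    return modules

# Toy ensemble of three clusterings (invented gene names)
ens = [
    {"a": 0, "b": 0, "c": 1, "d": 1},
    {"a": 0, "b": 0, "c": 1, "d": 0},
    {"a": 1, "b": 1, "c": 0, "d": 0},
]
mods = consensus_modules(ens)
```

Pairs that co-cluster only occasionally, like a-d above, are exactly the single-model assignments that the abstract warns are not supported by the ensemble.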
Offline Signature Verification by Combining Graph Edit Distance and Triplet Networks
Biometric authentication by means of handwritten signatures is a challenging
pattern recognition task, which aims to infer a writer model from only a
handful of genuine signatures. In order to make it more difficult for a forger
to attack the verification system, a promising strategy is to combine different
writer models. In this work, we propose to complement a recent structural
approach to offline signature verification based on graph edit distance with a
statistical approach based on metric learning with deep neural networks. On the
MCYT and GPDS benchmark datasets, we demonstrate that combining the structural
and statistical models leads to significant improvements in performance,
profiting from their complementary properties.
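One common way to combine a structural and a statistical verifier, sketched below, is score-level fusion: z-normalize each model's dissimilarity scores and average them. This is an illustrative scheme with invented numbers, not the paper's exact combination rule.

```python
def combine_scores(ged_scores, net_scores, weight=0.5):
    """Fuse two dissimilarity scores per questioned signature by
    z-normalizing each model's outputs and taking a weighted average."""
    def znorm(xs):
        m = sum(xs) / len(xs)
        sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
        return [(x - m) / sd for x in xs]
    g, t = znorm(ged_scores), znorm(net_scores)
    return [weight * gi + (1 - weight) * ti for gi, ti in zip(g, t)]

# Toy dissimilarities for four questioned signatures (invented values);
# lower means closer to the claimed writer's reference signatures
ged = [0.2, 0.9, 0.3, 0.8]   # graph-edit-distance based scores
net = [0.1, 0.7, 0.4, 0.9]   # metric-learning network scores
fused = combine_scores(ged, net)
# Accept a signature if its fused score falls below a decision threshold
decisions = [s < 0.0 for s in fused]
```

Normalizing before averaging matters because the two models produce scores on different scales; after fusion a single threshold separates genuine signatures from forgeries.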
SEISMICITY ANOMALIES OF M 5.0+ EARTHQUAKES IN CHILE DURING 1964-2015
The study of the magnitude-frequency distribution of earthquake hazards in a region remains a crucial analysis in seismology. Its significance ranges from seismicity quantification to earthquake prediction. The present study analyzed seismicity anomalies of magnitude M ≥ 5.0 earthquakes in Chile from 1964 to 2015 with a view to reporting the trend of earthquake occurrences in the region. Chile has an area of about 756,950 km2 with an extensive coastline of approximately 6,435 km. It is situated in a highly seismically and volcanically active zone, occupying a long, narrow strip of land between the Andes Mountains to the east and the Pacific Ocean to the west. It borders Peru to the north, Bolivia to the northeast, Argentina to the east, and the Drake Passage in the far south. Of a total of 3,893 earthquakes documented historically, Richter magnitudes 5.0 to 5.9 represent 92.6%, magnitudes 6.0 to 6.9 represent 6.8%, magnitudes 7.0 to 7.9 represent 0.6%, and magnitudes 8.0 to 8.9 about 0.1%. The earthquake productivity (a-value)
was estimated at 8.4. The b-value was estimated using the Gutenberg-Richter (GR) and Maximum Likelihood Estimation (MLE) methods, yielding 0.97 and 1.1 respectively, for an average b-value of ≈ 1. The present study
supports the conclusion that Chile is seismically very active and prone to the recurrence of moderate-to-great earthquakes in the future.
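The MLE b-value reported above is typically computed with Aki's estimator, b = log10(e) / (mean(M) - Mc), applied to magnitudes at or above the completeness magnitude Mc. The sketch below uses a synthetic catalogue drawn from a Gutenberg-Richter distribution with a true b-value of 1.0; it is not the Chile data.

```python
import math
import random

def b_value_mle(mags, m_c):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value:
    b = log10(e) / (mean(M) - Mc), for magnitudes M >= Mc."""
    complete = [m for m in mags if m >= m_c]
    mean_m = sum(complete) / len(complete)
    return math.log10(math.e) / (mean_m - m_c)

# Synthetic catalogue: magnitudes above Mc = 5.0 are exponentially
# distributed with rate beta = b * ln(10), here with true b = 1.0
random.seed(0)
beta = 1.0 * math.log(10)
mags = [5.0 + random.expovariate(beta) for _ in range(20000)]
b = b_value_mle(mags, 5.0)
```

With a large catalogue the estimate lands close to the true b-value, mirroring the ≈ 1 average reported for Chile, where b ≈ 1 is the typical value for tectonically active regions.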