Estimation of conditional laws given an extreme component
Let (X, Y) be a bivariate random vector. The estimation of a probability of
the form P(Y <= y | X > t) is challenging when t is large, and a
fruitful approach consists in studying, if it exists, the limiting conditional
distribution of the random vector (X, Y), suitably normalized, given that X
is large. There already exists a wide literature on bivariate models for which
this limiting distribution exists. In this paper, a statistical analysis of
this problem is carried out. Estimators of the limiting distribution (which is assumed
to exist) and the normalizing functions are provided, as well as an estimator
of the conditional quantile function when the conditioning event is extreme.
Consistency of the estimators is proved and a functional central limit theorem
for the estimator of the limiting distribution is obtained. The small sample
behavior of the estimator of the conditional quantile function is illustrated
through simulations.
Comment: 32 pages, 5 figures
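The empirical idea behind such estimators can be illustrated with a toy sketch: condition on exceedances of a high threshold for X and take the empirical distribution and quantiles of Y among those exceedances. This is a simplified stand-in, not the paper's actual estimators; the data-generating model, function names, and threshold level below are invented for illustration.

```python
import numpy as np

def conditional_cdf_given_exceedance(x, y, threshold, y_grid):
    """Empirical CDF of Y among observations with X above the threshold."""
    exceed = y[x > threshold]
    if exceed.size == 0:
        raise ValueError("no exceedances above threshold")
    return np.array([(exceed <= t).mean() for t in y_grid])

def conditional_quantile_given_exceedance(x, y, threshold, p):
    """Empirical p-quantile of Y given that X exceeds the threshold."""
    return np.quantile(y[x > threshold], p)

# toy bivariate sample with dependence on x (illustration only)
rng = np.random.default_rng(0)
x = rng.exponential(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)
t = np.quantile(x, 0.99)                 # extreme conditioning level
q50 = conditional_quantile_given_exceedance(x, y, t, 0.5)
```

The paper's estimators additionally estimate the normalizing functions so that the conditional limit is nondegenerate; this sketch skips that step and works directly on the raw exceedances.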
Synthesis of fuzzy automatic control systems by genetic algorithms with vector criteria in the MATLAB environment
Problems of multicriteria parametric synthesis of control systems are reduced to the optimization of vector objective functions, whose solution keeps the synthesis process within the admissible region. Binary and continuous genetic algorithms are modified for optimizing the vector objective functions of automatic control systems. The effectiveness of the modified genetic algorithms for control system synthesis via optimization of vector objective functions is demonstrated. Consideration of synthesis problems for linear and fuzzy PID controllers showed that the fuzzy-controller synthesis problem involves a vector of adjustable parameters of higher dimension, and that the control-system model replaces linear equations with nonlinear equations based on a fuzzy inference system
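The reduction of multicriteria synthesis to optimizing a vector objective can be sketched with a minimal real-coded genetic algorithm that minimizes a weighted sum of the criteria. This is a Python illustration under invented settings (a toy two-term objective, blend crossover, Gaussian mutation), not the paper's modified binary and continuous GAs, which run in MATLAB.

```python
import numpy as np

def genetic_minimize(vector_objective, weights, bounds, pop=60, gens=120, seed=0):
    """Minimal real-coded GA minimizing a weighted sum of vector criteria.
    Illustrative sketch only, not the paper's modified MATLAB algorithms."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(pop, lo.size))
    fitness = lambda p: np.array([weights @ vector_objective(ind) for ind in p])
    for _ in range(gens):
        f = fitness(x)
        i, j = rng.integers(pop, size=(2, pop))        # tournament selection
        parents = np.where((f[i] < f[j])[:, None], x[i], x[j])
        a = rng.uniform(size=(pop, 1))                 # blend (arithmetic) crossover
        children = a * parents + (1.0 - a) * parents[::-1]
        children += rng.normal(0.0, 0.02 * (hi - lo), children.shape)  # mutation
        x = np.clip(children, lo, hi)                  # stay in admissible region
    f = fitness(x)
    return x[f.argmin()], f.min()

# toy vector criterion standing in for, e.g., overshoot and settling-time terms
objective = lambda v: np.array([(v[0] - 1.0) ** 2, (v[1] + 2.0) ** 2])
best, val = genetic_minimize(objective, np.array([0.5, 0.5]),
                             [(-5.0, 5.0), (-5.0, 5.0)])
```

The clipping step is what keeps candidate parameter vectors inside the admissible region, mirroring the constraint-handling role the abstract assigns to the vector-objective formulation.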
Classification of non-Riemannian doubled-yet-gauged spacetime
Assuming O(D,D) covariant fields as the `fundamental' variables,
Double Field Theory can accommodate novel geometries where a Riemannian metric
cannot be defined, even locally. Here we present a complete classification of
such non-Riemannian spacetimes in terms of two non-negative integers,
(n, n̄). Upon these backgrounds, strings become
chiral and anti-chiral over n and n̄ directions respectively, while
particles and strings are frozen over the n + n̄ directions. In
particular, we identify which values of these two integers correspond to
Riemannian manifolds, to non-relativistic spacetime, to the Gomis-Ooguri
non-relativistic string, to ultra-relativistic Carroll geometry, and to
Siegel's chiral string. Combined with a covariant Kaluza-Klein ansatz, which
we further spell out, one such background leads to Newton-Cartan gravity.
Alternative to the conventional
string compactifications on small manifolds, such non-Riemannian spacetimes
may open a new scheme of dimensional reduction from ten to four dimensions.
Comment: 1+41 pages; v2) Refs added; v3) Published version; v4) Sign error in
(2.51) corrected
The Freezeout Hypersurface at LHC from particle spectra: Flavor and Centrality Dependence
We extract the freezeout hypersurface in Pb-Pb collisions at √s_NN = 2.76 TeV at the CERN Large Hadron Collider by analysing the data on
transverse momentum spectra within a unified model for chemical and kinetic
freezeout. The study has been done within two different schemes of freezeout,
single freezeout where all the hadrons freezeout together versus double
freezeout where those hadrons with non-zero strangeness content have different
freezeout parameters compared to the non-strange ones. We demonstrate that the
data is better described within the latter scenario. We obtain a strange
freezeout hypersurface which is smaller in volume and hotter compared to the
non-strange freezeout hypersurface for all centralities. We observe from the
extracted parameters that
the ratio of the transverse size to the freezeout proper time is invariant
under expansion from the strange to the non-strange freezeout surfaces across
all centralities. Moreover, except for the most peripheral bins, the ratio of
the non-strange and strange freezeout proper times stays close to a common value.
Comment: Final version accepted for publication
Comparison of long-term mortality risk following normal exercise vs adenosine myocardial perfusion SPECT
A higher frequency of clinical events has been observed in patients undergoing pharmacological vs exercise myocardial perfusion single-photon emission computed tomography (SPECT). While this difference is attributed to greater age and co-morbidities, it is not known whether these tests also differ in prognostic ability among patients with similar clinical profiles.
We assessed all-cause mortality rates in 6,069 patients, followed for 10.2 ± 1.7 years after undergoing exercise or adenosine SPECT. We employed propensity analysis to match exercise and adenosine subgroups by age, gender, symptoms, and coronary risk factors. Within our propensity-matched cohorts, adenosine patients had an annualized mortality rate that was more than twice that of exercise patients (3.9% vs 1.6%, P < .0001). Differences in mortality persisted among age groups, including those <55 years old. In the exercise cohort, mortality was inversely related to exercise duration, with comparable mortality noted for patients exercising <3 min and those undergoing adenosine testing.
Among patients with normal stress SPECT tests, those undergoing adenosine testing manifest a mortality rate that is substantially higher than that observed among adequately exercising patients, but comparable to that observed among very poorly exercising patients. This elevated risk underscores an important challenge for managing patients undergoing pharmacological stress testing.
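The matching-plus-rates pipeline described above can be sketched as follows. The propensity model here is a plain gradient-descent logistic regression and the matcher a greedy caliper match on the score; both are illustrative stand-ins rather than the study's actual methodology, and all data are synthetic.

```python
import numpy as np

def logistic_propensity(X, treated, iters=500, lr=0.1):
    """P(pharmacological test | covariates) via gradient-descent logistic
    regression (illustrative stand-in for the study's propensity model)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (treated - p) / len(X)
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def greedy_match(score, treated, caliper=0.05):
    """1:1 nearest-neighbour matching on the propensity score."""
    c_idx = list(np.flatnonzero(~treated))
    pairs = []
    for t in np.flatnonzero(treated):
        if not c_idx:
            break
        d = np.abs(score[c_idx] - score[t])
        k = int(d.argmin())
        if d[k] <= caliper:
            pairs.append((t, c_idx.pop(k)))
    return pairs

def annualized_mortality(deaths, followup_years):
    """Deaths per 100 person-years of follow-up."""
    return 100.0 * deaths.sum() / followup_years.sum()

# synthetic cohort: older patients more likely to get the pharmacological test
rng = np.random.default_rng(1)
age = rng.uniform(40, 80, 2000)
treated = rng.uniform(size=2000) < 1.0 / (1.0 + np.exp(-(age - 60.0) / 5.0))
score = logistic_propensity((age[:, None] - 60.0) / 10.0, treated.astype(float))
pairs = greedy_match(score, treated)
```

Matching on the score rather than on raw covariates is what lets the study compare adenosine and exercise patients with similar clinical profiles before contrasting their annualized mortality.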
Development and evaluation of an open source software tool for deidentification of pathology reports
BACKGROUND: Electronic medical records, including pathology reports, are often used for research purposes. Currently, there are few programs freely available to remove identifiers while leaving the remainder of the pathology report text intact. Our goal was to produce an open source, Health Insurance Portability and Accountability Act (HIPAA) compliant, deidentification tool tailored for pathology reports. We designed a three-step process for removing potential identifiers. The first step is to look for identifiers known to be associated with the patient, such as name, medical record number, pathology accession number, etc. Next, a series of pattern matches look for predictable patterns likely to represent identifying data, such as dates, accession numbers and addresses, as well as patient, institution and physician names. Finally, individual words are compared with a database of proper names and geographic locations. Pathology reports from three institutions were used to design and test the algorithms. The software was improved iteratively on training sets until it exhibited good performance. 1800 new pathology reports were then processed. Each report was reviewed manually before and after deidentification to catalog all identifiers and note those that were not removed.
RESULTS: 1254 (69.7%) of 1800 pathology reports contained identifiers in the body of the report. 3439 (98.3%) of 3499 unique identifiers in the test set were removed. Only 19 HIPAA-specified identifiers (mainly consult accession numbers and misspelled names) were missed. Of 41 non-HIPAA identifiers missed, the majority were partial institutional addresses and ages. Outside consultation case reports typically contain numerous identifiers and were the most challenging to deidentify comprehensively. There was variation in performance among reports from the three institutions, highlighting the need for site-specific customization, which is easily accomplished with our tool.
CONCLUSION: We have demonstrated that it is possible to create an open-source deidentification program which performs well on free-text pathology reports.
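The three-step strategy, known per-patient identifiers, then pattern matching, then dictionary lookup, can be sketched in a few lines of Python. Every pattern, placeholder token, and dictionary entry below is an invented example, not a rule from the published tool.

```python
import re

# step 1: identifiers known to belong to this patient (invented examples)
KNOWN_PATIENT_FIELDS = ["John Doe", "MRN 1234567", "S05-98765"]

# step 2: predictable identifier shapes (invented example patterns)
PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
    (re.compile(r"\b[A-Z]{1,3}\d{2}-\d{3,6}\b"), "[ACCESSION]"),
    (re.compile(r"\b\d{1,5}\s+\w+\s+(Street|St|Avenue|Ave|Road|Rd)\b"), "[ADDRESS]"),
    (re.compile(r"\bDr\.\s+[A-Z][a-z]+\b"), "[PHYSICIAN]"),
]

# step 3: dictionary of proper names and places (tiny invented sample)
PROPER_NAMES = {"springfield", "johnson", "baltimore"}

def deidentify(text, known=KNOWN_PATIENT_FIELDS):
    for item in known:                       # step 1: exact known identifiers
        text = text.replace(item, "[ID]")
    for pattern, token in PATTERNS:          # step 2: pattern matches
        text = pattern.sub(token, text)
    words = [("[NAME]" if w.strip(".,;").lower() in PROPER_NAMES else w)
             for w in text.split(" ")]       # step 3: dictionary lookup
    return " ".join(words)
```

A real deployment would load site-specific patterns and a full name/place database, which is exactly the customization the report says its tool makes easy.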
Strong Interactions of Single Atoms and Photons near a Dielectric Boundary
Modern research in optical physics has achieved quantum control of strong
interactions between a single atom and one photon within the setting of cavity
quantum electrodynamics (cQED). However, to move beyond current
proof-of-principle experiments involving one or two conventional optical
cavities to more complex scalable systems that employ N >> 1 microscopic
resonators requires the localization of individual atoms on distance scales <
100 nm from a resonator's surface. In this regime an atom can be strongly
coupled to a single intracavity photon while at the same time experiencing
significant radiative interactions with the dielectric boundaries of the
resonator. Here, we report an initial step into this new regime of cQED by way
of real-time detection and high-bandwidth feedback to select and monitor single
Cesium atoms localized ~100 nm from the surface of a micro-toroidal optical
resonator. We employ strong radiative interactions of atom and cavity field to
probe atomic motion through the evanescent field of the resonator. Direct
temporal and spectral measurements reveal both the significant role of
Casimir-Polder attraction and the manifestly quantum nature of the atom-cavity
dynamics. Our work sets the stage for trapping atoms near micro- and
nano-scopic optical resonators for applications in quantum information science,
including the creation of scalable quantum networks composed of many
atom-cavity systems that interact coherently via exchanges of single
photons.
Comment: 8 pages, 5 figures, Supplemental Information included as ancillary file