Locally optimal control of continuous variable entanglement
We consider a system of two bosonic modes each subject to the dynamics
induced by a thermal Markovian environment and we identify instantaneous, local
symplectic controls that minimise the loss of entanglement in the Gaussian
regime. By minimising the decrease of the logarithmic negativity at every
instant in time, we show that a non-trivial, finite amount of local
squeezing helps to counter the effect of decoherence during the evolution. We
also determine optimal control routines in the more restrictive scenario where
the control operations are applied on only one of the two modes. We find that
applying an instantaneous control only at the beginning of the dynamics, i.e.
preparing an appropriate initial state, is the optimal strategy for states with
symmetric correlations and when the dynamics is the same on both modes. More
generally, even in asymmetric cases, the delayed decay of entanglement
resulting from the optimal preparation of the initial state with no further
action turns out to be always very close to the optimised control where
multiple operations are applied during the evolution. Our study extends
directly to mono-symmetric systems of any number of modes, i.e. to systems that
are invariant under any local permutation of the modes within any one
partition, as they are locally equivalent to two-mode systems.
Comment: 10 pages, 6 figures, still no joke
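For context, the logarithmic negativity whose decrease is minimised above can be computed from a two-mode covariance matrix via the smallest symplectic eigenvalue of the partially transposed state. A minimal sketch, not the paper's code, using the convention in which the vacuum covariance matrix is the identity and the natural logarithm:

```python
import numpy as np

def log_negativity(sigma):
    """Logarithmic negativity of a two-mode Gaussian state with 4x4
    covariance matrix sigma (x1, p1, x2, p2 ordering).  Uses the
    standard formula E_N = max(0, -ln(nu_minus)), where nu_minus is
    the smallest symplectic eigenvalue of the partial transpose."""
    A, B, C = sigma[:2, :2], sigma[2:, 2:], sigma[:2, 2:]
    # Seralian invariant of the partially transposed covariance matrix
    delta = np.linalg.det(A) + np.linalg.det(B) - 2.0 * np.linalg.det(C)
    nu_minus_sq = (delta - np.sqrt(delta**2 - 4.0 * np.linalg.det(sigma))) / 2.0
    return max(0.0, -0.5 * np.log(nu_minus_sq))

# Example: a two-mode squeezed vacuum with squeezing r has E_N = 2r
r = 0.5
c, s = np.cosh(2 * r), np.sinh(2 * r)
Z = np.diag([1.0, -1.0])
sigma_tmsv = np.block([[c * np.eye(2), s * Z], [s * Z, c * np.eye(2)]])
```

On a pure two-mode squeezed vacuum this recovers E_N = 2r exactly, which makes it a convenient check of any implementation.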
Reconfiguration overhead for way-adaptable D-NUCA cache memories: impact on performance and power consumption
This thesis studies the reconfiguration overhead of way-adaptable D-NUCA cache memories, with the aim of limiting power consumption while keeping performance at a good level
Digital Enablers for New Decision Journeys: Creating and Adopting Digital Touch Points – Sorgenia
The case study highlights the effort of a utility company to rethink its marketing approach, reorganizing activities along the Customer Decision Journey. The main management challenges were: (A) Identifying digital enablers for CDJs while adapting the business to digital innovation; (B) Highlighting Sorgenia's positioning as the first non-incumbent Italian energy company in an overcrowded market formerly characterized by a semi-monopolistic regime, subsequent to the market liberalization foreseen for July 2020; (C) Creating and adopting digital touchpoints to personalize the energy offers, to increase simplicity and to reinforce environmental sustainability in the long term; (D) Underlining Sorgenia's distinctiveness as a Digital Energy Company
Outlier admissions of medical patients: Prognostic implications of outlying patients. The experience of the Hospital of Mestre
ABSTRACT
The admission of a patient to wards other than the appropriate ones, known as the patient outlying phenomenon, involves
both the Medicine and Geriatric Units of many hospitals. To learn more about the prognosis of outlying patients,
we investigated 3828 consecutive patients hospitalized in Medicine and Geriatrics of our hub Hospital during the year 2012.
We compared patients' mean hospital length of stay, survival, and early readmission according to their outlying status. The
mean hospital length of stay did not significantly differ between the two groups, either for Medicine (9.8 days for outliers and
10.0 for in-ward) or Geriatrics (13.0 days for both). However, after adjustment for age and sex, the risk of death was about
twice as high for outlier patients admitted into surgical compared to medical areas (hazard ratio 1.8, 95% confidence interval 1.2-2.5).
Readmission within 90 days from the first discharge was more frequent for patients admitted as outliers (26.1% vs
14.2%, P<0.0001). We highlight some critical aspects of an overcrowded hospital, such as the shortage of beds in Medicine and Geriatrics
and the potential increased clinical risk, denoted by deaths or early readmissions, for medical outlier patients assigned
to inappropriate wards. There is a need to reorganize bed allocation involving community services, improve in-hospital bed
management, and extend diagnostic procedures to outlier patients admitted to nonmedical wards
Approximation of Bayesian Hawkes process models with Inlabru
Hawkes processes are very popular mathematical tools for modelling phenomena
exhibiting a \textit{self-exciting} or \textit{self-correcting} behaviour.
Typical examples are earthquake occurrence, wildfires, droughts,
capture-recapture, violent crime, trade exchanges, and social network activity.
The widespread use of Hawkes processes in different fields calls for fast,
reproducible, reliable, easy-to-code techniques to implement such models. We
offer a technique to perform approximate Bayesian inference of Hawkes process
parameters based on the use of the R-package \inlabru. The \inlabru R-package,
in turn, relies on the INLA methodology to approximate the posterior of the
parameters. Our Hawkes process approximation is based on a decomposition of the
log-likelihood into three parts, which are linearly approximated separately. The
linear approximation is performed with respect to the mode of the parameters'
posterior distribution, which is determined with an iterative gradient-based
method. The approximation of the posterior parameters is therefore
deterministic, ensuring full reproducibility of the results. The proposed
technique only requires the user to provide the functions to calculate the
different parts of the decomposed likelihood, which are internally linearly
approximated by the R-package \inlabru. We provide a comparison with the
\bayesianETAS R-package which is based on an MCMC method. The two techniques
provide similar results but our approach requires two to ten times less
computational time to converge, depending on the amount of data.
Comment: 20 pages, 7 figures, 5 tables
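To make the modelling setting concrete: an exponential-kernel Hawkes process has a conditional intensity built from a baseline rate plus decaying contributions of past events, and its log-likelihood splits into a sum of log-intensities minus an integrated intensity (the compensator), which is the kind of term-wise structure a decomposition can exploit. A minimal Python sketch, illustrative only (the paper works in R with inlabru, and its three-part decomposition differs in detail):

```python
import math

def intensity(t, events, mu, alpha, beta):
    """Conditional intensity lambda(t) = mu + sum over past events of
    alpha * beta * exp(-beta * (t - t_i))."""
    return mu + sum(alpha * beta * math.exp(-beta * (t - ti))
                    for ti in events if ti < t)

def log_likelihood(events, T, mu, alpha, beta):
    """Hawkes log-likelihood on [0, T]: the sum of log-intensities at
    the event times minus the compensator, which is available in
    closed form for the exponential kernel."""
    point_term = sum(math.log(intensity(ti, events, mu, alpha, beta))
                     for ti in events)
    compensator = mu * T + sum(alpha * (1.0 - math.exp(-beta * (T - ti)))
                               for ti in events)
    return point_term - compensator

events = [1.0, 2.0, 4.0]
ll = log_likelihood(events, 10.0, mu=0.5, alpha=0.3, beta=1.0)
```

Approximate Bayesian schemes like the one described above work with exactly these two terms, linearising each around the posterior mode of the parameters.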
Adenoma of the Ampulla of Vater: A Genetic Condition?
The etiology of adenoma of the ampulla of Vater is
not well understood. Previous authors reported the
association of this neoplasm with polycystic kidney
disease in two fraternal sisters. They concluded that
these two conditions were somehow related. We
describe a case of ampullary adenoma associated
with polycystic kidney disease. This presentation
raises again the question of a possible link between
these two diseases
Modelling seismicity as a spatio-temporal point process using inlabru
Reliable deterministic prediction of earthquake occurrence is not possible at present, and may never be. In the absence of a reliable deterministic model, we need alternative strategies to manage seismic hazard and risk. This involves making statements of the likelihood of earthquake occurrence in space and time, including a fair and accurate description of the uncertainty around statements used in operational decision-making. Probabilistic Seismic Hazard Analysis (PSHA) and Operational Earthquake Forecasting (OEF) have the role of providing probabilistic statements on the hazard associated with earthquakes on long-term (decades to centuries) and short-term (days to decades) time frames respectively. Both
PSHA and OEF rely on a source model able to describe the occurrence of earthquakes.
PSHA source models commonly use a spatially-variable Poisson process to describe earthquake occurrence. Therefore, they are calibrated on declustered catalogues which retain only the largest earthquakes in a sequence. OEF models, on the other hand, are commonly time-dependent models which describe the occurrence of all the events above a certain magnitude threshold, including dependent events such as aftershocks or swarms. They are
calibrated on the full earthquake catalogue and provide accurate descriptions of the clustering process and the time-evolution of earthquake sequences. The Epidemic-Type Aftershock Sequence (ETAS) model is the most commonly used time-dependent seismicity model and belongs to the general class of Hawkes (or self-exciting) processes. Under the ETAS model, any earthquake in the sequence has the ability to induce (or trigger) its own subsequence of earthquakes in a cascade of events, as commonly observed in nature. The earthquake catalogue is then the union of a set of events occurring independently from each other (background events) and a set of events which have been induced or triggered by
another (aftershocks).
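The triggering mechanism just described is encoded in the ETAS conditional intensity: a constant background rate plus, for each past event, an Omori-Utsu power-law decay in time scaled by an exponential productivity law in magnitude. A schematic Python version, with parameter names following the common textbook convention rather than necessarily this thesis's notation:

```python
import math

def etas_intensity(t, history, mu, K, alpha, c, p, m0):
    """ETAS conditional intensity at time t.

    history : list of (time, magnitude) pairs for past earthquakes
    mu      : background rate (events per unit time)
    K, alpha: productivity scale and magnitude sensitivity
    c, p    : Omori-Utsu temporal decay parameters
    m0      : magnitude threshold of the catalogue
    """
    rate = mu
    for ti, mi in history:
        if ti < t:
            # each past event contributes a decaying aftershock rate,
            # larger for larger-magnitude parents
            rate += K * math.exp(alpha * (mi - m0)) / (t - ti + c) ** p
    return rate

history = [(0.0, 5.0), (1.0, 4.0)]
rate_early = etas_intensity(0.1, history, mu=0.2, K=0.1, alpha=1.5, c=0.01, p=1.1, m0=3.0)
rate_late = etas_intensity(5.0, history, mu=0.2, K=0.1, alpha=1.5, c=0.01, p=1.1, m0=3.0)
```

The intensity spikes immediately after each event and relaxes back towards the background rate, which is the cascade behaviour the text describes.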
The reliability of PSHA or OEF strategies depends upon the reliability of the source model used to describe earthquake occurrence. In order to improve the source model, we need the ability to (a) incorporate hypotheses on earthquake occurrence in a model, and (b) validate the model against observed data. Both tasks are problematic. Indeed, the complex mathematical form of the ETAS model requires ad-hoc methodologies to perform inference on the model parameters. These methodologies then need further modification if the classical
ETAS model is adjusted to introduce new hypotheses. Comparing forecasts produced by models that incorporate different hypotheses and are calibrated with different methods is problematic because it is difficult (if not impossible) to determine where the differences in the forecasts come from. Therefore, a unique framework capable of supporting ETAS models incorporating different hypotheses would be beneficial. Similarly, the validation step has to be done on models calibrated on the same data and producing forecasts for the same
spatio-temporal region. Moreover, the validation must ultimately be done against future data, unknown at the moment the forecasts are produced, to ensure that no information about the data used to validate the models is incorporated in the models themselves. Hence, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been founded with the role of gathering forecasting models and running fully-prospective forecasting experiments in an open environment. CSEP ensures that the models are validated fairly and using a set
of community-agreed metrics which measure the agreement between forecasts and data on the outcomes.
In this thesis, I present and apply a new Bayesian approximation technique for Hawkes process models (including ETAS). I also demonstrate the importance of one of the statistical properties that scores used to rank competing forecasts need to have in order to provide trustworthy results. The Bayesian framework allows an accurate description of the uncertainty around model parameters which can then be propagated to any quantity of interest. In
the context of Bayesian statistics, the most commonly used techniques to perform inference are Markov Chain Monte Carlo (MCMC) techniques, which are sampling-based methods. Instead, I use the Integrated Nested Laplace Approximation (INLA) to provide a deterministic approximation of the parameter posterior distribution. INLA is faster than MCMC for problems involving a large number of correlated parameters and offers an alternative way to implement complex statistical models which are infeasible (from a computational point of view) with MCMC. This provides researchers and practitioners with a statistical framework to formulate ETAS models incorporating different hypotheses, produce forecasts that account for uncertainty, and test them using CSEP procedures. I build
on the work done to implement time-independent models for seismicity with INLA, which provided a framework to study the effect of covariates such as depth, GPS displacement, heat flow, strain rate, and distance to the nearest fault, but lacked the ability to describe the clustering process of earthquakes. I show that this work can be extended to include time-dependent Hawkes process models and run in a reasonable computational time using INLA. In this framework, the information from covariates can be incorporated both in modelling the rate of background events and in modelling the number of aftershocks. This resembles how information on covariates is incorporated in Generalized Linear Models (GLMs), which are widely used to study the effect of covariates on a range of phenomena. Indeed, this
work offers a way to borrow ideas and techniques used with GLMs and apply them to seismicity analyses. To make the proposed technique widely accessible, I have developed a new R-package called ETAS.inlabru which offers user-friendly access to the proposed methodology. The ETAS.inlabru package is based on the inlabru R-package which offers access to the INLA methodology. In this thesis, I compare our approach with the MCMC technique implemented through the bayesianETAS package and show that ETAS.inlabru provides similar results to bayesianETAS, but is faster, scales more efficiently as the amount of data increases, and can support a wider range of ETAS models, specifically those involving multiple covariates. I believe that this work provides users with a reliable Bayesian framework for the
ETAS model, alleviating the burden of modifying/coding their own optimization routines and allowing more flexibility in the range of hypotheses that can be incorporated and validated. In this thesis, I have analysed the 2009 L'Aquila and 2016 Amatrice seismic sequences that occurred in central Italy and found that the depth of the events has a negative effect on aftershock productivity, and that models involving covariates show a better fit to the data than the classical ETAS model.
On the statistical properties that scores need to possess to provide trustworthy rankings of competing forecasts, I focus on the notion of proper scores. I show that the Parimutuel Gambling (PG) score, used to rank forecasts in previous CSEP experiments, has been used in situations in which it is not proper. Indeed, I demonstrate that the PG score is proper only in a specific situation and improper in general. I compare its performance with two proper alternatives: the Brier and the Logarithmic (Log) scores. The simulation procedure employed for this part of the thesis can be easily adapted to study the properties of other validation procedures, such as those used in CSEP, or to determine important quantities for the experimental design, such as the amount of data with which the comparison should be performed. This contributes to the wider discussion on the statistical properties of CSEP tests, and is an additional step in determining the sanity checks that scoring rules have to pass before being used to validate earthquake forecasts in CSEP experiments
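Propriety means that, in expectation, a forecaster does best by reporting their true belief. A small sketch for binary event forecasts illustrating the two proper alternatives mentioned above (illustrative only, not the thesis's simulation code):

```python
import math

def brier(p, outcome):
    """Brier score of forecast probability p for a binary outcome in {0, 1};
    negatively oriented (lower is better) and strictly proper."""
    return (p - outcome) ** 2

def log_score(p, outcome):
    """Logarithmic score, also negatively oriented and strictly proper."""
    return -math.log(p if outcome == 1 else 1.0 - p)

def expected_score(score, p_forecast, p_true):
    """Expected score when the event truly occurs with probability p_true."""
    return (p_true * score(p_forecast, 1)
            + (1.0 - p_true) * score(p_forecast, 0))

# A strictly proper score is minimised, in expectation, at the truth:
p_true = 0.3
honest = expected_score(brier, 0.3, p_true)   # forecast equals the truth
hedged = expected_score(brier, 0.5, p_true)   # a hedged, uninformative forecast
```

Here `honest < hedged`: reporting the true probability beats hedging, which is exactly the property an improper score like the PG score can fail to guarantee.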
The Role of Strategic Fundraising in Marketing Plan of Non-profit Companies: The Case of Susan G. Komen Italy
The case is mainly focused on the consequences of the Covid-19 pandemic on Susan G. Komen Italia, on all the initiatives put in place by the association, and on the role of Strategic Fundraising in NPOs' marketing plans.
The Italian charity sector is populated by small and medium entities competing to "win" donors supporting their cause; in turn, organisations of smaller scale have to compete with larger NPOs, both cancer and non-cancer focused, well-known throughout the national territory, which are favoured by their brand equity and their past reputation, leading to intense competition for the potential "donor wallet". This framework has undergone several challenges related to the worldwide spread of Covid-19: the pandemic caused dramatic changes in terms of events and initiatives to be rescheduled or rearranged, favouring the shift to online instruments. The "next normality" driven by the lockdown has become a trigger for the creativity of marketers in several different fields, including the NPO sector. With this in mind, this paper highlights Komen's effort in Italy to rethink its marketing and fundraising approach, reorganizing activities along the Donor Decision Journey. The case focuses on analysing strategic fundraising as the key element of NPOs' marketing plans
- …