Informing implementation of quality improvement in Australian primary care
Background: Quality Improvement (QI) initiatives in primary care are effective at improving uptake of evidence-based guidelines, but are difficult to implement and sustain. In Australia, meso-level health organisations such as Primary Health Care Organisations (PHCOs) offer new opportunities to implement area-wide QI programs. This study sought to identify enablers of and barriers to implementation of an existing Australian QI program, and to identify strategic directions that PHCOs can use in the ongoing development of QI in this environment.
Methods: Semi-structured telephone interviews were conducted with 15 purposively selected program staff and participants from the Australian Primary Care Collaborative (APCC) QI program. Interviewees included seven people involved in design, administration and implementation of the APCC program and eight primary care providers (seven General Practitioners (GPs) and one practice nurse) who had participated in the program from 2004 to 2014. Interviewees were asked to describe their experience of the program and reflect on what enabled or impeded its implementation. Interviews were recorded, transcribed and iteratively analysed, with early analysis informing subsequent interviews. Identified themes and their implications were reviewed by a GP expert reference group.
Results: Implementation enablers and barriers were grouped into five thematic areas: (1) leadership, particularly the identification and utilisation of change champions; (2) organisational culture that supports quality improvement; (3) funding incentives that support a culture of quality and innovation; (4) access to and use of accurate data; and (5) design and utilisation of clinical systems that enable and support improvement in these areas. In all of these areas, the active involvement of an overarching external support organisation was considered a key ingredient of successful implementation.
Conclusion: There are substantial opportunities for PHCOs to play a pivotal role in QI implementation in Australia and internationally. In developing QI programs and policies, such organisations ought to invest their efforts in: (1) identifying and mentoring local leaders; (2) fostering QI culture via development of local peer networks; (3) developing and advocating for alternative funding models to support and incentivise these activities; (4) investing in data and audit tool infrastructure; and (5) facilitating systems implementation within primary care practices.
Fitting Weibull ACD Models to High Frequency Transactions Data: A Semi-parametric Approach based on Estimating Functions
Autoregressive conditional duration (ACD) models play an important role in financial modeling. This paper considers the estimation of the Weibull ACD model using a semi-parametric approach based on the theory of estimating functions (EF). We apply the EF and the maximum likelihood (ML) methods to a data set given in Tsay (2003, p. 203) to compare the two methods. It is shown that the EF approach is easier to apply in practice and gives better estimates than the MLE, while remaining comparable with the ML method in parameter estimation. Furthermore, the EF approach is computationally much faster than the MLE and therefore offers a significant reduction in estimation time.
Keywords: Volatility, Option pricing, Volatility of volatility, Forecasting
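As a rough illustration of the model being estimated (not the paper's EF estimator), the following sketch simulates a Weibull ACD(1,1) with unit-mean errors and fits its parameters by maximum likelihood; the parameter values, starting values, and sample size are illustrative assumptions rather than the Tsay (2003) data.

```python
# A minimal sketch: simulate a Weibull ACD(1,1) and fit it by maximum likelihood.
# Parameter names and values (omega, alpha, beta, shape) are illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(0)

def simulate_weibull_acd(n, omega, alpha, beta, shape):
    scale = 1.0 / gamma_fn(1.0 + 1.0 / shape)   # makes E[eps] = 1
    x = np.empty(n)
    psi = omega / (1.0 - alpha - beta)          # start at the unconditional mean
    for i in range(n):
        eps = scale * rng.weibull(shape)
        x[i] = psi * eps                        # x_i = psi_i * eps_i
        psi = omega + alpha * x[i] + beta * psi # psi_{i+1} = omega + alpha*x_i + beta*psi_i
    return x

def neg_loglik(params, x):
    omega, alpha, beta, shape = params
    if min(omega, alpha, beta, shape) <= 0 or alpha + beta >= 1:
        return np.inf
    scale = 1.0 / gamma_fn(1.0 + 1.0 / shape)
    psi = np.mean(x)
    ll = 0.0
    for xi in x:                                # Weibull log-density of x_i given psi_i
        z = xi / (psi * scale)
        ll += np.log(shape) - np.log(psi * scale) + (shape - 1) * np.log(z) - z ** shape
        psi = omega + alpha * xi + beta * psi
    return -ll

x = simulate_weibull_acd(2000, omega=0.1, alpha=0.1, beta=0.8, shape=1.2)
res = minimize(neg_loglik, x0=[0.05, 0.05, 0.7, 1.0], args=(x,), method="Nelder-Mead")
print(res.x)   # estimates of (omega, alpha, beta, shape)
```

The unit-mean constraint on the Weibull errors (scale = 1/Gamma(1 + 1/shape)) is what identifies psi_i as the conditional expected duration.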
Comparison of Alternative ACD Models via Density and Interval Forecasts: Evidence from the Australian Stock Market
In this paper a number of alternative ACD models are compared using a sample of data for three major companies traded on the Australian Stock Exchange. The comparison is performed by employing the methodologies for evaluating density and interval forecasts developed by Diebold, Gunther and Tay (1998) and Christoffersen (1998), respectively. Our main finding is that the generalized gamma and log-normal distributions for the error terms have similar performance and perform better than the exponential and Weibull distributions. Additionally, there seems to be no substantial difference between the standard ACD specification of Engle and Russell (1998) and the log-ACD specification of Bauwens and Giot (2000).
Keywords: ACD models, Density forecasts
Acknowledgements: This paper forms part of an ARC Linkage Grant research project, "Modelling stock market liquidity in Australia and the Asia Pacific Region". We are grateful to the Australian Research Council for financial support. The financial data have been graciously provided by the Securities Research Institute (SIRCA), which is our industry partner.
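A minimal sketch of the density-forecast evaluation idea used above: probability integral transforms (PITs) of the observed durations, which should be approximately iid uniform under a correctly specified conditional density (Diebold, Gunther and Tay 1998). The fitted ACD forecasts are replaced by a simulated placeholder series and conditional-mean path, so all numbers are purely illustrative.

```python
# PIT-based density forecast evaluation on placeholder data.
import numpy as np
from scipy.stats import kstest
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(1)

# Placeholder "true" data: unit-mean Weibull(1.3) errors around a known psi path.
n = 3000
shape_true = 1.3
scale_true = 1.0 / gamma_fn(1.0 + 1.0 / shape_true)
psi = 1.0 + 0.5 * np.sin(np.linspace(0, 20, n))          # stand-in conditional means
x = psi * scale_true * rng.weibull(shape_true, size=n)

def pit_exponential(x, psi):
    """PIT assuming unit-mean exponential ACD errors."""
    return 1.0 - np.exp(-x / psi)

def pit_weibull(x, psi, shape):
    """PIT assuming unit-mean Weibull ACD errors with the given shape."""
    scale = 1.0 / gamma_fn(1.0 + 1.0 / shape)
    return 1.0 - np.exp(-(x / (psi * scale)) ** shape)

# Under the correct (Weibull) density the PITs should be ~U(0,1); under the
# misspecified exponential density the uniformity test should tend to reject.
for name, z in [("exponential", pit_exponential(x, psi)),
                ("weibull", pit_weibull(x, psi, shape_true))]:
    stat, pval = kstest(z, "uniform")
    print(f"{name:12s}  KS stat={stat:.3f}  p-value={pval:.3f}")
```

Interval forecasts in the sense of Christoffersen (1998) can be assessed from the same PITs by checking the coverage and independence of indicator variables for the PITs falling inside a nominal interval.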
Finite Sample Properties of the QMLE for the Log-ACD Model: Application to Australian Stocks
This paper is concerned with the finite sample properties of the Quasi Maximum Likelihood Estimator (QMLE) of the Logarithmic Autoregressive Conditional Duration (Log-ACD) model. Although the distribution of the QMLE for the Log-ACD model is unknown, it is an important issue because the model is widely used for testing various market microstructure models and effects. Knowledge of the distribution of the QMLE is crucial for valid inference and diagnostic checking. This paper investigates the structural and statistical properties of the Log-ACD model by establishing the relationship between the Log-ACD model and the Autoregressive-Moving Average (ARMA) model. The theoretical results developed in the paper are evaluated using Monte Carlo experiments. The experimental results also provide insights into the finite sample properties of the Log-ACD model under different distributional assumptions.
Keywords: Conditional duration, Asymmetry, ACD, Log-ACD, Monte Carlo simulation
Acknowledgement: The authors are grateful for the financial support of the Australian Research Council.
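The Log-ACD/ARMA link the abstract refers to can be sketched as follows, assuming the Bauwens-Giot Log-ACD1 form x_i = exp(psi_i) * eps_i with psi_i = omega + alpha*ln x_{i-1} + beta*psi_{i-1} (the paper may use a variant of this specification); taking logs gives an ARMA(1,1) representation for y_i = ln x_i:

```latex
% Sketch of the Log-ACD(1,1) -> ARMA(1,1) mapping, assuming the Log-ACD1 form above.
\begin{align}
  y_i &:= \ln x_i = \psi_i + \ln\varepsilon_i, \qquad
  \psi_i = \omega + \alpha\, y_{i-1} + \beta\, \psi_{i-1} \\
  y_i - \ln\varepsilon_i &= \omega + \alpha\, y_{i-1}
      + \beta\,\bigl(y_{i-1} - \ln\varepsilon_{i-1}\bigr) \\
  y_i &= \bigl[\omega + (1-\beta)\mu\bigr] + (\alpha+\beta)\, y_{i-1}
      + u_i - \beta\, u_{i-1},
\end{align}
% with mu = E[ln eps_i] and u_i = ln eps_i - mu, so y_i follows an ARMA(1,1)
% with AR coefficient (alpha + beta) and MA coefficient (-beta).
```

This mapping is what allows ARMA-based results to inform the structural and finite-sample properties of the Log-ACD QMLE examined in the Monte Carlo experiments.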
Nonlinear time series and neural-network models of exchange rates between the US dollar and major currencies
This paper features an analysis of major currency exchange rate movements in relation to the US dollar, as constituted in US dollar terms. The euro, British pound, Chinese yuan, and Japanese yen are modelled using a variety of nonlinear models, including smoot
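A minimal sketch of the neural-network side of such an analysis, assuming a small feed-forward network fitted to lagged daily log-returns; the series, lag length, and network size below are placeholders, not the paper's specification.

```python
# Fit a small feed-forward network to lagged returns and forecast one step ahead.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
r = rng.normal(0.0, 0.006, size=1500)           # placeholder for daily log-returns

p = 5                                           # number of lags used as inputs
X = np.column_stack([r[i:len(r) - p + i] for i in range(p)])
y = r[p:]                                       # target: next day's return

split = 1200
model = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                     max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])

forecasts = model.predict(X[split:])
rmse = np.sqrt(np.mean((forecasts - y[split:]) ** 2))
print(f"out-of-sample RMSE: {rmse:.5f}")
```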
An integrated general practice and pharmacy-based intervention to promote the use of appropriate preventive medications among individuals at high cardiovascular disease risk: protocol for a cluster randomized controlled trial
Background: Cardiovascular diseases (CVD) are responsible for significant morbidity, premature mortality, and economic burden. Despite established evidence that supports the use of preventive medications among patients at high CVD risk, treatment gaps remain. Building on prior evidence and a theoretical framework, a complex intervention has been designed to address these gaps among high-risk, under-treated patients in the Australian primary care setting. This intervention comprises a general practice quality improvement tool incorporating clinical decision support and audit/feedback capabilities; availability of a range of CVD polypills (fixed-dose combinations of two blood pressure lowering agents, a statin ± aspirin) for prescription when appropriate; and access to a pharmacy-based program to support long-term medication adherence and lifestyle modification.
Methods: Following a systematic development process, the intervention will be evaluated in a pragmatic cluster randomized controlled trial including 70 general practices for a median period of 18 months. The 35 general practices in the intervention group will work with a nominated partner pharmacy, whereas those in the control group will provide usual care without access to the intervention tools. The primary outcome is the proportion of patients at high CVD risk, inadequately treated at baseline, who achieve target blood pressure (BP) and low-density lipoprotein cholesterol (LDL-C) levels at the study end. The outcomes will be analyzed using data from electronic medical records, extracted with a validated extraction tool. Detailed process and economic evaluations will also be performed.
Discussion: The study intends to establish evidence about an intervention that combines technological innovation with team collaboration between patients, pharmacists, and general practitioners (GPs) for CVD prevention.
Trial registration: Australian New Zealand Clinical Trials Registry ACTRN1261600023342
Cross-correlations of the Lyman-alpha forest with weak lensing convergence I: Analytical Estimates of S/N and Implications for Neutrino Mass and Dark Energy
We expect a detectable correlation between two seemingly unrelated quantities: the four point function of the cosmic microwave background (CMB) and the amplitude of flux decrements in quasar (QSO) spectra. The amplitude of CMB convergence in a given direction measures the projected surface density of matter. Measurements of QSO flux decrements trace the small-scale distribution of gas along a given line-of-sight. While the cross-correlation between these two measurements is small for a single line-of-sight, upcoming large surveys should enable its detection. This paper presents analytical estimates of the signal-to-noise (S/N) for measurements of the cross-correlation between the flux decrement and the convergence, and for measurements of the cross-correlation between the variance in flux decrement and the convergence. For the ongoing BOSS (SDSS III) and Planck surveys, we estimate an S/N of 30 and 9.6 for these two correlations. For the proposed BigBOSS and ACTPOL surveys, we estimate an S/N of 130 and 50 respectively. Since the cross-correlation between the variance in flux decrement and the convergence is proportional to the fourth power of σ_8, the amplitude of these cross-correlations can potentially be used to measure the amplitude of matter fluctuations, σ_8, at z~2 to 2.5% with BOSS and Planck, and even better with future data sets. These measurements have the potential to test alternative theories for dark energy and to constrain the mass of the neutrino. The large potential signal estimated in our analytical calculations motivates tests with non-linear hydrodynamical simulations and analyses of upcoming data sets.
Comment: 24 pages, 9 figures
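As a toy illustration of how such a cross-correlation detection scales (not the paper's analytical calculation, which accounts for survey geometry and correlated noise), the significance of an estimated correlation coefficient r over N effectively independent sightline pairs grows roughly as r times sqrt(N); all numbers below are made up.

```python
# Toy S/N scaling for a cross-correlation between flux decrements and convergence.
import numpy as np

rng = np.random.default_rng(3)

def cross_corr_sn(n_sightlines, r_true):
    """Simulate N correlated (delta_F, kappa) pairs and return the measured S/N."""
    cov = [[1.0, r_true], [r_true, 1.0]]
    delta_f, kappa = rng.multivariate_normal([0.0, 0.0], cov, size=n_sightlines).T
    r_hat = np.corrcoef(delta_f, kappa)[0, 1]
    return r_hat * np.sqrt(n_sightlines)        # approximate detection significance

for n in (10_000, 100_000, 1_000_000):
    print(n, round(cross_corr_sn(n, r_true=0.01), 1))
```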
Inflationary Cosmology and Quantization Ambiguities in Semi-Classical Loop Quantum Gravity
In loop quantum gravity, modifications to the geometrical density cause a
self-interacting scalar field to accelerate away from a minimum of its
potential. In principle, this mechanism can generate the conditions that
subsequently lead to slow-roll inflation. The consequences for this mechanism
of various quantization ambiguities arising within loop quantum cosmology are
considered. For the case of a quadratic potential, it is found that some
quantization procedures are more likely to generate a phase of slow--roll
inflation. In general, however, loop quantum cosmology is robust to ambiguities
in the quantization and extends the range of initial conditions for inflation.Comment: 15 pages, 8 figure
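For reference, a minimal sketch of the standard slow-roll background dynamics for the quadratic potential mentioned above; the loop-quantum-cosmology modification of the geometrical density is deliberately omitted, and the mass and initial field value are illustrative assumptions only.

```python
# Standard (non-loop-quantum-corrected) inflaton dynamics, V = m^2 phi^2 / 2.
# Units: 8*pi*G = 1; m and phi(0) are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

m = 1e-6                                          # illustrative scalar-field mass

def rhs(t, state):
    phi, dphi, ln_a = state
    V = 0.5 * m**2 * phi**2
    H = np.sqrt((0.5 * dphi**2 + V) / 3.0)        # Friedmann equation with 8*pi*G = 1
    return [dphi, -3.0 * H * dphi - m**2 * phi, H]  # Klein-Gordon + d(ln a)/dt = H

sol = solve_ivp(rhs, (0.0, 3e7), [16.0, 0.0, 0.0], max_step=1e4, rtol=1e-8)
print("e-folds of expansion:", sol.y[2, -1])      # ~ phi0^2 / 4 in slow roll
```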
GARMA, HAR and rules of thumb for modelling realized volatility
This paper features an analysis of the relative effectiveness, in terms of adjusted R-squared, of a variety of methods of modelling realized volatility (RV), namely the use of Gegenbauer processes in Auto-Regressive Moving Average format (GARMA), as opposed to Heterogeneous Auto-Regressive (HAR) models and simple rules of thumb. The analysis is applied to two data sets that feature the RV of the S&P500 index, sampled at 5-minute intervals, provided by the Oxford-Man RV database. The GARMA model does perform slightly better than the HAR model, but both models are matched by a simple rule-of-thumb regression model based on lags of squared, cubed and quartic demeaned daily returns.
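A minimal sketch of the two benchmark regressions named above: the HAR-RV regression on daily, weekly and monthly RV averages, and a rule-of-thumb regression on powers of demeaned daily returns. The exact lag structure of the rule-of-thumb model, and all data below, are assumptions for illustration.

```python
# HAR-RV and "rule of thumb" regressions on placeholder data, estimated by OLS.
import numpy as np

rng = np.random.default_rng(4)
n = 1000
ret = rng.normal(0.0, 0.01, size=n)             # placeholder daily returns
rv = ret**2 + 1e-5 * rng.random(n)              # placeholder realized variance

def rolling_mean(x, window):
    """Trailing mean of the previous `window` observations (excluding today)."""
    return np.array([x[t - window:t].mean() for t in range(window, len(x))])

start = 22                                      # longest lag window (one month)
y    = rv[start:]
rv_d = rv[start - 1:-1]                         # yesterday's RV
rv_w = rolling_mean(rv, 5)[start - 5:]          # average RV over the previous week
rv_m = rolling_mean(rv, 22)                     # average RV over the previous month

X_har = np.column_stack([np.ones_like(y), rv_d, rv_w, rv_m])
beta_har, *_ = np.linalg.lstsq(X_har, y, rcond=None)

d = ret - ret.mean()                            # demeaned daily returns
X_rot = np.column_stack([np.ones_like(y),
                         d[start - 1:-1]**2, d[start - 1:-1]**3, d[start - 1:-1]**4])
beta_rot, *_ = np.linalg.lstsq(X_rot, y, rcond=None)

for name, X, b in [("HAR", X_har, beta_har), ("rule of thumb", X_rot, beta_rot)]:
    resid = y - X @ b
    r2 = 1 - resid.var() / y.var()
    print(f"{name:14s} R^2 = {r2:.3f}")
```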
