The Effect of Medicaid Disease Management Programs on Medicaid Expenditures
Disease Management (DM) programs for Medicaid patients with chronic diseases have become very popular, with a majority of states having introduced some type of DM program in the last decade. These programs provide interventions designed to help patients and their health care providers appropriately manage chronic health conditions according to established clinical guidelines. Cost containment has been a key justification for the creation of DM programs, despite mixed evidence that DM actually saves money for the Medicaid program or for society as a whole.
While most studies on the impact of DM focus on estimating the impact of a single DM program, Chapter 2 estimates the average, national impact of state Medicaid DM programs by linking a detailed survey of state Medicaid programs to the nationally representative Medical Expenditure Panel Survey. Difference-in-difference models are used to test the hypothesis that medical expenditures change after a DM program is implemented, exploiting variation in the timing at which state Medicaid programs introduced DM. DM coverage also varies within states over time due to variation in program eligibility by disease, insurance category, and/or county of residence. Although the models estimate the effect of DM imprecisely, point estimates are stable across multiple specifications and indicate that DM programs for common chronic diseases may decrease total medical expenditures, potentially by 10 percent or more.
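As a rough illustration of the difference-in-difference design described above, the sketch below estimates a two-way fixed effects regression with statsmodels. It is not the chapter's code: the file name and the columns spend, dm_covered, state, and year are hypothetical placeholders.

```python
# Minimal DiD sketch: state and year fixed effects absorb level
# differences; dm_covered switches on when/where a state's DM program
# covers the individual. All names below are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("meps_medicaid_panel.csv")  # hypothetical analysis file

model = smf.ols("spend ~ dm_covered + C(state) + C(year)", data=df)
# Cluster standard errors at the state level, the unit of policy variation.
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["state"]})
print(result.params["dm_covered"])  # DiD estimate of DM on expenditures
```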
Chapter 3 evaluates one DM program in the state of Georgia using a proprietary data set. By exploiting a natural experiment that delayed the introduction of high-intensity services for several thousand high- and moderate-risk patients, the research identifies the causal impacts of the program's interventions on total Medicaid expenditures, categories of health care utilization, and other indicators. These delayed patients are observationally similar to those who received interventions at the beginning of the program. After controlling for unobservable individual characteristics, I find the interventions lowered health costs and hospital utilization: health expenditures fell by about 4.4 percent for patients with positive expenditures. Heterogeneous treatment effect analysis indicates that the savings were largest in the most expensive tail of the distribution.
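One common way to probe effects along the expenditure distribution is quantile regression; a minimal sketch is shown below, assuming hypothetical columns spend and treated. It is not the chapter's own specification.

```python
# Hedged sketch of a heterogeneous-treatment-effect check via quantile
# regression; variable and file names are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ga_dm_claims.csv")  # hypothetical claims extract
df = df[df["spend"] > 0]              # patients with positive expenditures

# A larger (more negative) coefficient at q=0.9 than at q=0.5 would
# indicate savings concentrated in the expensive tail, as reported above.
for q in (0.5, 0.75, 0.9):
    fit = smf.quantreg("spend ~ treated", df).fit(q=q)
    print(q, fit.params["treated"])
```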
Identifying the key process factors affecting project performance
Purpose
A construction project traditionally involves a variety of participants. Owners, consultants, and contractors have diverse opinions and interests, but all seek to ensure project success. Success is typically measured as performance output regarding cost, time, and quality. Despite previous research mapping success and failure factors, construction managers seem to have difficulty attaining success. To provide clearer guidance on how to fulfill the success criteria, the purpose of this paper is to identify the underlying factors that affect performance, and thus project success, in construction processes.
Design/methodology/approach
A questionnaire survey based on a literature review provided 25 key process factors divided into five categories. Based on responses from the commonly involved construction parties, the factors were ranked and tested for significant differences between the parties.
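A minimal sketch of this ranking-and-testing step might look as follows. The paper does not specify its test, so a nonparametric Kruskal-Wallis comparison is used here as a stand-in, with hypothetical columns factor, party, and rating.

```python
# Illustrative sketch (not the paper's code): rank the 25 process factors
# by mean rating and test each for differences between the three parties.
import pandas as pd
from scipy.stats import kruskal

df = pd.read_csv("survey_responses.csv")  # hypothetical long-format data

for factor, grp in df.groupby("factor"):
    # One rating vector per party (owners, consultants, contractors).
    groups = [g["rating"].values for _, g in grp.groupby("party")]
    stat, p = kruskal(*groups)  # nonparametric test across the parties
    print(factor, round(grp["rating"].mean(), 2), round(p, 3))
```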
Findings
The top five most important process factors were found to relate to the sharing of knowledge and communication. Moreover, testing the ranking for significant differences between owners, consultants, and contractors revealed five significant differences, relating to the interpretation and importance of trust, shared objectives, project coordination, and alternative forms of coordination.
Originality/value
All respondents identified improved knowledge sharing and communication as the keys to improved cost, time, and quality performance; these are therefore the areas on which construction managers need to focus their resources. Improved experience sharing and communication will thus increase the likelihood of project success by improving competences, commitment, and coordination.
Longitudinal study of Salmonella enterica serovar Typhimurium infection in three Danish farrow-to-finish swineherds
A longitudinal study on Salmonella enterica was carried out in 3 Danish farrow-to-finish swineherds in 2001. Litters from each herd were divided into 2 cohorts of 30 pigs each (180 pigs in total). Individual pigs were examined bacteriologically and serologically every month from weaning to slaughter. At weaning, the individual sows were also examined bacteriologically and serologically. In total, 88 pigs were found to be shedding on ≥1 occasion; only Salmonella enterica serovar Typhimurium was detected. The culture-prevalence peaked in the nursery and subsequently declined to undetectable levels before slaughter. The sero-prevalence peaked approximately 60 days after the peak culture-prevalence. Salmonella was detected in individual fecal samples at least once in 53% of the pigs, while 62% were sero-positive more than once and only 3.7% of all pigs were culture-positive on more than one occasion. The average shedding time was estimated to have been 18 days.
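The prevalence curves described here could be derived from the longitudinal sampling data roughly as below; the file name and the columns sample_day, culture_pos, and sero_pos are assumptions, not the study's data layout.

```python
# Sketch: culture- and sero-prevalence over time from repeated sampling.
import pandas as pd

df = pd.read_csv("cohort_samples.csv")  # hypothetical monthly samples

# Proportion of sampled pigs positive at each sampling round; the gap
# between the two peaks gives the ~60-day culture-to-sero delay.
prev = df.groupby("sample_day")[["culture_pos", "sero_pos"]].mean()
print(prev.idxmax())  # sampling day of peak culture- and sero-prevalence
```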
Bacteriological and serological examination and risk factor analysis of Salmonella occurrence in sow herds, including risk factors for high Salmonella seroprevalence in receiver finishing herds
A strong association between the seroprevalence in sows and the occurrence of Salmonella Typhimurium among weaners has been shown. The risk factors ready-mixed pelleted feed and health status, demonstrated several times for finisher herds, also apply to sow herds. Risk factors at the sow level for high seroprevalence in finishers have been quantified.
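One plausible way to quantify such herd-level risk factors is a logistic regression yielding odds ratios, sketched below with hypothetical variables high_sero, pelleted_feed, and conventional_health; the study's actual model and variable definitions are not reproduced here.

```python
# Hedged sketch of risk factor quantification; all names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

herds = pd.read_csv("sow_herds.csv")  # hypothetical herd-level data

# high_sero = 1 if the herd's seroprevalence exceeds a chosen cut-off;
# the predictors mirror the finisher-herd risk factors named above.
fit = smf.logit("high_sero ~ pelleted_feed + conventional_health", herds).fit()
print(np.exp(fit.params))  # odds ratios for each risk factor
```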
Effect of an optimised pelleted diet on Salmonella prevalence and pig productivity
The effect of an optimised, wheat-based pelleted diet containing barley, sugar beet pulp, and organic acids on Salmonella prevalence and pig productivity was investigated in two finisher herds. The optimised diet significantly reduced Salmonella seroprevalence compared to standard pelleted feed and meal feed. In contrast to previous studies, meal feed did not have a significant Salmonella-reducing effect in this study. The meal diet, but not the optimised diet, had a significant negative effect on pig productivity compared to the standard pelleted diet. Our results show that the optimised diet is a suitable alternative to standard wheat-based pelleted feed or meal feed for reducing Salmonella prevalence in finisher pigs.
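A minimal sketch of comparing seroprevalence across the three diets is shown below, using a chi-square test; the counts are placeholders, not the trial's data, and the trial's actual statistical method is not specified here.

```python
# Sketch: does seroprevalence differ between the three diets?
import numpy as np
from scipy.stats import chi2_contingency

#                  sero-pos  sero-neg   (hypothetical counts per diet)
table = np.array([[12, 188],           # optimised pelleted diet
                  [34, 166],           # standard pelleted feed
                  [30, 170]])          # meal feed
chi2, p, dof, expected = chi2_contingency(table)
print(p)  # small p-value -> seroprevalence differs between diets
```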
Bacteriological and serological examination and risk factor analysis of Salmonella occurrence in sow herds, including risk factors for high Salmonella seroprevalence in receiver finishing herds
A mandatory programme monitoring the occurrence of Salmonella in pork at slaughterhouses and a serological monitoring programme for slaughter-pig herds have been implemented in Denmark since 1993 and 1995, respectively. All results are stored in a central database, from which aggregated weekly results of serological and bacteriological samples collected between January 1995 and July 2000 were extracted. In addition, the reported weekly incidence of human infections with S. Typhimurium covering the same period was obtained. The time series were analysed for trends and cyclic variations by seasonal decomposition. The association between the incidence in humans, the prevalence of Salmonella in pigs and pork, and prevailing weather conditions was analysed using a general linear model (GLM) and a generalized additive model (GAM). Explanatory variables were lagged to account for the time elapsed between sampling, consumption, the incubation period, and case registration. The seasonal decomposition showed an overall declining trend in all three time series, presumably an effect of the implemented Salmonella control measures. All time series exhibited a double-peaked annual cycle. The seasonal variation of the prevalence in pork and the human incidence followed a very similar course, beginning to rise in spring and peaking in August-September. The variables that were both biologically meaningful and statistically significant in both regression models were the prevalence in pork sampled 4 to 5 weeks before case registration, the seroprevalence, measured as the average prevalence of weeks 15 to 35 before case registration, and the air temperature lagged 2 and 3 weeks. Limitations on inferences from overall surveillance data are discussed.
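The two analysis steps, seasonal decomposition of weekly series followed by regression on lagged explanatory variables, could be sketched as follows. The file weekly.csv and the columns human_inc, pork_prev, sero_prev, and temp are assumptions; the lags mirror those named in the abstract, and a Poisson GLM stands in for the paper's exact model.

```python
# Sketch of seasonal decomposition plus lagged regression (not the
# paper's code; all file and column names are assumptions).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.tsa.seasonal import seasonal_decompose

wk = pd.read_csv("weekly.csv", parse_dates=["week"], index_col="week")

# Trend and annual cycle of human incidence (weekly data: period=52).
decomp = seasonal_decompose(wk["human_inc"], period=52)
trend = decomp.trend.dropna()
print(trend.iloc[0], trend.iloc[-1])  # overall decline across the period

# Lag explanatory variables for sampling-to-registration delays:
# pork prevalence 4 weeks back, seroprevalence averaged over weeks
# 15-35 back, and air temperature 2 weeks back.
wk["pork_l4"] = wk["pork_prev"].shift(4)
wk["sero_w15_35"] = wk["sero_prev"].shift(15).rolling(21).mean()
wk["temp_l2"] = wk["temp"].shift(2)

glm = smf.glm("human_inc ~ pork_l4 + sero_w15_35 + temp_l2",
              data=wk.dropna(), family=sm.families.Poisson()).fit()
print(glm.summary())
```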