53 research outputs found

    Model fit after pairwise maximum likelihood

    Get PDF
    Maximum likelihood factor analysis of discrete data within the structural equation modeling framework rests on the assumption that the observed discrete responses are manifestations of underlying continuous scores that are normally distributed. As maximizing the likelihood of multivariate response patterns is computationally very intensive, the sum of the log-likelihoods of the bivariate response patterns is maximized instead. Little is yet known about how to assess model fit when the analysis is based on such a pairwise maximum likelihood (PML) of two-way contingency tables. We propose new fit criteria for the PML method and conduct a simulation study to evaluate their performance in model selection. With large sample sizes (500 or more), PML performs as well as the robust weighted least squares analysis of polychoric correlations.
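    For readers unfamiliar with the estimator, the pairwise objective can be written schematically as follows; the notation is ours, not the paper's, and assumes p ordinal items whose categories are linked to underlying normal variables by thresholds:

        \mathrm{pl}(\theta) = \sum_{i<j} \sum_{a} \sum_{b} n^{(ij)}_{ab} \, \log \pi^{(ij)}_{ab}(\theta)

    Here n^{(ij)}_{ab} is the observed count in cell (a, b) of the two-way contingency table for items i and j, and \pi^{(ij)}_{ab}(\theta) is the model-implied probability of that cell, obtained by integrating the underlying bivariate normal density between the relevant thresholds. Maximizing this sum of bivariate log-likelihoods avoids the high-dimensional integration required by the full multivariate likelihood.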

    The potential of global coastal flood risk reduction using various DRR measures

    Get PDF
    Coastal flood risk is a serious global challenge facing current and future generations. Several disaster risk reduction (DRR) measures have been posited as ways to reduce the deleterious impacts of coastal flooding. On a global scale, however, efforts to model the future effects of DRR measures (beyond structural) are limited. In this paper, we use a global-scale flood risk model to estimate the risk of coastal flooding and to assess and compare the efficacy and economic performance of various DRR measures, namely dykes and coastal levees, dry-proofing of urban assets, zoning restrictions in flood-prone areas, and management of foreshore vegetation. To assess the efficacy of each DRR measure, we determine the extent to which it can limit future flood risk as a percentage of regional GDP to the same proportional value as today (a “relative risk constant” objective). To assess their economic performance, we estimate the economic benefits and costs of implementing each measure. If no DRR measures are implemented to mitigate future coastal flood risk, we estimate expected annual damages to exceed USD 1.3 trillion by 2080, directly affecting an estimated 11.5 million people on an annual basis. Low- and high-end scenarios reveal large ranges of impact uncertainty, especially in lower-income regions. On a global scale, we find the efficacy of dykes and coastal levees in achieving the relative risk constant objective to be 98 %, of dry-proofing to be 49 %, of zoning restrictions to be 11 %, and of foreshore vegetation to be 6 %. In terms of direct costs, the overall figure is largest for dry-proofing (USD 151 billion) and dykes and coastal levees (USD 86 billion), much more than those of zoning restrictions (USD 27 million) and foreshore vegetation (USD 366 million). These two more expensive DRR measures also exhibit the largest potential range of direct costs. While zoning restrictions and foreshore vegetation achieve the highest global benefit–cost ratios (BCRs), they also provide the smallest magnitude of overall benefit. We show that there are large regional patterns in both the efficacy and economic performance of modelled DRR measures that display much potential for flood risk reduction, especially in regions of the world that are projected to experience large amounts of population growth. Over 90 % of sub-national regions in the world can achieve their relative risk constant targets if at least one of the investigated DRR measures is employed. While future research could assess the indirect costs and benefits of these four and other DRR measures, as well as their subsequent hybridization, here we demonstrate to global and regional decision makers the case for investing in DRR now to mitigate future coastal flood risk.
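    As a rough illustration of the two evaluation criteria described above, the sketch below checks a "relative risk constant" objective and computes a benefit-cost ratio for a single measure; the function names and the regional numbers are hypothetical and are not taken from the paper's model.

        # Illustrative sketch only: checking the "relative risk constant" objective
        # (future expected annual damage, as a share of GDP, no larger than today's
        # share) and a simple benefit-cost ratio for one DRR measure.

        def relative_risk_constant_met(ead_future, gdp_future, ead_today, gdp_today):
            """True if flood risk relative to regional GDP does not grow."""
            return (ead_future / gdp_future) <= (ead_today / gdp_today)

        def benefit_cost_ratio(ead_without_measure, ead_with_measure, cost):
            """Benefits are avoided expected annual damages; costs are implementation costs."""
            return (ead_without_measure - ead_with_measure) / cost

        # Hypothetical regional values in USD billions per year:
        print(relative_risk_constant_met(ead_future=4.0, gdp_future=900.0,
                                         ead_today=1.5, gdp_today=400.0))                   # False: risk outgrows GDP
        print(benefit_cost_ratio(ead_without_measure=4.0, ead_with_measure=1.2, cost=0.8))  # 3.5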

    How to avoid a rise in the water level?

    Get PDF
    East Africa’s Lake Victoria provides resources and services to millions of people on the lake’s shores and abroad. In particular, the lake’s fisheries are an important source of protein, employment, and international economic connections for the whole region. Nonetheless, stock dynamics are poorly understood and currently unpredictable. Furthermore, fishery dynamics are intricately connected to other supporting services of the lake as well as to lakeshore societies and economies. Much research has been carried out piecemeal on different aspects of Lake Victoria’s system, e.g., societies, biodiversity, fisheries, and eutrophication. However, to disentangle drivers and dynamics of change in this complex system, we need to put these pieces together and analyze the system as a whole. We did so by first building a qualitative model of the lake’s social-ecological system. We then investigated the model system through a qualitative loop analysis, and finally examined effects of changes on the system state and structure. The model and its contextual analysis allowed us to investigate system-wide chain reactions resulting from disturbances. Importantly, we built a tool that can be used to analyze the cascading effects of management options and establish the requirements for their success. We found that high connectedness of the system at the exploitation level, through fisheries having multiple target stocks, can increase the stocks’ vulnerability to exploitation but reduce society’s vulnerability to variability in individual stocks. We describe how there are multiple pathways to any change in the system, which makes it difficult to identify the root cause of changes but also broadens the management toolkit. Also, we illustrate how nutrient enrichment is not a self-regulating process, and that explicit management is necessary to halt or reverse eutrophication. This model is simple and can be used to assess system-wide effects of management policies, and can serve as a stepping stone for future quantitative analyses of system dynamics at local scales.
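    To make the loop-analysis step concrete, the sketch below encodes a toy signed community matrix and reads off qualitative press-perturbation predictions from the signs of -inv(A); the variables and interaction signs are invented for illustration and are not the paper's model of Lake Victoria.

        # Minimal qualitative press-perturbation (loop analysis) sketch on a toy
        # signed community matrix; entries give only the sign of each interaction.
        import numpy as np

        labels = ["nutrients", "phytoplankton", "fish", "fishery"]   # hypothetical variables
        A = np.array([
            [-1, -1,  0,  0],   # nutrients: self-damping, taken up by phytoplankton
            [ 1, -1, -1,  0],   # phytoplankton: fed by nutrients, grazed by fish
            [ 0,  1, -1, -1],   # fish: eat phytoplankton, harvested by the fishery
            [ 0,  0,  1, -1],   # fishery: effort tracks fish abundance
        ], dtype=float)

        # For a sustained (press) input to variable i, the predicted direction of
        # change in variable j is the sign of element [j, i] of -inv(A).
        response = -np.linalg.inv(A)
        for i, pressed in enumerate(labels):
            effects = ", ".join(f"{labels[j]}: {np.sign(response[j, i]):+.0f}"
                                for j in range(len(labels)))
            print(f"press on {pressed:13s} -> {effects}")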

    Secondary production and energetics of the shrimp Caridina nilotica in Lake Victoria, East Africa: model development and application

    Full text link
    Measurements of body mass, carbon content, respiration, growth, and egestion are combined in a model of secondary production by the tropical freshwater shrimp Caridina. The model is developed to permit its direct application to empirical data for abundances and size frequency distributions of field populations. Model calculations combined with population data for offshore Lake Victoria over a period of two years indicate that Caridina consume the equivalent of 2.2% of annual lake primary production. Present net annual secondary production by the shrimp is an order of magnitude greater than the present fishery yield of the lake. Detritus-fed experimental organisms evidently had assimilation efficiencies as low as 10% by model calculation.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/42892/1/10750_2004_Article_BF00031923.pd
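    The budget arithmetic behind the assimilation-efficiency figure above can be sketched as follows; the per-individual values are hypothetical, chosen only to reproduce an efficiency of roughly 10%, and are not taken from the paper's measurements.

        # Illustrative carbon-budget arithmetic for an individual shrimp
        # (hypothetical values in mg C per day).

        def assimilation_efficiency(growth, respiration, egestion):
            """Assimilation = growth + respiration; ingestion additionally includes egested material."""
            assimilation = growth + respiration
            ingestion = assimilation + egestion
            return assimilation / ingestion

        # High egestion relative to growth and respiration drives efficiency down,
        # consistent with the ~10% reported for detritus-fed animals.
        print(assimilation_efficiency(growth=0.02, respiration=0.03, egestion=0.45))  # 0.1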

    Calibrating ADL-IADL scales to improve measurement accuracy and to extend the disability construct into the preclinical range: a systematic review

    Get PDF
    Background: Interest in measuring functional status among nondisabled older adults has increased in recent years. This is, in part, due to the notion that adults identified as 'high risk' for functional decline portray a state that is potentially easier to reverse than overt disability. Assessing relatively healthy older adults with traditional self-report measures (activities of daily living) has proven difficult because these instruments were initially developed for institutionalised older adults. Perhaps less evident are problems associated with change scores and the potential for 'construct under-representation', which reflects the exclusion of important features of the construct (e.g., disability). Furthermore, establishing a formal hierarchy of functional status tells more than the typical simple summation of functional loss, and may have predictive value to the clinician monitoring older adults: if the sequence of task difficulty is accelerated or out of order, it may indicate the need for interventions.
    Methods: This review identified studies that employed item response theory (IRT) to examine or revise functional status scales. IRT can be used to transform the ordinal nature of functional status scales to interval-level data, which serves to increase diagnostic precision and sensitivity to clinical change. Furthermore, IRT can be used to rank items unequivocally along a hierarchy based on difficulty. It should be noted that this review is not concerned with contrasting IRT with more traditional classical test theory methodology.
    Results: A systematic search of four databases (PubMed, Embase, CINAHL, and PsychInfo) resulted in the review of 2,192 manuscripts. Of these manuscripts, twelve met our inclusion/exclusion requirements and thus were targeted for further inspection.
    Conclusions: Manuscripts presented in this review appear to summarise gerontology's best efforts to improve construct validity and content validity (i.e., ceiling effects) for scales measuring the early stages of activity restriction in community-dwelling older adults. Several scales in this review were exceptional at reducing ceiling effects, reducing gaps in coverage along the construct, as well as establishing a formal hierarchy of functional decline. These instrument modifications make it plausible to detect minor changes in difficulty for IADL items positioned at the edge of the disability continuum, which can be used to signal the onset of progressive-type disability in older adults.
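    As a rough illustration of how IRT difficulty parameters yield a formal item hierarchy, the sketch below orders a handful of invented ADL/IADL items by difficulty under a two-parameter logistic model; the item names and parameter values are made up for illustration and are not taken from the reviewed scales.

        # Toy 2PL model: order hypothetical ADL/IADL items by estimated difficulty,
        # giving a hierarchy from "easiest to perform" to "first to be lost".
        import math

        items = {                          # (discrimination a, difficulty b) -- invented values
            "feeding":           (1.8, -2.0),
            "dressing":          (1.5, -1.0),
            "shopping":          (1.2,  0.3),
            "managing finances": (1.0,  1.1),
        }

        def p_independent(theta, a, b):
            """2PL probability of performing the item independently at ability theta."""
            return 1.0 / (1.0 + math.exp(-a * (theta - b)))

        for name in sorted(items, key=lambda k: items[k][1]):
            a, b = items[name]
            print(f"{name:18s} difficulty={b:+.1f}  P(independent | theta=0)={p_independent(0.0, a, b):.2f}")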

    Item response theory analysis of cognitive tests in people with dementia: a systematic review

    Get PDF
    BACKGROUND: Performance on psychometric tests is key to diagnosis and monitoring treatment of dementia. Results are often reported as a total score, but there is additional information in individual items of tests which vary in their difficulty and discriminatory value. Item difficulty refers to the ability level at which the probability of responding correctly is 50%. Discrimination is an index of how well an item can differentiate between patients of varying levels of severity. Item response theory (IRT) analysis can use this information to examine and refine measures of cognitive functioning. This systematic review aimed to identify all published literature which had applied IRT to instruments assessing global cognitive function in people with dementia. METHODS: A systematic review was carried out across Medline, Embase, PsychInfo and CINAHL articles. Search terms relating to IRT and dementia were combined to find all IRT analyses of global functioning scales of dementia. RESULTS: Of 384 articles identified, four studies met inclusion criteria, including a total of 2,920 people with dementia from six centers in two countries. These studies used three cognitive tests (MMSE, ADAS-Cog, BIMCT) and three IRT methods (Item Characteristic Curve analysis, Samejima’s graded response model, the 2-Parameter Model). Memory items were most difficult. Naming the date in the MMSE and memory items, specifically word recall, of the ADAS-Cog were most discriminatory. CONCLUSIONS: Four published studies were identified which used IRT on global cognitive tests in people with dementia. This technique increased the interpretative power of the cognitive scales, and could be used to provide clinicians with key items from a larger test battery which would have high predictive value. There is a need for further studies using IRT in a wider range of tests involving people with dementia of different etiology and severity.
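    The difficulty and discrimination definitions above correspond to the two-parameter logistic item characteristic curve; the short sketch below shows that the probability of a correct response is exactly 50% at the difficulty level and that discrimination sets the steepness of the curve there (parameter values are illustrative, not estimates from the reviewed studies).

        # 2PL item characteristic curve: P(correct) = 0.5 exactly at theta = b
        # (difficulty); a (discrimination) controls the slope of the curve.
        import math

        def icc(theta, a, b):
            return 1.0 / (1.0 + math.exp(-a * (theta - b)))

        a, b = 1.5, 0.5                      # illustrative discrimination and difficulty
        for theta in (-2.0, -1.0, 0.0, 0.5, 1.0, 2.0):
            print(f"theta={theta:+.1f}  P(correct)={icc(theta, a, b):.2f}")
        print(icc(b, a, b))                  # 0.5 by construction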
