
    Memory Complaint Profiles in Dementia Populations Utilizing the Memory Complaints Inventory

    The Memory Complaints Inventory (MCI) is a self-report questionnaire developed by Paul Green to provide further effort-related evidence in neuropsychiatric practice. It comprises nine subscale scores, in addition to the embedded Plausible and Implausible symptom validity scales. The current study utilized archival MCI scores in dementia populations to determine the presence of, and differences between, genuine memory impairment profiles in separate subgroups of cognitive impairment. The study sample consisted of 244 adults presenting to an outpatient neuropsychology practice for evaluation of memory impairment. The diagnostic categories of the sample were Alzheimer's Disease (n = 21), Vascular Dementia (n = 33), Mild Cognitive Impairment (n = 53), Pseudodementia (n = 88), and Poor Effort (n = 49). All twelve one-way ANOVAs, one for each memory-related subscale of the MCI, the overall MCI score, and the embedded Plausible and Implausible validity scales, indicated significant differences between subgroups. Post-hoc analyses revealed large differences between the dementia categories and the Poor Effort subgroup, providing further evidence for the use of the MCI as a symptom validity measure, given its ability to differentiate between poor effort and genuine neurological impairment. Further support for these findings would yield reliable profiles of genuine memory impairment, offering greater diagnostic and prognostic specificity in general medical practice settings.
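    The analysis design described above (one one-way ANOVA per MCI score, followed by post-hoc comparisons between the five diagnostic subgroups) can be sketched in Python. This is an illustrative reconstruction, not the study's code: the file name, column names, and the choice of Tukey's HSD for the post-hoc step are assumptions made for the sake of the example.

    import pandas as pd
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Hypothetical archival dataset: one row per patient, a "diagnosis" label,
    # and one column per MCI score (nine subscales, overall score, two validity scales).
    df = pd.read_csv("mci_scores.csv")
    groups = ["Alzheimer's Disease", "Vascular Dementia", "Mild Cognitive Impairment",
              "Pseudodementia", "Poor Effort"]
    score_columns = ["overall_mci", "plausible", "implausible"]  # plus the nine memory subscales

    for score in score_columns:
        data = df.dropna(subset=["diagnosis", score])
        # One-way ANOVA: does this score differ across the five diagnostic subgroups?
        samples = [data.loc[data["diagnosis"] == g, score] for g in groups]
        f_stat, p_val = f_oneway(*samples)
        print(f"{score}: F = {f_stat:.2f}, p = {p_val:.4f}")
        # Post-hoc pairwise comparisons to locate which subgroups differ
        print(pairwise_tukeyhsd(data[score], data["diagnosis"], alpha=0.05).summary())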

    Does the delivery matter? Examining randomization at the item level

    Scales that are psychometrically sound, meaning those that meet established standards for reliability and validity when measuring one or more constructs of interest, are customarily evaluated under a set modality (i.e., computer or paper) and administration (fixed item order). Deviating from an established administration profile could result in non-equivalent response patterns, indicating the possible evaluation of a dissimilar construct. Randomizing item administration may alter or eliminate these effects. Therefore, we examined differences in scale relationships between randomized and nonrandomized computer delivery for two scales measuring meaning/purpose in life. These scales include questions about suicidality, depression, and life goals that may cause item reactivity (i.e., a changed response to a later item based on the answer to an earlier item). Results indicated that item randomization does not alter scale psychometrics for meaning-in-life scales, implying that results are comparable even when researchers implement different delivery modalities.
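    The kind of equivalence check described above can be illustrated with a short Python sketch: compare an internal-consistency estimate (Cronbach's alpha) for the same scale delivered in fixed versus randomized item order. This is a minimal example under assumed data (hypothetical file and column names), not the authors' actual analysis.

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for a respondents-by-items matrix of item scores."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
        total_variance = items.sum(axis=1).var(ddof=1)    # variance of total scores
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    # Hypothetical data: one row per respondent, columns item1..itemN,
    # plus a "condition" column marking fixed vs. randomized delivery.
    df = pd.read_csv("meaning_in_life_responses.csv")
    item_cols = [c for c in df.columns if c.startswith("item")]

    for condition in ("fixed", "randomized"):
        subset = df.loc[df["condition"] == condition, item_cols].dropna()
        print(f"{condition} order: alpha = {cronbach_alpha(subset):.3f}")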

    NWChem: Past, present, and future

    Specialized computational chemistry packages have permanently reshaped the landscape of chemical and materials science by providing tools to support and guide experimental efforts and to predict atomistic and electronic properties. In this regard, electronic structure packages have played a special role by using first-principles-driven methodologies to model complex chemical and materials processes. Over the past few decades, the rapid development of computing technologies and the tremendous increase in computational power have offered a unique chance to study complex transformations using sophisticated and predictive many-body techniques that describe correlated behavior of electrons in molecular and condensed-phase systems at different levels of theory. In enabling these simulations, novel parallel algorithms have been able to take advantage of computational resources to address the polynomial scaling of electronic structure methods. In this paper, we briefly review the NWChem computational chemistry suite, including its history, design principles, parallel tools, current capabilities, outreach, and outlook.
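    To make the "polynomial scaling" point concrete, the following back-of-the-envelope Python sketch (not taken from the paper) shows how nominal cost grows when the system size doubles, using textbook formal scalings for a few common electronic structure methods; it is this growth that parallel algorithms in packages such as NWChem are designed to absorb.

    # Formal (textbook) scaling exponents, used purely for illustration.
    nominal_scaling = {
        "Hartree-Fock / hybrid DFT": 4,  # ~O(N^4)
        "CCSD": 6,                       # ~O(N^6)
        "CCSD(T)": 7,                    # ~O(N^7)
    }

    for method, p in nominal_scaling.items():
        growth = 2 ** p  # relative cost after doubling the system size N
        print(f"{method}: doubling N multiplies the cost by ~{growth}x")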
