11 research outputs found
Memory Complaint Profiles in Dementia Populations Utilizing the Memory Complaints Inventory
The Memory Complaints Inventory (MCI) is a self-report questionnaire developed by Paul Green to provide further effort-related evidence in neuropsychiatric practice. It comprises nine subscale scores, in addition to the embedded Plausible and Implausible symptom validity scales. The current study used archival MCI scores in dementia populations to determine the presence of, and differences between, genuine memory impairment profiles in separate subgroups of cognitive impairment. The study sample consisted of 244 adults presenting to an outpatient neuropsychology practice for evaluation of memory impairment. The diagnostic categories of the sample were Alzheimer's Disease (n = 21), Vascular Dementia (n = 33), Mild Cognitive Impairment (n = 53), Pseudodementia (n = 88), and Poor Effort (n = 49). Results indicated significant differences in all twelve one-way ANOVAs, representing differences between subgroups on each memory-related subscale of the MCI, the overall MCI score, and the embedded Plausible and Implausible validity scales. Post-hoc analyses revealed large differences between the dementia categories and the Poor Effort subgroup, providing further evidence for the use of the MCI as a symptom validity measure, given its ability to differentiate between poor effort and genuine neurological impairment. Further support for the study's findings would yield reliable genuine memory impairment profiles, providing additional diagnostic and prognostic specificity in general medical practice settings.
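The analysis described above (one-way ANOVAs across diagnostic subgroups, followed by post-hoc comparisons) can be sketched in a few lines. This is a minimal illustration with simulated scores, not the study's data; the group means, standard deviations, and the use of Tukey's HSD as the post-hoc test are assumptions for the example.

```python
import numpy as np
from scipy import stats

# Simulated subscale scores for three of the study's diagnostic
# subgroups (illustrative values only, not the study's data).
rng = np.random.default_rng(0)
alzheimers = rng.normal(60, 10, 21)   # n = 21 in the study
vascular = rng.normal(58, 10, 33)     # n = 33
poor_effort = rng.normal(80, 10, 49)  # n = 49

# One-way ANOVA across subgroups for a single MCI subscale.
f_stat, p_value = stats.f_oneway(alzheimers, vascular, poor_effort)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")

# Post-hoc pairwise comparisons between subgroups (Tukey's HSD).
posthoc = stats.tukey_hsd(alzheimers, vascular, poor_effort)
print(posthoc)
```

In the study's design, this procedure would be repeated for each of the twelve dependent variables (the nine subscales, the overall score, and the two validity scales).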
Does the delivery matter? Examining randomization at the item level
Scales that are psychometrically sound (i.e., those meeting established standards for reliability and validity when measuring one or more constructs of interest) are customarily evaluated with a set modality (i.e., computer or paper) and administration (fixed item order). Deviating from an established administration profile could produce non-equivalent response patterns, indicating that a dissimilar construct may be evaluated. Randomizing item administration may alter or eliminate these effects. We therefore examined differences in scale relationships between randomized and nonrandomized computer delivery for two scales measuring meaning/purpose in life. These scales include questions about suicidality, depression, and life goals that may cause item reactivity (i.e., a changed response to an item based on the answer to a preceding item). Results indicated that item randomization does not alter scale psychometrics for meaning-in-life scales, which implies that results are comparable even when researchers implement different delivery modalities.
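The fixed versus randomized administration contrast examined above can be sketched as a per-participant item-ordering step. This is a hypothetical illustration; the item pool, function name, and seeding scheme are assumptions, not part of the study's materials.

```python
import random

# Hypothetical item pool for a meaning-in-life scale (illustrative).
ITEMS = [f"item_{i}" for i in range(1, 11)]

def administration_order(items, randomize, seed=None):
    """Return the item presentation order for one participant.

    Fixed-item administration presents items in the published order;
    randomized administration shuffles them per participant, which may
    reduce item-reactivity effects (one answer influencing the next).
    """
    order = list(items)
    if randomize:
        random.Random(seed).shuffle(order)
    return order

fixed = administration_order(ITEMS, randomize=False)
randomized = administration_order(ITEMS, randomize=True, seed=42)
```

Seeding per participant (e.g., by participant ID) keeps each randomized order reproducible for later analysis while still varying order across participants.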
Recommended from our members
NWChem: Past, present, and future
Specialized computational chemistry packages have permanently reshaped the landscape of chemical and materials science by providing tools to support and guide experimental efforts and to predict atomistic and electronic properties. In this regard, electronic structure packages have played a special role by using first-principle-driven methodologies to model complex chemical and materials processes. Over the past few decades, the rapid development of computing technologies and the tremendous increase in computational power have offered a unique chance to study complex transformations using sophisticated and predictive many-body techniques that describe correlated behavior of electrons in molecular and condensed phase systems at different levels of theory. In enabling these simulations, novel parallel algorithms have been able to take advantage of computational resources to address the polynomial scaling of electronic structure methods. In this paper, we briefly review the NWChem computational chemistry suite, including its history, design principles, parallel tools, current capabilities, outreach, and outlook.
Cell-penetrating peptides in nanodelivery of nucleic acids and drugs
The hydrophobic nature of cell membranes is one of the major obstacles in the therapeutic delivery of nucleic acids and drug-loaded nanoparticles. Cell-penetrating peptides (CPPs) have the ability to cross biological membranes and enter cells. Due to this intrinsic property, CPPs are employed as vectors for intracellular delivery of nucleic acids and nanoparticles. In this chapter, we first briefly describe the classification and uptake mechanisms of CPPs. Then, we describe recent therapeutic applications of CPP-modified nanoparticles as drug carriers. In this context, we give an overview of covalent and noncovalent conjugation of CPPs. The second part involves the use of CPPs in nonviral delivery of nucleic acids. Although viral vectors are highly efficient systems for introducing genes, the safety issues with viral systems need to be considered. Nanoparticle-based nonviral vectors provide an attractive alternative, but their gene transfection efficiency is very low. Therefore, novel design strategies are needed to enhance this efficiency. We summarize the use of CPPs in enhancing the gene transfer efficiency of nonviral vectors. Besides the clinical potential of currently known CPPs, we also discuss the limitations and the need to design novel CPPs. © 2018 Elsevier Inc. All rights reserved.