
    Physical function and associations with diet and exercise: Results of a cross-sectional survey among elders with breast or prostate cancer

    BACKGROUND: Functional decline threatens independent living and is common among individuals diagnosed with cancer, especially those who are elderly. The purpose of this study was to explore whether dietary and exercise practices are associated with physical function status among older cancer survivors. METHODS: Mailed surveys were used to ascertain data on physical function, dietary fat, fruit and vegetable (F&V) consumption, and exercise among elders diagnosed with early-stage (I-II) breast (N = 286) or prostate cancer (N = 402) within the past 18 months. RESULTS: Sixty-one percent of respondents reported diets with <30% of energy from fat, 20.4% reported F&V intakes of 5+ daily servings, and 44.6% reported regular vigorous exercise. Significant, independent associations were found between physical functioning and reported dietary fat intake, F&V consumption, and exercise. A simultaneous multiple regression model controlling for age, race, gender, time since diagnosis, and concurrent health behaviors yielded the following estimates: (1) a 0.2-point increase in the SF-36 physical function subscale (PFS) score with each reported 1% decrease in percent energy from fat (p < .0001); (2) a 0.9-point increase in the SF-36 PFS score for each reported daily serving of F&V (p = .0049); and (3) a 15.4-point increase in the SF-36 PFS score with a positive response for regular vigorous exercise (p < .0001). CONCLUSIONS: Results of this cross-sectional survey suggest that regular vigorous exercise and consumption of diets low in fat and rich in F&Vs are associated with higher levels of physical functioning among older cancer survivors. Interventions that promote healthful lifestyle change may deliver considerable benefit within this ever-increasing and vulnerable population.
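    The reported regression coefficients can be read as a simple linear model. The following sketch (illustrative only, not the authors' analysis code; the function name and profile values are hypothetical) applies the abstract's coefficients to estimate the PFS-score difference between lifestyle profiles:

```python
# Sketch: interpreting the abstract's regression coefficients as a linear
# model for the SF-36 physical function subscale (PFS, 0-100 scale).
# Coefficients are taken from the abstract; everything else is illustrative.

def pfs_difference(fat_pct_decrease, fv_servings, vigorous_exercise):
    """Estimated PFS-score difference relative to a reference profile.

    +0.2 per 1% decrease in energy from fat
    +0.9 per daily serving of fruit and vegetables (F&V)
    +15.4 for a positive response on regular vigorous exercise
    """
    return (0.2 * fat_pct_decrease
            + 0.9 * fv_servings
            + 15.4 * (1 if vigorous_exercise else 0))

# Example: 10% less energy from fat, 5 F&V servings/day, regular exercise
print(pfs_difference(10, 5, True))  # 0.2*10 + 0.9*5 + 15.4 = 21.9
```

    As the example suggests, exercise dominates the model: its single coefficient outweighs plausible dietary differences combined.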

    A New Approach to Analyzing Patterns of Collaboration in Co-authorship Networks - Mesoscopic Analysis and Interpretation

    This paper focuses on methods to study patterns of collaboration in co-authorship networks at the mesoscopic level. We combine qualitative methods (participant interviews) with quantitative methods (network analysis) and demonstrate the application and value of our approach in a case study comparing three research fields in chemistry. A mesoscopic level of analysis means that in addition to the basic analytic unit of the individual researcher as node in a co-author network, we base our analysis on the observed modular structure of co-author networks. We interpret the clustering of authors into groups as bibliometric footprints of the basic collective units of knowledge production in a research specialty. We find two types of co-author-linking patterns between author clusters that we interpret as representing two different forms of cooperative behavior: transfer-type connections due to career migrations or one-off services rendered, and stronger, dedicated inter-group collaboration. Hence the generic co-author network of a research specialty can be understood as the overlay of two distinct types of cooperative networks between groups of authors publishing in a research specialty. We show how our analytic approach exposes field-specific differences in the social organization of research. Comment: An earlier version of the paper was presented at ISSI 2009, 14-17 July, Rio de Janeiro, Brazil. Revised version accepted on 2 April 2010 for publication in Scientometrics. Removed part on node-role connectivity profile analysis after finding error in calculation and deciding to postpone analysis.
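    The distinction between the two link types can be sketched on a toy co-authorship network. The snippet below (a minimal illustration, not the paper's method; author names, cluster assignments, and the weight-1 threshold are all hypothetical) weights edges by the number of joint papers and classifies inter-cluster links accordingly:

```python
# Sketch: classifying links between author clusters in a co-authorship
# network. A single joint paper across clusters is treated as a
# "transfer-type" connection (one-off service or career migration);
# repeated co-authorship marks dedicated inter-group collaboration.
# All data here is toy data; the threshold of one paper is an assumption.

from collections import Counter
from itertools import combinations

papers = [  # each paper = its author list
    ["ana", "ben"], ["ana", "ben", "cid"],    # mostly within cluster A
    ["dora", "eli"], ["dora", "eli", "fay"],  # mostly within cluster B
    ["ben", "dora"],                          # one joint paper across clusters
    ["cid", "fay"], ["cid", "fay"],           # repeated cross-cluster pairing
]
cluster = {"ana": "A", "ben": "A", "cid": "A",
           "dora": "B", "eli": "B", "fay": "B"}

# Co-authorship edge weights = number of joint papers per author pair
weights = Counter(
    frozenset(pair)
    for authors in papers
    for pair in combinations(sorted(authors), 2)
)

for edge, w in sorted(weights.items(), key=lambda kv: sorted(kv[0])):
    a, b = sorted(edge)
    if cluster[a] != cluster[b]:  # inter-cluster link
        kind = "dedicated collaboration" if w > 1 else "transfer-type"
        print(f"{a}-{b}: {w} joint paper(s) -> {kind}")
```

    In practice the cluster assignments would come from community detection on the observed co-author network rather than being given, but the edge-weight bookkeeping is the same.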

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals, and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field.
    Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals.
    Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.
    Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of multiple perspectives and the collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

    Mapping language function with task-based vs. resting-state functional MRI

    BACKGROUND: Use of functional MRI (fMRI) in pre-surgical planning is a non-invasive method for pre-operative functional mapping for patients with brain tumors, especially tumors located near eloquent cortex. Currently, this practice predominantly involves task-based fMRI (T-fMRI). Resting-state fMRI (RS-fMRI) offers an alternative with several methodological advantages. Here, we compare group-level analyses of RS-fMRI vs. T-fMRI as methods for language localization. PURPOSE: To contrast RS-fMRI vs. T-fMRI as techniques for localization of language function. METHODS: We analyzed data obtained in 35 patients who had both T-fMRI and RS-fMRI scans during the course of pre-surgical evaluation. The RS-fMRI data were analyzed using a previously trained resting-state network classifier. The T-fMRI data were analyzed using conventional techniques. Group-level results obtained by both methods were evaluated in terms of two outcome measures: (1) inter-subject variability of response magnitude and (2) sensitivity/specificity analysis of response topography, taking as ground truth previously reported maps of the language system based on intraoperative cortical mapping as well as meta-analytic maps of language task fMRI responses. RESULTS: Both fMRI methods localized major components of the language system (areas of Broca and Wernicke), although not with equal inter-subject consistency. Word-stem completion T-fMRI strongly activated Broca's area but also several task-general areas not specific to language. RS-fMRI provided a more specific representation of the language system. CONCLUSION: We demonstrate several advantages of classifier-based mapping of language representation in the brain. Language T-fMRI activated task-general (i.e., not language-specific) functional systems in addition to areas of Broca and Wernicke. In contrast, classifier-based analysis of RS-fMRI data generated maps confined to language-specific regions of the brain.

    Improving the normalization of complex interventions: measure development based on normalization process theory (NoMAD): study protocol

    Background: Understanding implementation processes is key to ensuring that complex interventions in healthcare are taken up in practice and thus maximize intended benefits for service provision and (ultimately) care to patients. Normalization Process Theory (NPT) provides a framework for understanding how a new intervention becomes part of normal practice. This study aims to develop and validate simple generic tools derived from NPT, to be used to improve the implementation of complex healthcare interventions.
    Objectives: The objectives of this study are to: develop a set of NPT-based measures and formatively evaluate their use for identifying implementation problems and monitoring progress; conduct preliminary evaluation of these measures across a range of interventions and contexts, and identify factors that affect this process; explore the utility of these measures for predicting outcomes; and develop an online users' manual for the measures.
    Methods: A combination of qualitative (workshops, item development, user feedback, cognitive interviews) and quantitative (survey) methods will be used to develop NPT measures, and test the utility of the measures in six healthcare intervention settings.
    Discussion: The measures developed in the study will be available for use by those involved in planning, implementing, and evaluating complex interventions in healthcare and have the potential to enhance the chances of their implementation, leading to sustained changes in working practices.

    Individualization as driving force of clustering phenomena in humans

    One of the most intriguing dynamics in biological systems is the emergence of clustering, the self-organization into separated agglomerations of individuals. Several theories have been developed to explain clustering in, for instance, multi-cellular organisms, ant colonies, bee hives, flocks of birds, schools of fish, and animal herds. A persistent puzzle, however, is clustering of opinions in human populations. The puzzle is particularly pressing if opinions vary continuously, such as the degree to which citizens are in favor of or against a vaccination program. Existing opinion formation models suggest that "monoculture" is unavoidable in the long run, unless subsets of the population are perfectly separated from each other. Yet, social diversity is a robust empirical phenomenon, although perfect separation is hardly possible in an increasingly connected world. Considering randomness has not overcome these theoretical shortcomings so far. Small perturbations of individual opinions trigger social influence cascades that inevitably lead to monoculture, while larger noise disrupts opinion clusters and results in rampant individualism without any social structure. Our solution of the puzzle builds on recent empirical research, combining the integrative tendencies of social influence with the disintegrative effects of individualization. A key element of the new computational model is an adaptive kind of noise. We conduct simulation experiments to demonstrate that with this kind of noise, a third phase besides individualism and monoculture becomes possible, characterized by the formation of metastable clusters with diversity between and consensus within clusters. When clusters are small, individualization tendencies are too weak to prohibit a fusion of clusters. When clusters grow too large, however, individualization increases in strength, which promotes their splitting. Comment: 12 pages, 4 figures.
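    The adaptive-noise idea can be sketched in a few lines of simulation. The snippet below is an illustrative toy, not the authors' model: the update rule, parameter values, and the choice to scale noise with local cluster size are all assumptions made for demonstration.

```python
# Toy sketch of adaptive-noise opinion dynamics: agents on a continuous
# [0, 1] opinion scale are pulled toward the mean of nearby opinions
# (social influence), while noise strength grows with local crowding
# (individualization). Parameters and update rule are illustrative only.

import random

random.seed(1)
N, EPS, MU, A = 50, 0.2, 0.3, 0.02  # agents, confidence bound, pull, noise gain
opinions = [random.random() for _ in range(N)]

for step in range(2000):
    i = random.randrange(N)
    # Social influence: consider only opinions within the confidence bound
    neighbors = [o for o in opinions if abs(o - opinions[i]) < EPS]
    local_mean = sum(neighbors) / len(neighbors)  # includes agent i itself
    # Individualization: noise grows with local cluster size, so large
    # clusters tend to split while small ones can still merge
    noise = random.gauss(0, A * len(neighbors) / N)
    opinions[i] += MU * (local_mean - opinions[i]) + noise
    opinions[i] = min(1.0, max(0.0, opinions[i]))  # keep on the opinion scale

print(sorted(round(o, 2) for o in opinions))
```

    With fixed (non-adaptive) noise, such models drift toward either monoculture or scattered individualism; making the noise scale with local agreement is what the abstract credits with stabilizing a middle phase of distinct, internally coherent clusters.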

    Gold nanoparticle-enhanced X-ray microtomography of the rodent reveals region-specific cerebrospinal fluid circulation in the brain

    Cerebrospinal fluid (CSF) is essential for the development and function of the central nervous system (CNS). However, the brain and its interstitium have largely been thought of as a single entity through which CSF circulates, and it is not known whether specific cell populations within the CNS preferentially interact with the CSF. Here, we develop a technique for CSF tracking, gold nanoparticle-enhanced X-ray microtomography, to achieve micrometer-scale resolution visualization of CSF circulation patterns during development. Using this method and subsequent histological analysis in rodents, we identify previously uncharacterized CSF pathways from the subarachnoid space (particularly the basal cisterns) that mediate CSF-parenchymal interactions involving 24 functional-anatomic cell groupings in the brain and spinal cord. CSF distribution to these areas is largely restricted to early development and is altered in posthemorrhagic hydrocephalus. Our study also presents particle size-dependent CSF circulation patterns through the CNS, including interaction between neurons and small CSF tracers, but not large CSF tracers. These findings have implications for understanding the biological basis of normal brain development and the pathogenesis of a broad range of disease states, including hydrocephalus.