Cost and Scalability Analysis of Porcine Islet Isolation for Islet Transplantation: Comparison of Juvenile, Neonatal and Adult Pigs.
The limited availability of human islets has led to the examination of porcine islets as a source of clinically suitable tissue for transplantation in patients with diabetes mellitus. Islets from porcine donors are commonly used in both in vitro and in vivo experiments studying diabetes mellitus. However, there are significant differences in the quality and quantity of islet yield depending on donor pig age, as well as substantial differences in the costs of pancreas procurement in adult versus neonatal and juvenile pigs. In this study, we compared the total cost per islet of juvenile pig pancreata with that of neonatal and adult pigs. Although adult porcine pancreata yield, on average, more than five times as many islets as juvenile and neonatal pancreata, we found that the high price of adult pigs made the cost per adult islet more than twice that of juvenile and neonatal islets (US $0.04 and $0.02, respectively). In addition, neonatal and juvenile islets are advantageous in their scalability and retention of viability after culture. Our findings indicate that isolating neonatal and juvenile porcine islets is more cost-effective and scalable than isolating adult porcine islets.
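The comparison above reduces to a simple ratio: procurement cost divided by islet yield. A minimal sketch, with placeholder costs and yields that are NOT the study's actual figures:

```python
# Hypothetical illustration of the cost-per-islet comparison described above.
# Procurement costs and islet yields below are made-up placeholders, not the
# study's data; only the $/islet formula reflects the abstract.

def cost_per_islet(procurement_cost_usd: float, islet_yield: int) -> float:
    """Total cost of obtaining a pancreas divided by the islets isolated from it."""
    return procurement_cost_usd / islet_yield

# An adult pancreas may yield ~5x more islets than a neonatal one, yet a much
# higher animal price can still raise the cost per islet (placeholder numbers):
adult = cost_per_islet(procurement_cost_usd=500.0, islet_yield=10_000)
neonatal = cost_per_islet(procurement_cost_usd=40.0, islet_yield=2_000)

print(f"adult: ${adult:.3f}/islet, neonatal: ${neonatal:.3f}/islet")
```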
User-initialized active contour segmentation and golden-angle real-time cardiovascular magnetic resonance enable accurate assessment of LV function in patients with sinus rhythm and arrhythmias.
Background: Data obtained during arrhythmia are retained in real-time cardiovascular magnetic resonance (rt-CMR), but there is limited and inconsistent evidence that rt-CMR can accurately assess beat-to-beat variation in left ventricular (LV) function during an arrhythmia. Methods: Multi-slice, short-axis cine and real-time golden-angle radial CMR data were collected in 22 clinical patients (18 in sinus rhythm and 4 with arrhythmia). A user-initialized active contour segmentation (ACS) software package was validated via comparison to manual segmentation on clinically accepted software. For each image in the 2D acquisitions, slice volume was calculated, and global LV volumes were estimated via summation across the LV using multiple slices. Real-time imaging data were reconstructed using different image exposure times and frame rates to evaluate the effect of temporal resolution on measured function in each slice via ACS. Finally, global volumetric function of ectopic and non-ectopic beats was measured using ACS in patients with arrhythmias. Results: ACS provides global LV volume measurements that are not significantly different from manual quantification of retrospectively gated cine images in sinus rhythm patients. With an exposure time of 95.2 ms and a frame rate of >89 frames per second, golden-angle real-time imaging accurately captures hemodynamic function over a range of patient heart rates. In four patients with frequent ectopic contractions, initial quantification of the impact of ectopic beats on hemodynamic function was demonstrated. Conclusion: User-initialized active contours and golden-angle real-time radial CMR can be used to determine time-varying LV function in patients. These methods will be very useful for the assessment of LV function in patients with frequent arrhythmias.
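The global LV volume estimate described in the Methods, per-slice volumes summed across the stack, can be sketched as follows. The function name, slice areas, and slice thickness are hypothetical, not taken from the paper:

```python
# Sketch of the slice-summation volume estimate implied above: each short-axis
# slice contributes (segmented contour area x slice thickness), and the global
# LV volume is the sum over slices. Areas and thickness are hypothetical.

def lv_volume_ml(slice_areas_mm2, slice_thickness_mm):
    """Sum per-slice volumes (mm^3) across the LV and convert to millilitres."""
    volume_mm3 = sum(area * slice_thickness_mm for area in slice_areas_mm2)
    return volume_mm3 / 1000.0  # 1 mL = 1000 mm^3

# Hypothetical segmented areas (mm^2) for 8 slices at end-diastole, 8 mm thick:
areas = [1400, 1650, 1750, 1700, 1550, 1300, 950, 500]
print(f"estimated LV volume: {lv_volume_ml(areas, 8.0):.1f} mL")
```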
The effect of offering different numbers of colorectal cancer screening test options in a decision aid: a pilot randomized trial
BACKGROUND: Decision aids can improve decision making processes, but the amount and type of information that they should attempt to communicate is controversial. We sought to compare, in a pilot randomized trial, two colorectal cancer (CRC) screening decision aids that differed in the number of screening options presented. METHODS: Adults ages 48–75 not currently up to date with screening were recruited from the community and randomized to view one of two versions of our previously tested CRC screening decision aid. The first version included five screening options: fecal occult blood test (FOBT), sigmoidoscopy, a combination of FOBT and sigmoidoscopy, colonoscopy, and barium enema. The second discussed only the two most frequently selected screening options, FOBT and colonoscopy. Main outcomes were differences in screening interest and test preferences between groups after decision aid viewing. Patient test preference was elicited first without any associated out-of-pocket costs (OPC), and then with the following costs: FOBT, $50; barium enema, $200. RESULTS: 62 adults participated: 25 viewed the 5-option decision aid, and 37 viewed the 2-option version. Mean age was 54 (range 48–72); 58% were women, 71% were White, and 24% were African-American; 58% had completed at least a 4-year college degree. Comparing participants who viewed the 5-option version with those who viewed the 2-option version, there were no differences in screening interest after viewing (1.8 vs. 1.9, t-test p = 0.76). Those viewing the 2-option version were somewhat more likely to choose colonoscopy than those viewing the 5-option version when no out-of-pocket costs were assumed (68% vs. 46%, p = 0.11), but not when such costs were imposed (41% vs. 42%, p = 1.00). CONCLUSION: The number of screening options available does not appear to have a large effect on interest in colorectal cancer screening. Offering differing numbers of options may, however, affect test choice when out-of-pocket costs are not considered.
Lymphatic Filariasis Control in Tanzania: Effect of Six Rounds of Mass Drug Administration with Ivermectin and Albendazole on Infection and Transmission.
Control of lymphatic filariasis (LF) in most countries of sub-Saharan Africa is based on annual mass drug administration (MDA) with a combination of ivermectin and albendazole, in order to interrupt transmission. We present findings from a detailed study on the effect of six rounds of MDA with this drug combination as implemented by the National Lymphatic Filariasis Elimination Programme (NLFEP) in a highly endemic rural area of north-eastern Tanzania.
The effect of treatment on transmission and human infection was monitored in a community- and a school-based study during an 8-year period (one pre-intervention and 7 post-intervention years) from 2003 to 2011. Before intervention, 24.5% of the community population had microfilariae (mf) in the blood, 53.3% had circulating filarial antigens (CFA) and 78.9% had specific antibodies to the recombinant filarial antigen Bm14. One year after the sixth MDA, these values had decreased considerably, to 2.7%, 19.6% and 27.5%, respectively. During the same period, the CFA prevalence among new intakes of Standard 1 pupils in 10 primary schools decreased from 25.2% to 5.6%. In line with this, transmission by the three vectors (Anopheles gambiae, An. funestus and Culex quinquefasciatus), as determined by dissection, declined sharply (the overall vector infectivity rate fell by 99.3% and the mean monthly transmission potential by 99.2% between the pre-intervention and fifth post-intervention periods). A major shift in vector species composition, from predominantly anopheline to almost exclusively culicine, was observed over the years. This may be largely unrelated to the MDAs but may have important implications for the epidemiology of LF in the area. Six MDAs caused a considerable decrease in all the measured indices for transmission and human infection. In spite of this, indices were still relatively high in the late period of the study, and it may take a long time to reach the recommended cut-off levels for interruption of transmission unless extra efforts are made. These should include increased engagement of the target population in the control activities, to ensure higher treatment coverage. It is expected that the recent initiative to distribute insecticide-impregnated bed nets to every household in the area will also contribute towards reaching the goal of successful LF elimination.
Training emergency services’ dispatchers to recognise stroke: an interrupted time-series analysis
Background: Stroke is a time-dependent medical emergency in which early presentation to specialist care reduces death and dependency. Up to 70% of all stroke patients obtain first medical contact from the Emergency Medical Services (EMS). Identifying ‘true stroke’ from an EMS call is challenging, with over 50% of strokes being misclassified.
The aim of this study was to evaluate the impact of a training package on the recognition of stroke by Emergency Medical Dispatchers (EMDs).
Methods: This study took place in an ambulance service and a hospital in England using an interrupted time-series design. Suspected stroke patients were identified in one-week blocks, every three weeks, over an 18-month period during which the training was implemented. Patients were included if they had a diagnosis of stroke (EMS or hospital). The effect of the intervention on the accuracy of dispatch diagnosis was investigated using binomial (grouped) logistic regression.
Results: In the Pre-implementation period, EMDs correctly identified 63% of stroke patients; this increased to 80% Post-implementation. This change was significant (p=0.003), reflecting an improvement in identifying stroke patients relative to the Pre-implementation period in both the During-implementation (OR=4.10 [95% CI 1.58 to 10.66]) and Post-implementation (OR=2.30 [95% CI 1.07 to 4.92]) periods. For patients with a final diagnosis of stroke who had been dispatched as stroke, there was a marginally non-significant 2.8-minute reduction (95% CI −0.2 to 5.9 minutes, p=0.068) between the Pre- and Post-implementation periods in the time from call to arrival of the ambulance at scene.
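As a rough illustration of the odds-ratio results reported above, the sketch below computes an OR with a Wald 95% confidence interval from a 2x2 table. The counts are invented for illustration and are NOT the study's data; the study itself used binomial grouped logistic regression, which this simplifies:

```python
# Minimal sketch: odds ratio and Wald 95% CI from a 2x2 table of
# correctly-dispatched vs missed strokes, post- vs pre-training.
# Counts are hypothetical placeholders, not the study's data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI; a/b = correct/missed post-training, c/d = pre-training."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Wald SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical: 80/100 strokes dispatched correctly post-training vs 63/100 pre.
or_, lo, hi = odds_ratio_ci(a=80, b=20, c=63, d=37)
print(f"OR={or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```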
Conclusions: This is the first study to develop, implement and evaluate the impact of a training package for EMDs with the aim of improving the recognition of stroke. Training led to a significant increase in the proportion of stroke patients dispatched as such by EMDs; a small reduction in time from call to arrival at scene by the ambulance also appeared likely. The training package has been endorsed by the UK Stroke Forum Education and Training, and is free to access online.
Vascular responses of the extremities to transdermal application of vasoactive agents in Caucasian and African descent individuals
This is an accepted manuscript of an article published by Springer in European Journal of Applied Physiology on 04/04/2015, available online: https://doi.org/10.1007/s00421-015-3164-2
Purpose: Individuals of African descent (AFD) are more susceptible to non-freezing cold injury than Caucasians (CAU), which may be due, in part, to differences in the control of skin blood flow. We investigated the skin blood flow responses to transdermal application of vasoactive agents. Methods: Twenty-four young males (12 CAU and 12 AFD) undertook three tests in which iontophoresis was used to apply acetylcholine (ACh 1 w/v %), sodium nitroprusside (SNP 0.01 w/v %) and noradrenaline (NA 0.5 mM) to the skin. The skin sites tested were: volar forearm, non-glabrous finger and toe, and glabrous finger (pad) and toe (pad). Results: In response to SNP on the forearm, AFD had less vasodilatation for a given current application than CAU (P = 0.027–0.004). ACh evoked less vasodilatation in AFD for a given application current in the non-glabrous finger and toe compared with CAU (P = 0.043–0.014), with a lower maximum vasodilatation in the non-glabrous finger (median [interquartile], AFD n = 11, 41[234] %, CAU n = 12, 351[451] %, P = 0.011) and non-glabrous toe (median [interquartile], AFD n = 9, 116[318] %, CAU n = 12, 484[720] %, P = 0.018). ACh and SNP did not elicit vasodilatation in the glabrous skin sites of either group. There were no ethnic differences in response to NA. Conclusion: AFD have an attenuated endothelium-dependent vasodilatation in non-glabrous sites of the fingers and toes compared with CAU. This may contribute to lower skin temperature following cold exposure and the increased risk of cold injuries experienced by AFD.
Change in Composition of the Anopheles gambiae Complex and its Possible Implications for the Transmission of Malaria and Lymphatic Filariasis in North-Eastern Tanzania.
A dramatic decline in the incidence of malaria due to Plasmodium falciparum infection in coastal East Africa has recently been reported to be paralleled (or even preceded) by an equally dramatic decline in malaria vector density, despite the absence of organized vector control. As part of investigations into possible causes for the change in vector population density, the present study analysed the Anopheles gambiae s.l. sibling species composition in north-eastern Tanzania. The study was in two parts. The first compared current species complex composition in freshly caught An. gambiae s.l. specimens from three villages to the composition reported from previous studies carried out 2–4 decades ago in the same villages. The second took advantage of a sample of archived dried An. gambiae s.l. specimens collected regularly from a fourth study village since 2005. Both fresh and archived dried specimens were identified to sibling species of the An. gambiae s.l. complex by PCR. The same specimens were moreover examined for Plasmodium falciparum and Wuchereria bancrofti infection by PCR. As in earlier studies, An. gambiae s.s., Anopheles merus and Anopheles arabiensis were identified as the sibling species found in the area. However, both study parts indicated a marked change in sibling species composition over time. From being by far the most abundant in the past, An. gambiae s.s. was now the rarest, whereas An. arabiensis had changed from being the rarest to the most common. P. falciparum infection was rarely detected in the examined specimens (and only in An. arabiensis), whereas W. bancrofti infection was prevalent and detected in all three sibling species. The study indicates that a major shift in An. gambiae s.l. sibling species composition has taken place in the study area in recent years. Combined with the earlier reported decline in overall malaria vector density, the study suggests that this decline has been most marked for An. gambiae s.s., and least for An. arabiensis, leading to the current predominance of the latter. Due to differences in the biology and vectorial capacity of the An. gambiae s.l. complex, the change in sibling species composition will have important implications for the epidemiology and control of malaria and lymphatic filariasis in the study area.
Apraxia and motor dysfunction in corticobasal syndrome
Background: Corticobasal syndrome (CBS) is characterized by multifaceted motor system dysfunction and cognitive disturbance; distinctive clinical features include limb apraxia and visuospatial dysfunction. Transcranial magnetic stimulation (TMS) has been used to study motor system dysfunction in CBS, but the relationship of TMS parameters to clinical features has not been studied. The present study explored several hypotheses: firstly, that limb apraxia may be partly due to visuospatial impairment in CBS; secondly, that motor system dysfunction can be demonstrated in CBS using threshold-tracking TMS and is linked to limb apraxia; finally, that atrophy of the primary motor cortex, studied using voxel-based morphometry (VBM) analysis, is associated with motor system dysfunction and limb apraxia in CBS. Methods: Imitation of meaningful and meaningless hand gestures was graded to assess limb apraxia, while cognitive performance was assessed using the Addenbrooke's Cognitive Examination – Revised (ACE-R), with particular emphasis placed on the visuospatial subtask. Patients underwent TMS, to assess cortical function, and VBM. Results: In total, 17 patients with CBS (7 male, 10 female; mean age 64.4 ± 6.6 years) were studied and compared to 17 matched control subjects. Of the CBS patients, 23.5% had a relatively inexcitable motor cortex, with evidence of cortical dysfunction in the remaining 76.5% of patients. Reduced resting motor threshold and visuospatial performance correlated with limb apraxia. Patients with a resting motor threshold <50% performed significantly worse on the visuospatial sub-task of the ACE-R than other CBS patients. Cortical function correlated with atrophy of the primary and pre-motor cortices and the thalamus, while apraxia correlated with atrophy of the pre-motor and parietal cortices.
Conclusions: Cortical dysfunction appears to underlie the core clinical features of CBS, and is associated with atrophy of the primary motor and pre-motor cortices, as well as the thalamus, while apraxia correlates with pre-motor and parietal atrophy.
Evaluation of the current knowledge limitations in breast cancer research: a gap analysis
BACKGROUND
A gap analysis was conducted to determine which areas of breast cancer research, if targeted by researchers and funding bodies, could produce the greatest impact on patients.
METHODS
Fifty-six Breast Cancer Campaign grant holders and prominent UK breast cancer researchers participated in a gap analysis of current breast cancer research. Before, during and following the meeting, groups in seven key research areas participated in cycles of presentation, literature review and discussion. Summary papers were prepared by each group and collated into this position paper highlighting the research gaps, with recommendations for action.
RESULTS
Gaps were identified in all seven themes. General barriers to progress were lack of financial and practical resources, and poor collaboration between disciplines. Critical gaps in each theme included: (1) genetics (knowledge of genetic changes, their effects and interactions); (2) initiation of breast cancer (how developmental signalling pathways cause ductal elongation and branching at the cellular level and influence stem cell dynamics, and how their disruption initiates tumour formation); (3) progression of breast cancer (deciphering the intracellular and extracellular regulators of early progression, tumour growth, angiogenesis and metastasis); (4) therapies and targets (understanding who develops advanced disease); (5) disease markers (incorporating intelligent trial design into all studies to ensure new treatments are tested in patient groups stratified using biomarkers); (6) prevention (strategies to prevent oestrogen-receptor negative tumours and the long-term effects of chemoprevention for oestrogen-receptor positive tumours); (7) psychosocial aspects of cancer (the use of appropriate psychosocial interventions, and the personal impact of all stages of the disease among patients from a range of ethnic and demographic backgrounds).
CONCLUSION
Through recommendations to address these gaps with future research, the long-term benefits to patients will include: better estimation of risk in families with breast cancer and strategies to reduce risk; better prediction of drug response and patient prognosis; improved tailoring of treatments to patient subgroups and development of new therapeutic approaches; earlier initiation of treatment; more effective use of resources for screening populations; and an enhanced experience for people with or at risk of breast cancer and their families. The challenge to funding bodies and researchers in all disciplines is to focus on these gaps and to drive advances in knowledge into improvements in patient care.
Self-reported losses versus actual losses in online gambling: an empirical study
Many research findings in the gambling studies field rely on self-report data. A very small body of empirical research also suggests that, when using self-report, players report their gambling losses inaccurately. The aim of the present study was to evaluate the differences between objective and subjective gambling expenditure data by comparing gamblers' actual behavioral tracking data with their self-report data over a 1-month period. A total of 17,742 Norwegian online gamblers were asked to participate in an online survey. Of those surveyed, 1,335 gamblers answered questions relating to gambling expenditure that could be compared with their actual gambling behavior. The study found that the estimated loss self-reported by gamblers was correlated with the actual objective loss, and that players with higher losses tended to have more difficulty estimating their gambling expenditure (i.e., players who spent more money gambling also appeared to have more trouble estimating their expenses accurately). Overall, the findings demonstrate that caution is warranted when using self-report data relating to the amount of money spent gambling in any studies that are totally reliant on self-report data.
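The objective-versus-subjective comparison described above can be illustrated with a simple correlation sketch. The data points and the function are hypothetical placeholders, not the study's Norwegian player data:

```python
# Sketch: correlate self-reported losses with tracked (actual) losses, and
# inspect how the estimation error changes with the size of the actual loss.
# All figures below are invented for illustration.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

actual = [50, 120, 200, 450, 900, 1500]   # tracked monthly losses (hypothetical)
reported = [60, 100, 150, 300, 500, 700]  # self-reported losses (hypothetical)

print(f"r = {pearson_r(actual, reported):.2f}")
# Absolute estimation error per player, to see whether it grows with loss size:
errors = [abs(a - r) for a, r in zip(actual, reported)]
print(errors)
```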
