A geochemical study of the winonaites: Evidence for limited partial melting and constraints on the precursor composition
The winonaites are primitive achondrites that are associated with the IAB iron meteorites. Textural evidence implies heating to at least the Fe, Ni–FeS cotectic, but previous geochemical studies are ambiguous about the extent of silicate melting in these samples. Oxygen isotope evidence indicates that the precursor material may be related to the carbonaceous chondrites. Here we analysed a suite of winonaites for modal mineralogy and bulk major- and trace-element chemistry in order to assess the extent of thermal processing, as well as to constrain the precursor composition of the winonaite-IAB parent asteroid.
Modal mineralogy and geochemical data are presented for eight winonaites. Textural analysis reveals that, for our sub-set of samples, all except the most primitive winonaite (Northwest Africa 1463) reached the Fe, Ni–FeS cotectic. However, only one (Tierra Blanca) shows geochemical evidence for silicate melting processes. Tierra Blanca is interpreted as a residue of small-degree silicate melting. Our sample of Winona shows geochemical evidence for extensive terrestrial weathering. All other winonaites studied here (Fortuna, Queen Alexandra Range 94535, Hammadah al Hamra 193, Pontlyfni and NWA 1463) have chondritic major-element ratios and flat CI-normalised bulk rare-earth element patterns, suggesting that most of the winonaites did not reach the silicate melting temperature. The majority of winonaites were therefore heated to a narrow temperature range, between ∼1220 K (the Fe, Ni–FeS cotectic temperature) and ∼1370 K (the basaltic partial melting temperature). Silicate inclusions in the IAB irons demonstrate that partial melting did occur in some parts of the parent body (Ruzicka and Hutson, 2010), thereby implying heterogeneous heat distribution within this asteroid. Together, these observations indicate that melting was the result of internal heating by short-lived radionuclides. The brecciated nature of the winonaites suggests that the parent body was later disrupted by a catastrophic impact, which allowed the preservation of the largely unmelted winonaites.
Despite major-element similarities to both ordinary and enstatite chondrites, trace-element analysis suggests the winonaite parent body had a carbonaceous chondrite-like precursor composition. The parent body of the winonaites was volatile-depleted relative to CI, but enriched compared with the other carbonaceous classes. The closest match is the CM chondrites; however, the specific precursor is not sampled in current meteorite collections.
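The CI-normalised rare-earth element (REE) patterns mentioned above follow a simple calculation: each measured concentration is divided by the corresponding CI-chondrite reference value, and a "flat" pattern means all ratios are roughly equal. The sketch below illustrates this logic only; the CI reference values are approximate figures from common compilations, and the sample concentrations are invented, not data from this study.

```python
# Approximate CI-chondrite REE abundances in ppm (illustrative values
# from common compilations; not taken from this study).
CI_PPM = {"La": 0.237, "Ce": 0.613, "Nd": 0.457, "Sm": 0.148,
          "Eu": 0.0563, "Gd": 0.199, "Yb": 0.161, "Lu": 0.0246}

def ci_normalise(sample_ppm):
    """Divide each measured REE concentration by its CI reference value.

    A 'flat' pattern (all ratios roughly equal) indicates a chondritic,
    unfractionated REE budget, as reported for most winonaites.
    """
    return {el: sample_ppm[el] / CI_PPM[el] for el in sample_ppm}

# Invented sample with chondritic relative abundances at ~1.1x CI
sample = {el: 1.1 * v for el, v in CI_PPM.items()}
pattern = ci_normalise(sample)
# Every ratio comes out ~1.1, i.e. a flat CI-normalised pattern
print(pattern)
```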
Quantifying tumour-infiltrating lymphocyte subsets: a practical immuno-histochemical method
Background: Efficient histological quantification of tumour-infiltrating T and B lymphocyte (TIL) subsets in archival tissues would greatly facilitate investigations of the role of TIL in human cancer biology. We sought to develop such a method. Methods: Ten ×40 digital images of 4 μm sections of 16 ductal invasive breast carcinomas immunostained for CD3, CD4, CD8, and CD20 were acquired (a total of 640 images). The number of pixels in each image matching a partition of Lab colour space corresponding to immunostained cells was counted using the ‘Color range’ and ‘Histogram’ tools in Adobe Photoshop 7. These pixel counts were converted to cell counts per mm² using a calibration factor derived from one, two, three or all 10 images of each case/antibody combination. Results: Variation in the number of labelled pixels per immunostained cell made individual calibration for each case/antibody combination necessary. Calibration based on the two fields containing the most labelled pixels gave a cell count minimally higher (+5.3%) than the count based on 10-field calibration, with 95% confidence limits of −14.7% to +25.3%. As TIL density could vary up to 100-fold between cases, this accuracy and precision are acceptable. Conclusion: The methodology described offers sufficient accuracy, precision and efficiency to quantify the density of TIL sub-populations in breast cancer using commonly available software, and could be adapted to batch processing of image files.
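The calibration step described above reduces to converting labelled-pixel counts into cell densities via a pixels-per-cell factor derived from a few manually counted fields. The sketch below illustrates that arithmetic only; it is not the authors' Photoshop workflow, and all numbers (pixel counts, cell counts, field area) are invented for demonstration.

```python
def pixels_per_cell(calib_pixel_counts, calib_cell_counts):
    # Calibration factor: mean number of labelled pixels per
    # manually counted immunostained cell, pooled across fields
    return sum(calib_pixel_counts) / sum(calib_cell_counts)

def cells_per_mm2(pixel_count, px_per_cell, field_area_mm2):
    # Estimated cell count in the field, expressed per mm^2 of tissue
    return (pixel_count / px_per_cell) / field_area_mm2

# Two hypothetical calibration fields: 12,000 and 9,000 labelled pixels
# for 80 and 60 manually counted cells respectively
factor = pixels_per_cell([12000, 9000], [80, 60])      # 150 px/cell
density = cells_per_mm2(30000, factor, field_area_mm2=0.05)
print(round(density))  # 30000/150 = 200 cells over 0.05 mm^2 -> 4000
```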
Retinal and choroidal thickness in early age-related macular degeneration
Purpose: To compare retinal thickness and choroidal thickness at increasing retinal eccentricity in individuals with early age-related macular degeneration (AMD) and in healthy controls using enhanced choroidal penetration, 3-dimensional optical coherence tomography at 1060 nm.
Design: Cross-sectional study.
Methods: Individuals with early AMD (n = 16; mean age, 71.6 ± 8.5 years) and a comparison group of healthy controls (n = 16; 67.6 ± 5.4 years) were recruited. Three-dimensional (20 degrees × 20 degrees) long-wavelength optical coherence tomography (1060 nm) images (approximately 8-μm axial resolution; 47 000 A scans/second, centered on the fovea) were obtained from all participants after pupil dilation. Retinal thickness was measured between the inner limiting membrane and the retinal pigment epithelium. Choroidal thickness was measured between the retinal pigment epithelium and the choroid–scleral interface. Thickness measurements were obtained subfoveally and at 0.5-mm intervals to a maximum of 2.0 mm nasally, temporally, superiorly, and inferiorly. The main outcome measures were retinal and choroidal thickness (measured in micrometers) at different eccentricities on vertical and horizontal meridians.
Results: Mean retinal thickness was reduced significantly in the group of participants with early AMD compared with the control group at multiple locations within 2.0 mm of the fovea. This difference was most significant at the fovea, where the mean retinal thickness of the early AMD group was 179 ± 27 μm and that of the control group was 202 ± 18 μm (P = .008). There was no significant difference in choroidal thickness between groups at any location.
Conclusions: Retinal thickness is reduced in early AMD, but choroidal thickness seems to be unaffected by the early disease process.
A novel isolator-based system promotes viability of human embryos during laboratory processing
In vitro fertilisation (IVF) and related technologies are arguably the most challenging of all cell culture applications. The starting material is a single cell from which one aims to produce an embryo capable of establishing a pregnancy eventually leading to a live birth. Laboratory processing during IVF treatment requires open manipulations of gametes and embryos, which typically involves exposure to ambient conditions. To reduce the risk of cellular stress, we have developed a totally enclosed system of interlinked isolator-based workstations designed to maintain oocytes and embryos in a physiological environment throughout the IVF process. Comparison of clinical and laboratory data before and after the introduction of the new system revealed that significantly more embryos developed to the blastocyst stage in the enclosed isolator-based system compared with conventional open-fronted laminar flow hoods. Moreover, blastocysts produced in the isolator-based system contained significantly more cells and their development was accelerated. Consistent with this, the introduction of the enclosed system was accompanied by a significant increase in the clinical pregnancy rate and in the proportion of embryos implanting following transfer to the uterus. The data indicate that protection from ambient conditions promotes improved development of human embryos. Importantly, we found that it was entirely feasible to conduct all IVF-related procedures in the isolator-based workstations.
The effect of bleach duration and age on the ERG photostress test
Background: The ERG photostress test assesses the recovery of the focal 41 Hz ERG following exposure to a bright light that bleaches a significant proportion of photopigment. The aims of this study were: 1) to compare the repeatability of the ERG photostress test recovery time constant following long and short duration light exposure, and 2) to determine the effect of age on the ERG photostress test recovery time constant.
Methods: Focal 41 Hz ERGs were recorded from 23 participants (age range 20–71 years) at 20-second intervals for 5 minutes following either a short-duration (photoflash) or long-duration (equilibrium) light exposure. After a 5-minute wash-out period, the procedure was repeated using the second bleach modality. The time constant of cone recovery was determined by fitting an exponential model to the amplitude recovery data. The whole procedure was repeated on a second occasion. The co-efficient of repeatability (CoR) was calculated for each bleaching technique. The relationship between the time constant of recovery and age was investigated (Pearson’s correlation coefficient).
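The fitting step described in the Methods can be illustrated with a small numerical sketch: an assumed saturating-exponential recovery model A(t) = A_max·(1 − exp(−t/τ)) is linearised and its time constant recovered by least squares. This is an illustration of the general technique only, not the authors' fitting code, and the amplitude data below are simulated and noise-free.

```python
import math

# Assumed recovery model with an invented time constant of 90 s
A_MAX, TAU = 1.0, 90.0
times = list(range(20, 301, 20))            # 20-s intervals for 5 min
amps = [A_MAX * (1 - math.exp(-t / TAU)) for t in times]

# Linearise: ln(1 - A/A_max) = -t/tau, then fit the slope by
# ordinary least squares and invert it to obtain tau
xs = times
ys = [math.log(1 - a / A_MAX) for a in amps]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
tau_fit = -1 / slope
print(round(tau_fit, 1))  # recovers the 90.0 s time constant
```

With noisy real recordings a direct nonlinear fit is usually preferred over linearisation, since the log transform distorts the error structure near saturation.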
Results: The time constant of recovery following an equilibrium bleach was more repeatable than recovery following a photoflash (CoR = 85s and 184s respectively). Eight trials (from seven participants) failed to show a reduction in amplitude following the photoflash, suggesting that a blink or fixation loss had occurred. All participants were reliably light-adapted by the equilibrium bleach. For the equilibrium bleach data, the time constant of recovery increased with age at a rate of 27 seconds per decade.
Conclusions: The equilibrium bleach was more reliable and repeatable than the photoflash. Increasing participant age was shown to result in a lengthening of the recovery time constant, of a magnitude comparable to previously published psychophysical data.
Senior citizens, good practice and quality of life in residential care homes
This thesis is an examination of the definition and implementation of ‘good practice’ in residential care for senior citizens. The central contention is that ‘good practice’ is a term that has been variously defined. Different groups define it in different ways, and their definitions have changed over time. This reflexive qualitative study explores ‘good practice’ in local authority, voluntary and private residential care homes in Scotland from the perspective of policy, practice and the experience of senior citizens who live in them. The study is based on analysis of policy documents, historical studies, and reanalysed interview and survey data from two earlier studies conducted by the author and colleagues. The thesis shows that the notion of ‘good practice’ that emerges in policy and practice documents is a confused and often conflicting set of ideas. Historically, the earliest definitions were driven by concerns over cost. In more modern times, statements about ‘good practice’ have had a more benevolent intent but are frequently flawed by paternalistic and ageist assumptions. It is shown that staff in residential homes typically adopt a different set of attitudes: their preoccupation is with safety and the avoidance of risk. Although benevolent in intention, these interpretations of ‘good practice’ are also at variance with what residents themselves actually want. Two particular models or styles of care are examined in detail. One of these is the use of ‘keyworkers’, often implemented in ways that fail to realise its potential. The other is the ‘hotel’ model of care. The potential of this model as an alternative to the statutory model is explored. The thesis concludes that it is a model that can realise the goal of enabling residents to exercise independence, choice and privacy while meeting their needs in residential care.
Detection of early age-related macular degeneration using novel functional parameters of the focal cone electroretinogram
The focal cone electroretinogram is a sensitive marker for macular disease, but have we unlocked its full potential? Typically, assessment of waveform parameters is subjective and focuses on a small number of locations (e.g. the a-wave). This study evaluated the discriminatory and diagnostic potential of 4 conventional and 15 novel, objectively determined, parameters in patients with early Age-related Macular Degeneration. Focal cone electroretinograms were recorded in 54 participants with early Age-related Macular Degeneration (72.9±8.2 years) and 54 healthy controls (69±7.7 years). Conventional a- and b-wave amplitudes and implicit times were measured and compared with novel parameters derived from both the 1st and 2nd derivatives and the frequency-domain power spectrum of the electroretinogram. Statistically significant differences between groups were shown for all conventional parameters, the majority of 1st and 2nd derivative parameters, and the power spectrum at 25 and 30 Hz. Receiver operating characteristic analysis showed that both conventional and 1st and 2nd derivative implicit times provided the best diagnostic potential. A regression model showed a small improvement over any individual parameter investigated. The non-conventional parameters enhanced the objective evaluation of the focal electroretinogram, especially when the amplitude was low. Furthermore, the novel parameters described here allow the implicit time of the electroretinogram to be probed at points other than the peaks of the a- and b-waves. Consequently, these novel analysis techniques could prove valuable in future electrophysiological investigation, detection and monitoring of Age-related Macular Degeneration.
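The classes of "novel parameters" named above, numerical 1st and 2nd derivatives and a frequency-domain power spectrum, are standard signal-processing operations and can be sketched generically. The synthetic waveform below merely stands in for a sampled ERG (the sampling rate and component frequencies are assumptions for illustration, not the study's recording parameters).

```python
import numpy as np

fs = 1000                                   # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
# Synthetic stand-in signal with energy at 25 Hz and 30 Hz, the two
# frequencies at which the study reported group differences in power
signal = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 30 * t)

d1 = np.gradient(signal, 1 / fs)            # 1st derivative (per second)
d2 = np.gradient(d1, 1 / fs)                # 2nd derivative

power = np.abs(np.fft.rfft(signal)) ** 2    # one-sided power spectrum
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

# The two dominant spectral components fall at 25 Hz and 30 Hz
dominant = freqs[np.argsort(power)[-2:]]
print(sorted(dominant.tolist()))            # [25.0, 30.0]
```

Zero-crossings and extrema of d1 and d2 locate waveform features between the a- and b-wave peaks, which is the kind of extra timing information the abstract describes.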
Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors
Background:
Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries.
Methods:
In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants.
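A computer-based 1:1:1 allocation of the kind mentioned in the Methods is commonly implemented as permuted-block randomisation. The sketch below shows that generic technique only; the trial's actual algorithm, block size, and seeding are not described here and the details below are assumptions.

```python
import random

def block_randomise(n_participants, arms, block_size=6, seed=0):
    """Allocate participants to arms using shuffled balanced blocks.

    Each complete block contains every arm equally often, so the
    allocation ratio stays 1:1:1 throughout recruitment.
    """
    assert block_size % len(arms) == 0
    rng = random.Random(seed)               # seeded for reproducibility
    allocation = []
    while len(allocation) < n_participants:
        block = arms * (block_size // len(arms))
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_participants]

arms = ["12-week", "10-week", "8-week"]     # men's inter-donation arms
alloc = block_randomise(12, arms)
# 12 participants in complete blocks: each arm receives exactly 4
print({a: alloc.count(a) for a in arms})
```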
Findings:
45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups.
Interpretation:
Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency.
Funding:
NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
Proteomic Analysis of Urine from California Sea Lions (Zalophus californianus): A Resource for Urinary Biomarker Discovery
Urinary markers for the assessment of kidney diseases in wild animals are limited, in part, due to the lack of urinary proteome data, especially for marine mammals. One of the most prevalent kidney diseases in marine mammals is caused by Leptospira interrogans, which is the second most common etiology linked to stranding of California sea lions (Zalophus californianus). Urine proteins from 11 sea lions with leptospirosis kidney disease and eight sea lions without leptospirosis or kidney disease were analyzed using shotgun proteomics. In total, 2694 protein groups were identified, and 316 were differentially abundant between groups. Major urine proteins in sea lions were similar to major urine proteins in dogs and humans except for the preponderance of resistin, lysozyme C, and PDZ domain containing 1, which appear to be over-represented. Previously reported urine protein markers of kidney injury in humans and animals were also identified. Notably, neutrophil gelatinase-associated lipocalin, osteopontin, and epidermal fatty acid binding protein were elevated over 20-fold in the leptospirosis-infected sea lions. Consistent with leptospirosis infection in rodents, urinary proteins associated with the renin-angiotensin system were depressed, including neprilysin. This study represents a foundation from which to explore the clinical use of urinary protein markers in California sea lions.
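Statements such as "elevated over 20-fold" rest on a fold-change comparison of mean protein abundances between groups, usually reported on a log2 scale in proteomics. The sketch below shows that generic calculation with invented intensity values; it is not the study's analysis pipeline or data.

```python
import math

def log2_fold_change(mean_case, mean_control):
    # Positive values: protein more abundant in cases; negative:
    # depressed in cases (e.g. the renin-angiotensin proteins above)
    return math.log2(mean_case / mean_control)

# Invented intensities: a marker ~24-fold higher in leptospirosis cases
lfc_up = log2_fold_change(240.0, 10.0)
# Invented intensities: a marker depressed to one quarter in cases
lfc_down = log2_fold_change(5.0, 20.0)
print(round(lfc_up, 2), round(lfc_down, 2))  # 4.58 -2.0
```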
