
    Book Reviews

    Source of Lake Vostok Cations Constrained with Strontium Isotopes

    Lake Vostok is the largest sub-glacial lake in Antarctica. The primary source of our current knowledge of the geochemistry and biology of the lake is the analysis of refrozen lake water associated with ice core drilling. Several sources of dissolved ions and particulate matter to the lake have been proposed, including materials from the melted glacier ice, the weathering of underlying geological materials, hydrothermal activity, and underlying ancient evaporitic deposits. A sample of Lake Vostok Type 1 accretion ice has been analyzed for its 87Sr/86Sr signature as well as its major cation, anion, and Sr concentrations. The strontium isotope ratio of 0.71655 and the Ca/Sr ratio in the sample strongly indicate that the major source of the Sr is aluminosilicate minerals from the continental crust. These data imply that at least a portion of the other cations in the Type 1 ice are also derived from continental crustal materials rather than from hydrothermal activity, melted glacier ice, or evaporitic sources.

    First Record of a Sleeper Shark in the Western Gulf of Mexico and Comments on Taxonomic Uncertainty Within Somniosus (Somniosus)

    A sleeper shark, Somniosus (Somniosus) sp., is reported from Alaminos Canyon in the western Gulf of Mexico at a depth of about 2647 m, based on observations made using a remotely operated vehicle. This is the first record of a sleeper shark (Somniosus, Somniosidae) from the western Gulf of Mexico and the deepest record of any shark from the Gulf of Mexico. Despite claims to the contrary in the literature, no taxonomic character has been identified to date that can be used to unequivocally identify all representatives of Somniosus (Somniosus), and as a result, some species records must be considered dubious.

    Evidence for conservation in antigen gene sequences combined with extensive polymorphism at VNTR loci

    Theileria parva is a tick-transmitted apicomplexan protozoan parasite that infects lymphocytes of cattle and African Cape buffalo (Syncerus caffer), causing a frequently fatal disease of cattle in eastern, central and southern Africa. A live vaccination procedure known as the infection and treatment method (ITM), the most frequently used version of which comprises the Muguga, Serengeti-transformed and Kiambu 5 stocks of T. parva delivered as a trivalent cocktail, is generally effective. However, it does not always induce 100% protection against heterologous parasite challenge. Knowledge of the genetic diversity of T. parva in target cattle populations is therefore important prior to extensive vaccine deployment. This study investigated the extent of genetic diversity within T. parva field isolates derived from Ankole (Bos taurus) cattle in south-western Uganda, using 14 variable number tandem repeat (VNTR) satellite loci and the sequences of two antigen-encoding genes, designated Tp1 and Tp2, that are targets of CD8+ T-cell responses induced by ITM. The findings revealed a T. parva prevalence of 51%, confirming endemicity of the parasite in south-western Uganda. Cattle-derived T. parva VNTR genotypes revealed a high degree of polymorphism. However, all of the T. parva Tp1 and Tp2 alleles identified in this study have been reported previously, indicating that they are geographically widespread in East Africa and highly conserved.

    Probabilistic classification of acute myocardial infarction from multiple cardiac markers

    Logistic regression and Gaussian mixture model (GMM) classifiers have been trained to estimate the probability of acute myocardial infarction (AMI) in patients based upon the concentrations of a panel of cardiac markers. The panel consists of two new markers, fatty acid binding protein (FABP) and glycogen phosphorylase BB (GPBB), in addition to the traditional cardiac troponin I (cTnI), creatine kinase MB (CKMB) and myoglobin. The effect of using principal component analysis (PCA) and Fisher discriminant analysis (FDA) to preprocess the marker concentrations was also investigated. The need for classifiers to give an accurate estimate of the probability of AMI is argued, and three categories of performance measure are described, namely discriminatory ability, sharpness, and reliability. Numerical performance measures for each category are given and applied. The optimum classifier, based solely upon the samples taken on admission, was the logistic regression classifier using FDA preprocessing. This gave an accuracy of 0.85 (95% confidence interval: 0.78–0.91) and a normalised Brier score of 0.89. When samples taken both at admission and at a further time, 1–6 h later, were included, the performance increased significantly, showing that logistic regression classifiers can indeed use the information from the five cardiac markers to accurately and reliably estimate the probability of AMI.
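    As a minimal sketch of the kind of pipeline described above (assuming scikit-learn and a synthetic stand-in for the five-marker panel; this is not the authors' code or data), a logistic regression classifier with Fisher discriminant preprocessing can be trained to output a probability of AMI and then scored for both accuracy and reliability via the Brier score:

```python
# Hedged sketch: logistic regression on Fisher-discriminant-reduced marker
# concentrations, reporting accuracy and a Brier score. Marker values below
# are synthetic and for illustration only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, brier_score_loss
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for the five markers (cTnI, CKMB, myoglobin, FABP, GPBB):
# AMI cases (y = 1) are given crudely elevated concentrations.
n = 400
y = rng.integers(0, 2, size=n)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n, 5))
X[y == 1] *= 1.8

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Fisher discriminant analysis (here scikit-learn's LDA) as a supervised
# preprocessing step, followed by logistic regression that outputs P(AMI).
clf = make_pipeline(StandardScaler(),
                    LinearDiscriminantAnalysis(n_components=1),
                    LogisticRegression())
clf.fit(X_tr, y_tr)

p_ami = clf.predict_proba(X_te)[:, 1]          # estimated probability of AMI
print("accuracy:", accuracy_score(y_te, (p_ami >= 0.5).astype(int)))
print("Brier score:", brier_score_loss(y_te, p_ami))  # lower = more reliable
```

    Note that the Brier score computed here is the raw form; the normalised score quoted in the abstract additionally rescales the raw score (typically against a reference forecast), so the two values are not directly comparable.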

    Professional Courses for Teacher Certification

    The three lectures comprising this document are introduced with a foreword by Clifford L. Bishop and an introduction by William H. Dreier, both of the Department of Education and Psychology, which sponsored the Central State Colleges and Universities (CSCU) seminar on the professional program for undergraduates leading to teacher certification and the B.A. or B.S. degree. The lecture by George W. Denemark presents A Proposed Common Professional Core for the Preparation of Teachers. He includes discussion of the context for curriculum planning and of the broad range of objectives for teacher education. In his discussion of Ideal Experiences Needed in the First Course for Undergraduates, Henry J. Hermanowicz deals with the newer systematic and descriptive studies of teaching; experiments in clinical studies of teaching by prospective teachers; and the emergence of theories of teaching. William E. Drake's Needed Experiences in the Foundations Professional Sequence Course includes justification for a social philosophy course and discussion of the content necessary to meet minimum professional standards and of classroom activity conducive to quality professional experience. Included are bibliographies, the major comments made at the final panel discussion, and a list of the seminar participants (20 from the host institution and 50 from 25 institutions in 12 different states). (JS)

    Core-Shell Hydrogel Particles Harvest, Concentrate and Preserve Labile Low Abundance Biomarkers

    Background: The blood proteome is thought to represent a rich source of biomarkers for early stage disease detection. Nevertheless, three major challenges have hindered biomarker discovery: a) candidate biomarkers exist at extremely low concentrations in blood; b) high abundance resident proteins such as albumin mask the rare biomarkers; c) biomarkers are rapidly degraded by endogenous and exogenous proteinases. Methodology and Principal Findings: Hydrogel nanoparticles created with an N-isopropylacrylamide-based core (365 nm)-shell (167 nm) and functionalized with a charge-based bait (acrylic acid) were studied as a technology for addressing all of these biomarker discovery problems, in one step, in solution. These harvesting core-shell nanoparticles are designed to simultaneously conduct size exclusion and affinity chromatography in solution. Platelet derived growth factor (PDGF), a clinically relevant, highly labile, and very low abundance biomarker, was chosen as a model. PDGF, spiked into human serum, was completely sequestered from its carrier protein albumin, concentrated, and fully preserved by the particles within minutes. Particle-sequestered PDGF was fully protected from exogenously added tryptic degradation. When the nanoparticles were added to a 1 mL dilute solution of PDGF at non-detectable levels (less than 20 picograms per mL), the concentration of the PDGF released from the polymeric matrix of the particles increased to within the detection range of ELISA and mass spectrometry. Beyond PDGF, sequestration and protection from degradation were verified for a series of additional very low abundance and very labile cytokines. Conclusions and Significance: We envision the application of harvesting core-shell nanoparticles to whole blood for concentration and immediate preservation of low abundance and labile analytes at the time of venipuncture. © 2009 Longo et al.

    Classification of CT brain images based on deep learning networks

    While Computerised Tomography (CT) may have been the first imaging tool used to study the human brain, it has not yet been incorporated into the clinical decision-making process for the diagnosis of Alzheimer's disease (AD). On the other hand, being prevalent, inexpensive and non-invasive, CT does present diagnostic features of AD to a great extent. This study explores the significance and impact of applying burgeoning deep learning techniques, in particular convolutional neural networks (CNNs), to the classification of CT brain images, aiming to provide supplementary information for the early diagnosis of Alzheimer's disease. Towards this end, CT images (N=285) are clustered into three groups: AD, Lesion (e.g. tumour) and Normal ageing. In addition, considering the characteristics of this collection, with a larger slice thickness along the depth (z) direction (∼3-5 mm), an advanced CNN architecture is established that integrates both 2D and 3D CNN networks. The fusion of the two networks is coordinated based on the average of the Softmax scores obtained from both, consolidating 2D images along the spatial axial direction and 3D segmented blocks respectively. The classification accuracy rates rendered by this elaborated CNN architecture are 85.2%, 80% and 95.3% for the AD, Lesion and Normal classes respectively, with an average of 87.6%. Additionally, this improved CNN network outperforms both the 2D-only CNN network and a number of state-of-the-art hand-crafted approaches, which deliver accuracy rates of 86.3%, 85.6±1.10%, 86.3±1.04%, 85.2±1.60% and 83.1±0.35% for 2D CNN, 2D SIFT, 2D KAZE, 3D SIFT and 3D KAZE respectively. The two major contributions of the paper are a new 3D approach that applies deep learning to extract signature information rooted in both 2D slices and 3D blocks of CT images, and an elaborated hand-crafted approach based on 3D KAZE.
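    The fusion step, averaging Softmax scores from a 2D slice-level network and a 3D block-level network, can be sketched as follows. This is a minimal PyTorch illustration with toy layer sizes and random inputs; it is not the paper's architecture, and the slice and block shapes are assumptions:

```python
# Hedged sketch: average the softmax outputs of a 2D-slice CNN and a 3D-block
# CNN over three classes (AD, Lesion, Normal). Layer sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 3  # AD, Lesion, Normal

class Slice2DCNN(nn.Module):
    """Toy 2D CNN operating on individual axial CT slices."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.fc = nn.Linear(16, NUM_CLASSES)

    def forward(self, x):                      # x: (num_slices, 1, H, W)
        return self.fc(self.features(x).flatten(1))

class Block3DCNN(nn.Module):
    """Toy 3D CNN operating on a segmented CT block."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool3d(1))
        self.fc = nn.Linear(16, NUM_CLASSES)

    def forward(self, x):                      # x: (1, 1, D, H, W)
        return self.fc(self.features(x).flatten(1))

def fused_prediction(slices, block, net2d, net3d):
    """Average per-slice 2D softmax scores, then average with the 3D softmax."""
    with torch.no_grad():
        p2d = F.softmax(net2d(slices), dim=1).mean(dim=0)  # consolidate slices
        p3d = F.softmax(net3d(block), dim=1).squeeze(0)
        return (p2d + p3d) / 2                             # fused class scores

# Example with random data: 20 axial slices and one 3D block from a scan.
slices = torch.randn(20, 1, 64, 64)
block = torch.randn(1, 1, 16, 64, 64)
scores = fused_prediction(slices, block, Slice2DCNN(), Block3DCNN())
print("fused scores:", scores, "predicted class:", int(scores.argmax()))
```

    Averaging the two probability vectors weights the slice-level and block-level evidence equally; a weighted average would be an obvious variant if one stream proved more reliable.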

    IMPLEmenting a clinical practice guideline for acute low back pain evidence-based manageMENT in general practice (IMPLEMENT): cluster randomised controlled trial study protocol

    Background: Evidence generated from reliable research is not frequently implemented into clinical practice. Evidence-based clinical practice guidelines are a potential vehicle to achieve this. A recent systematic review of guideline dissemination and implementation strategies concluded that there was a lack of evidence regarding effective strategies to promote the uptake of guidelines. Recommendations from this review, and from other studies, have suggested the use of theoretically based interventions, because these may be more effective than those that are not. An evidence-based clinical practice guideline for the management of acute low back pain was recently developed in Australia. This provides an opportunity to develop and test a theory-based implementation intervention for a condition which is common, has a high burden, and for which there is an evidence-practice gap in the primary care setting. Aim: This study aims to test the effectiveness of a theory-based intervention for implementing a clinical practice guideline for acute low back pain in general practice in Victoria, Australia. Specifically, our primary objectives are to establish whether the intervention is effective in reducing the percentage of patients who are referred for a plain x-ray and in improving the mean level of disability for patients three months post-consultation. Methods/Design: This study protocol describes the details of a cluster randomised controlled trial. Ninety-two general practices (clusters), each of which includes at least one consenting general practitioner, will be randomised to an intervention or control arm using restricted randomisation. Patients aged 18 years or older who visit a participating practitioner for acute non-specific low back pain of less than three months duration will be eligible for inclusion. An average of twenty-five patients per general practice will be recruited, providing a total of 2,300 patient participants. General practitioners in the control arm will receive access to the guideline via the existing dissemination strategy. Practitioners in the intervention arm will be invited to participate in facilitated face-to-face workshops underpinned by behavioural theory. Investigators (not involved in the delivery of the intervention), patients, outcome assessors and the study statistician will be blinded to group allocation. Trial registration: Australian New Zealand Clinical Trials Registry ACTRN012606000098538 (date registered 14/03/2006). The trial is funded by the NHMRC by way of a Primary Health Care Project Grant (334060). JF has 50% of her time funded by the Chief Scientist Office of the Scottish Government Health Directorate and 50% by the University of Aberdeen. PK is supported by an NHMRC Health Professional Fellowship (384366) and RB by an NHMRC Practitioner Fellowship (334010). JG holds a Canada Research Chair in Health Knowledge Transfer and Uptake. All other authors are funded by their own institutions.
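    As a small illustration of the allocation step only (a hedged sketch: the restriction is modelled here simply as forcing equal arm sizes, whereas the trial's restricted randomisation may also balance practice characteristics; practice names and the seed are not from the protocol), the 92 clusters could be assigned as follows:

```python
# Hedged sketch of cluster allocation: 92 general practices randomised to two
# arms with the restriction that each arm receives exactly 46 practices.
import random

random.seed(2006)

practices = [f"practice_{i:02d}" for i in range(1, 93)]    # 92 clusters
arms = ["intervention"] * 46 + ["control"] * 46            # restriction: 46 per arm
random.shuffle(arms)
allocation = dict(zip(practices, arms))

# With an average of 25 patients recruited per practice, the expected total
# number of patient participants is 92 * 25 = 2300.
n_intervention = sum(arm == "intervention" for arm in allocation.values())
print(n_intervention, "practices allocated to the intervention arm")
print(len(practices) * 25, "expected patient participants")
```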

    Monitoring international migration flows in Europe. Towards a statistical data base combining data from different sources

    The paper reviews techniques developed in demography, geography and statistics that are useful for bridging the gap between available data on international migration flows and the information required for policy making and research. The basic idea of the paper is as follows: to establish a coherent and consistent data base that contains sufficiently detailed, up-to-date and accurate information, data from several sources should be combined. That raises issues of definition and measurement, and of how to combine data from different origins properly. The issues may be tackled more easily if the statistics being compiled are viewed as different outcomes or manifestations of underlying stochastic processes governing migration. The link between the processes and their outcomes is described by models, the parameters of which must be estimated from the available data. That may be done within the context of socio-demographic accounting. The paper discusses the experience of the U.S. Bureau of the Census in combining migration data from several sources. It also summarizes the many efforts in Europe to establish a coherent and consistent data base on international migration. The paper was written at IIASA. It is part of the Migration Estimation Study, which is a collaborative IIASA-University of Groningen project, funded by the Netherlands Organization for Scientific Research (NWO). The project aims at developing techniques to obtain improved estimates of international migration flows by country of origin and country of destination.
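    To make the idea of statistics as outcomes of an underlying stochastic process concrete (a hedged toy example, not the paper's estimation framework): suppose the sending and receiving countries both report the same flow but with different, known coverage rates; modelling each reported count as a Poisson outcome of the true flow gives a simple maximum-likelihood combination of the two sources:

```python
# Hedged sketch: combine two reported migration counts into one flow estimate.
# Coverage rates and counts below are hypothetical illustrations.
import numpy as np

def estimate_flow(reported, coverage):
    """MLE of the true flow when reported[i] ~ Poisson(coverage[i] * flow).

    Maximising sum_i (reported[i] * log(coverage[i] * flow) - coverage[i] * flow)
    over flow gives flow_hat = sum(reported) / sum(coverage).
    """
    reported = np.asarray(reported, dtype=float)
    coverage = np.asarray(coverage, dtype=float)
    return reported.sum() / coverage.sum()

# Hypothetical example: the sending country registers 70% of moves and reports
# 4200; the receiving country registers 90% and reports 5600.
flow_hat = estimate_flow([4200, 5600], [0.7, 0.9])
print(f"estimated true flow: {flow_hat:.0f}")   # ~6125 migrations
```

    In practice the coverage rates and definitional differences are themselves uncertain and must be estimated or constrained, which is where the accounting and modelling effort described in the paper lies.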