
    Friends - of a kind: America and its allies in the Second World War

    Copyright © 2006 Cambridge University Press. The Second World War continues to be an attractive subject for scholars and even more so for those writing for a general readership. One of the more traditional areas of focus has been the ‘Big Three’ – the alliance of the United States with Britain and the Soviet Union. Public interest in the three leaders – Churchill, Roosevelt and Stalin – remains high, and their decisions continue to resonate in the post-Cold War era, as demonstrated by continued (and often ahistorical) references to the decisions made at the Yalta Conference. Consequently, while other aspects of Second World War historiography have pushed into new avenues of exploration, work on the Grand Alliance has followed fairly conventional lines: the new Soviet bloc materials have been trawled to answer old questions, using the frames of reference that developed during the Cold War. This has left much to be said about the nature of the relationship of the United States with its great allies and about the dynamics and processes of that alliance, and it has overlooked a full and rounded analysis of the role of that alliance as the instrument of Axis defeat.

    Monro-Kellie 2.0: The dynamic vascular and venous pathophysiological components of intracranial pressure

    For 200 years, the ‘closed box’ analogy of intracranial pressure (ICP) has underpinned neurosurgery and neuro-critical care. Cushing conceptualised the Monro-Kellie doctrine as stating that a change in blood, brain or CSF volume results in reciprocal changes in one or both of the other two; when such compensation is not possible, attempts to increase a volume further raise ICP. On this doctrine’s “truth or relative untruth” depend many of the critical procedures in the surgery of the central nervous system. However, each volume component may not deserve the equal weighting this static concept implies. The slow production of CSF (0.35 ml/min) is dwarfed by the dynamic blood inflow and outflow (∼700 ml/min). Neuro-critical care practice focusing on arterial and ICP regulation has been questioned. Failure of venous efferent flow to precisely match arterial afferent flow will yield immediate and dramatic changes in intracranial blood volume and pressure. Interpreting ICP without interrogating its core drivers may be misleading. Multiple clinical conditions, and the cerebral effects of altitude and microgravity, relate to imbalances in this dynamic rather than to ICP per se. This article reviews the Monro-Kellie doctrine, categorises venous outflow limitation conditions, relates physiological mechanisms to clinical conditions and suggests specific management options.
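
    As a rough back-of-the-envelope illustration of the flow-imbalance argument above, the sketch below uses only the two rates quoted in the abstract (CSF production 0.35 ml/min, blood flow ∼700 ml/min); the 1% mismatch figure is an assumed example value chosen purely for illustration.

        # Illustrative arithmetic only; the mismatch fraction is an assumed example value.
        ARTERIAL_INFLOW_ML_MIN = 700.0   # approximate cerebral blood inflow quoted above
        CSF_PRODUCTION_ML_MIN = 0.35     # approximate CSF production quoted above

        def excess_volume_ml(mismatch_fraction: float, minutes: float) -> float:
            """Extra intracranial blood volume accumulated if venous outflow lags
            arterial inflow by the given fraction for the given number of minutes."""
            return ARTERIAL_INFLOW_ML_MIN * mismatch_fraction * minutes

        # A hypothetical 1% arterial-venous mismatch adds as much volume in about
        # 3 seconds as CSF production contributes in a whole minute.
        print(excess_volume_ml(0.01, 1.0))      # 7.0 ml gained per minute
        print(excess_volume_ml(0.01, 0.05))     # 0.35 ml gained in 3 seconds
        print(CSF_PRODUCTION_ML_MIN * 1.0)      # 0.35 ml of CSF produced per minute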

    NaNog: A pluripotency homeobox (master) molecule.

    One of the most intriguing aspects of cell biology is the state of pluripotency, in which the cell is capable of self-renewal for as many times as deemed necessary and can then, at a specified time, differentiate into any type of cell. This fundamental process is required during organogenesis in foetal life and, importantly, during tissue repair in health and disease. Pluripotency is very tightly regulated, as any dysregulation can result in congenital defects, inability to repair damage, or cancer. Fuelled by the relatively recent interest in stem cell biology and tissue regeneration, the molecules implicated in regulating pluripotency have been the subject of extensive research. One of the important molecules involved in pluripotency is NaNog, the subject of this article.

    Enhancement of platelet response to immune complexes and IgG aggregates by lipid A-rich bacterial lipopolysaccharides.

    The effect of the common lipid moiety of bacterial LPS on secretion from washed human platelets has been studied. The lipid A-rich LPS of S. minnesota R595 and a lipid A preparation both potentiated platelet serotonin secretion in response to IgG aggregates or immune complexes up to 50-fold but had little effect in the absence of IgG. Lipid A has been shown to bind immune aggregates, raising the possibility that its mechanism of action involved effective enlargement or insolubilization of the aggregates. IgG aggregates of dimer to tetramer size were shown to be platelet stimuli, equivalent on a weight basis to larger soluble aggregates. The effects of both sizes of aggregates on platelets were equally enhanced by the LPS, indicating that increased aggregate size alone could not account for the effect of LPS. Similarly, because lipid A-rich LPS enhanced the platelet response to already insoluble immune complexes, its mechanism of action cannot simply be insolubilization of immune aggregates. These LPS did not enhance platelet stimulation by antiplatelet antibody, monosodium urate crystals, or thrombin, and only slightly enhanced stimulation by insoluble human skin collagen. This indicates some stimulus specificity in the ability of LPS to increase platelet secretion. The enhancement of the cell response to immune complexes by the common lipid region of LPS may represent a mechanism for the diverse effects of LPS in vivo and in vitro.

    An empirical investigation of an object-oriented software system

    This paper describes an empirical investigation into an industrial object-oriented (OO) system comprising 133,000 lines of C++. The system was a subsystem of a telecommunications product and was developed using the Shlaer-Mellor method. From this study, we found that there was little use of OO constructs such as inheritance and, therefore, polymorphism. It was also found that there was a significant difference in defect densities between those classes that participated in inheritance structures and those that did not, with the former being approximately three times more defect-prone. We were able to construct useful prediction systems for size and number of defects based upon simple counts such as the number of states and events per class. Although these prediction systems are only likely to have local significance, there is a more general principle: software developers can consider building their own local prediction systems, and we believe this is possible even in the absence of the suites of metrics that have been advocated by researchers into OO technology. As a consequence, measurement technology may be accessible to a wider group of potential users.
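
    The kind of "local prediction system" described above can be illustrated with a short sketch: fit an ordinary least squares model that maps simple per-class counts (number of states, number of events) to defect counts. The data and column choices below are invented for illustration and are not taken from the study.

        # Hypothetical local prediction system: defects predicted from simple counts.
        import numpy as np

        # One row per class: [number of states, number of events] (made-up values).
        counts = np.array([[3, 5], [7, 12], [2, 4], [10, 18], [5, 9]], dtype=float)
        defects = np.array([1, 4, 0, 7, 2], dtype=float)       # observed defects per class

        X = np.column_stack([np.ones(len(counts)), counts])    # add an intercept column
        coef, *_ = np.linalg.lstsq(X, defects, rcond=None)     # fit the local model

        new_class = np.array([1.0, 6.0, 10.0])                 # a class with 6 states, 10 events
        print("predicted defects:", float(new_class @ coef))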

    CADUCEUS, SCIPIO, ALCADIA: Cell therapy trials using cardiac-derived cells for patients with post myocardial infarction LV dysfunction, still evolving.

    The early results of the CArdiosphere-Derived aUtologous stem CElls to reverse ventricUlar dySfunction (CADUCEUS) study were recently published in the Lancet [1]. This is a phase 1 prospective randomised study performed at two centres, designed to test the hypothesis that intracoronary infusion of autologous cardiac-derived cells (CDCs) following myocardial infarction can reduce the size of the infarct and increase the amount of viable myocardium. Eligible patients were randomised in a 2:1 ratio to receive CDCs or standard care: 17 patients were randomised to cell therapy and 8 to standard care. The cell therapy consisted of an infusion of 25 million cells into the infarct-related artery, 1.5–3 months after successful primary angioplasty, in patients who developed LV dysfunction (EF less than 37 per cent). The cells were derived from RV endomyocardial biopsies performed within the previous 37 days. The number of cells was determined from previous experimental studies of the maximum number of cells that can be injected without inducing infarction. The study was not blinded because of ethical considerations regarding performing right ventricular biopsy on the controls. The exclusion criteria included evidence of right ventricular infarction and inability to have an MRI examination because of claustrophobia or prior insertion of devices. No deaths, myocardial infarctions or serious arrhythmias were reported in either group during the follow-up period of 6–12 months. Serious adverse events were observed in 24 per cent of the intervention group versus 12 per cent of the controls (p not significant).

    A new unsupervised feature selection method for text clustering based on genetic algorithms

    Nowadays a vast amount of textual information is collected and stored in various databases around the world, with the Internet as the largest database of all. This rapidly increasing growth of published text means that even the most avid reader cannot hope to keep up with all the reading in a field, and consequently the nuggets of insight or new knowledge are at risk of languishing undiscovered in the literature. Text mining offers a solution to this problem by replacing or supplementing the human reader with automatic systems undeterred by the text explosion. It involves analyzing a large collection of documents to discover previously unknown information. Text clustering is one of the most important areas in text mining; it includes text preprocessing, dimension reduction by selecting some terms (features) and, finally, clustering using the selected terms. Feature selection appears to be the most important step in the process. Conventional unsupervised feature selection methods define a measure of the discriminating power of terms in order to select proper terms from the corpus. However, the evaluation of terms in groups has not so far been investigated in reported work. In this paper a new and robust unsupervised feature selection approach is proposed that evaluates terms in groups. In addition, a new Modified Term Variance measure is proposed for evaluating groups of terms, and a genetic algorithm is designed and implemented for finding the most valuable groups of terms based on the new measure. These terms are then used to generate the final feature vector for the clustering process. In order to evaluate and justify our approach, the proposed method and a conventional term variance method were implemented and tested on the Reuters-21578 corpus collection. For a more accurate comparison, the methods were tested on three corpora; for each corpus the clustering task was run ten times and the results averaged. The results of comparing these two methods are very promising and show that our method produces better average accuracy and F1-measure than the conventional term variance method.
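
    A minimal sketch of the group-wise idea follows. It scores a group of terms with the classic term variance measure and searches for a good group with a simple genetic algorithm; the paper's Modified Term Variance and its GA operators are refinements that are not reproduced here, and the toy document-term matrix is random data used only to make the example runnable.

        # Group-wise feature selection sketch: plain term variance as the group fitness,
        # searched with a simple genetic algorithm over fixed-size groups of terms.
        import numpy as np

        rng = np.random.default_rng(0)
        tfidf = rng.random((200, 50))           # toy document-term matrix (docs x terms)
        GROUP_SIZE, POP, GENS = 10, 30, 40

        def group_fitness(term_indices):
            """Sum over the group of the classic term variance sum_d (f_td - mean_t)^2."""
            sub = tfidf[:, term_indices]
            return float(((sub - sub.mean(axis=0)) ** 2).sum())

        def random_group():
            return rng.choice(tfidf.shape[1], size=GROUP_SIZE, replace=False)

        population = [random_group() for _ in range(POP)]
        for _ in range(GENS):
            population.sort(key=group_fitness, reverse=True)
            parents = population[: POP // 2]                    # keep the fittest groups
            children = []
            while len(parents) + len(children) < POP:
                i, j = rng.choice(len(parents), size=2, replace=False)
                pool = np.union1d(parents[i], parents[j])       # crossover: mix two parents
                child = rng.choice(pool, size=GROUP_SIZE, replace=False)
                if rng.random() < 0.2:                          # mutation: replace one term
                    child[rng.integers(GROUP_SIZE)] = rng.integers(tfidf.shape[1])
                children.append(child if np.unique(child).size == GROUP_SIZE
                                else random_group())
            population = parents + children

        best = max(population, key=group_fitness)
        print("selected term indices:", sorted(int(t) for t in best))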

    NGS Based Haplotype Assembly Using Matrix Completion

    We apply matrix completion methods for haplotype assembly from NGS reads to develop the new HapSVT, HapNuc, and HapOPT algorithms. This is done by applying a mathematical model to convert the reads into an incomplete matrix and estimating its unknown components, followed by quantizing and decoding the completed matrix in order to estimate the haplotypes. These algorithms are compared to state-of-the-art algorithms using simulated data as well as real fosmid data. It is shown that the SNP missing rate and the haplotype block length of the proposed HapOPT are better than those of HapCUT2, with comparable accuracy in terms of reconstruction rate and switch error rate. A program implementing the proposed algorithms in MATLAB is freely available at https://github.com/smajidian/HapMC.
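
    The core idea (convert reads to an incomplete matrix, complete it, then quantize) can be sketched generically as below. This is an illustrative soft-impute / singular-value-thresholding style completion on toy data, not the authors' HapSVT/HapNuc/HapOPT MATLAB implementation, and the error rate, coverage and threshold parameters are assumptions.

        # Generic matrix-completion-then-quantize sketch for haplotype assembly on toy data.
        import numpy as np

        rng = np.random.default_rng(1)
        n_reads, n_snps = 60, 20
        haplotype = rng.choice([-1.0, 1.0], size=n_snps)        # toy haplotype, alleles as +/-1
        reads = np.tile(haplotype, (n_reads, 1))
        reads[rng.random(reads.shape) < 0.05] *= -1             # simulated sequencing errors
        mask = rng.random(reads.shape) < 0.4                    # each entry observed with prob 0.4
        observed = np.where(mask, reads, 0.0)                   # incomplete read matrix

        X = observed.copy()
        for _ in range(100):                                    # iterative SVD soft-thresholding
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            X = (U * np.maximum(s - 1.0, 0.0)) @ Vt             # shrink singular values
            X[mask] = observed[mask]                            # keep observed entries fixed

        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        estimate = np.sign(Vt[0])                               # quantize the dominant component
        estimate[estimate == 0] = 1.0

        # Reconstruction rate against the toy haplotype (up to a global sign flip).
        rate = max(np.mean(estimate == haplotype), np.mean(-estimate == haplotype))
        print("toy reconstruction rate:", rate)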

    Predicting with sparse data

    It is well known that effective prediction of project cost-related factors is an important aspect of software engineering. Unfortunately, despite extensive research over more than 30 years, this remains a significant problem for many practitioners. A major obstacle is the absence of reliable and systematic historic data, yet this is a sine qua non for almost all proposed methods: statistical, machine learning or calibration of existing models. In this paper we describe our sparse data method (SDM), based upon a pairwise comparison technique and Saaty's Analytic Hierarchy Process (AHP). Our minimum data requirement is a single known point. The technique is supported by a software tool known as DataSalvage. We show, for data from two companies, how our approach, which is itself based upon expert judgement, adds value to expert judgement by producing significantly more accurate and less biased results. A sensitivity analysis shows that our approach is robust to pairwise comparison errors. We then describe the results of a small usability trial with a practising project manager. From this empirical work we conclude that the technique is promising and may help overcome some of the present barriers to effective project prediction.
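
    To make the pairwise-comparison component concrete, the sketch below derives relative weights from a Saaty-style reciprocal judgement matrix via its principal eigenvector and then scales them using a single known data point, the minimum data requirement mentioned above. The judgement matrix and the anchor value are invented for illustration; this is not the DataSalvage tool itself.

        # AHP-style sketch: pairwise judgements -> priority weights -> scaled estimates.
        import numpy as np

        # A[i, j] = expert judgement of how much larger factor i is than factor j.
        A = np.array([
            [1.0, 3.0, 5.0],
            [1 / 3, 1.0, 2.0],
            [1 / 5, 1 / 2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        weights = np.abs(principal) / np.abs(principal).sum()    # Saaty priority vector

        known_index, known_value = 0, 120.0                      # the single known data point
        estimates = weights / weights[known_index] * known_value # scale the others from it
        print("estimated values:", np.round(estimates, 1))

        # Saaty's consistency index provides a check on the judgements.
        lam_max = float(np.max(np.real(eigvals)))
        ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
        print("consistency index:", round(ci, 3))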