
    War Complexity and Outcomes, 1946-2002

    This study of war crises offers an operational index of complexity and spells out four postulates relating issue and structure elements to war outcomes. We expect that wars over territorial issues will end in an accommodative manner (postulate 1), that ethnic wars, though rare, will end in a non-accommodative outcome (postulate 2), and that clash-of-civilizations issues, more than all other issues, will end in a non-accommodative way (postulate 3). Finally, wars with overall low complexity will end in accommodation, while high-complexity wars will not (postulate 4). Using ICB data, this study of 55 war crises from 1946 to 2002 compares two situations: Intra-War Crisis (IWC), that is, long-ongoing wars already being waged before the crisis begins (17 cases), and regular wars that occur after the crisis starts (38 cases). The findings indicate that not all wars are alike: the substance of the issues involved in the confrontation matters, and complexity affects accommodation. Overall complexity is partly coupled with outcomes and, as anticipated, the patterns of regular wars and IWCs differ. These findings on war diversity highlight the need for a comprehensive 'multi path' model to war

    Development, Implementation and Pre-clinical Evaluation of Medical Image Computing Tools in Support of Computer-aided Diagnosis: Respiratory, Orthopedic and Cardiac Applications

    Over the last decade, image processing tools have become crucial components of all clinical and research efforts involving medical imaging and associated applications. The imaging data available to radiologists continue to increase their workload, raising the need for efficient identification and visualization of the image data required for clinical assessment. Computer-aided diagnosis (CAD) in medical imaging has evolved in response to the need for techniques that can assist radiologists in increasing throughput while reducing human error and bias, without compromising the outcome of screening, diagnosis, or disease assessment. More intelligent, yet simple, consistent, and less time-consuming methods will become more widespread, reducing user variability while also revealing information in a clearer, more visual way. Several routine image processing approaches, including localization, segmentation, registration, and fusion, are critical for enhancing and enabling the development of CAD techniques. However, changes in clinical workflow require significant adjustments and re-training and, despite the efforts of the academic research community to develop state-of-the-art algorithms and high-performance techniques, their footprint often hampers their clinical use. Currently, the main challenge seems to be not the lack of tools and techniques for medical image processing, analysis, and computing, but rather the lack of clinically feasible solutions that leverage the tools and techniques already developed, together with demonstrations of the potential clinical impact of such tools. Recently, more and more effort has been dedicated to devising new algorithms for localization, segmentation, or registration, while their intended clinical use and actual utility are dwarfed by scientific, algorithmic, and developmental novelty that yields only incremental improvements over existing algorithms.
    In this thesis, we propose and demonstrate the implementation and evaluation of several methodological guidelines that ensure the development of image processing tools (localization, segmentation, and registration) and illustrate their use across several medical imaging modalities (X-ray, computed tomography, ultrasound, and magnetic resonance imaging) and several clinical applications: (1) lung CT image registration in support of the assessment of pulmonary nodule growth rate and disease progression from thoracic CT images; (2) automated reconstruction of standing X-ray panoramas from multi-sector X-ray images for assessment of long-limb mechanical axis and knee misalignment; and (3) left and right ventricle localization, segmentation, reconstruction, and ejection fraction measurement from cine cardiac MRI or multi-plane trans-esophageal ultrasound images for cardiac function assessment. When devising and evaluating the developed tools, we use clinical patient data to illustrate the inherent clinical challenges associated with highly variable imaging data that need to be addressed before potential pre-clinical validation and implementation. In an effort to provide plausible solutions to the selected applications, the proposed methodological guidelines ensure the development of image processing tools that achieve sufficiently reliable solutions, which not only have the potential to address the clinical needs but are sufficiently streamlined to be translated into eventual clinical tools, provided proper implementation.
    G1: Reduce the number of degrees of freedom (DOF) of the designed tool, a plausible example being the avoidance of inefficient non-rigid image registration methods. This guideline addresses the risk of artificial deformation during registration and clearly aims at reducing complexity and the number of degrees of freedom.
    G2: Use shape-based features to represent the image content most efficiently, employing edges instead of, or in addition to, intensities and motion where useful. Edges capture the most useful information in the image and can be used to identify the most important image features; as a result, this guideline ensures more robust performance when key image information is missing.
    G3: Implement the method efficiently. This guideline focuses on efficiency in terms of the minimum number of steps required and on avoiding the recalculation of terms that only need to be computed once in an iterative process. An efficient implementation leads to reduced computational effort and improved performance.
    G4: Commence the workflow by establishing an optimized initialization and gradually converge toward the final acceptable result. This guideline aims to ensure reasonable outcomes in consistent ways and to avoid convergence to local minima, while gradually ensuring convergence to the global minimum solution.
    These guidelines lead to the development of interactive, semi-automated, or fully automated approaches that still enable clinicians to perform final refinements, while reducing overall inter- and intra-observer variability, reducing ambiguity, increasing accuracy and precision, and having the potential to yield mechanisms that will aid in providing a more consistent diagnosis in a timely fashion
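    To make G1, G3, and G4 concrete, here is a minimal, hypothetical Python/NumPy sketch, not the thesis implementation (names such as register_translation are illustrative): a translation-only, coarse-to-fine image alignment that restricts the transform to two DOF (G1), builds each pyramid level exactly once outside the search loop (G3), and seeds each finer level with the coarser estimate to reduce the risk of poor local minima (G4).

```python
import numpy as np

def downsample(img, factor):
    """Block-average the image by an integer factor (built once per level, per G3)."""
    h = (img.shape[0] // factor) * factor
    w = (img.shape[1] // factor) * factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def ssd(a, b):
    """Sum-of-squared-differences dissimilarity (lower is better)."""
    return float(np.mean((a - b) ** 2))

def register_translation(fixed, moving, levels=(4, 2, 1), radius=2):
    """Estimate the integer (dy, dx) shift aligning `moving` to `fixed`.
    G1: only 2 DOF are searched; G4: coarse-to-fine refinement."""
    dy, dx = 0, 0  # running estimate in full-resolution pixels
    for factor in levels:
        f = downsample(fixed, factor)    # G3: each pyramid level computed once
        m = downsample(moving, factor)
        base_y, base_x = dy // factor, dx // factor  # G4: seed from coarser level
        best = (np.inf, base_y, base_x)
        for oy in range(-radius, radius + 1):
            for ox in range(-radius, radius + 1):
                cand = np.roll(m, (base_y + oy, base_x + ox), axis=(0, 1))
                cost = ssd(f, cand)
                if cost < best[0]:
                    best = (cost, base_y + oy, base_x + ox)
        dy, dx = best[1] * factor, best[2] * factor
    return dy, dx

# Toy check: recover a known circular shift of a random image.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
moved = np.roll(img, (6, -4), axis=(0, 1))
print(register_translation(img, moved))  # expected: (-6, 4)
```

    The same structure carries over to richer transforms; the point is that the search space (G1), the precomputation (G3), and the initialization strategy (G4) are design decisions made before any algorithmic novelty is added.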

    Seismic radiation from regions sustaining material damage

    We discuss analytical results for seismic radiation during rapid episodes of inelastic brittle deformation, which include, in addition to the standard moment term, a damage-related term stemming from changes of elastic moduli in the source region. The radiation from the damage-related term is associated with products of the changes of elastic moduli and the total elastic strain components in the source region. Order-of-magnitude estimates suggest that the damage-related contribution to the motion in the surrounding elastic solid, which is neglected in standard calculations, can have appreciable amplitude that may in some cases be comparable to or larger than the moment contribution. A decomposition analysis shows that the damage-related source term has an isotropic component that can be larger than its double-couple component
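    Schematically, and in illustrative notation rather than the article's exact equations, the decomposition described above can be written as a standard moment density plus a damage term built from products of the moduli changes and the total elastic strain:

```latex
% Schematic source representation (illustrative notation, not the paper's
% exact derivation): radiated displacement as Green-function derivatives
% convolved with an effective moment density m_{pq} that contains a
% standard term and a damage-related term.
u_n(\mathbf{x},t)
  = \int_V m_{pq}(\boldsymbol{\xi},t) \ast
    \frac{\partial G_{np}(\mathbf{x},t;\boldsymbol{\xi})}{\partial \xi_q}\,
    dV(\boldsymbol{\xi}),
\qquad
m_{pq}
  = \underbrace{c_{pqkl}\,\Delta\varepsilon^{T}_{kl}}_{\text{standard moment term}}
  + \underbrace{\Delta c_{pqkl}\,\varepsilon_{kl}}_{\text{damage-related term}},
```

    where $\Delta c_{pqkl}$ denotes the coseismic changes of elastic moduli and $\varepsilon_{kl}$ the total elastic strain in the source region; the decomposition analysis mentioned above compares the isotropic and double-couple parts of this damage-related term.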

    High-resolution imaging of the Bear Valley section of the San Andreas fault at seismogenic depths with fault-zone head waves and relocated seismicity

    Author Posting. © Blackwell, 2005. This article is posted here by permission of Blackwell for personal use, not for redistribution. The definitive version was published in Geophysical Journal International 163 (2005): 152–164, doi:10.1111/j.1365-246X.2005.02703.x.
    Detailed imaging of fault-zone (FZ) material properties at seismogenic depths is a difficult seismological problem owing to the short length scales of the structural features. Seismic energy trapped within a low-velocity damage zone has been utilized to image the fault core at shallow depths, but these phases appear to lack sensitivity to structure in the depth range where earthquakes nucleate. Major faults that juxtapose rocks of significantly different elastic properties generate a related phase termed a fault-zone head wave (FZHW) that spends the majority of its path refracting along the fault. We utilize data from a dense temporary array of seismometers in the Bear Valley region of the San Andreas Fault to demonstrate that head waves have sensitivity to FZ structure throughout the seismogenic zone. Measured differential arrival times between the head waves and direct P arrivals and waveform modelling of these phases provide high-resolution information on the velocity contrast across the fault. The obtained values document along-strike, fault-normal, and downdip variations in the strength of the velocity contrast, ranging from 20 to 50 per cent depending on the regions being averaged by the ray paths. The complexity of the FZ waveforms increases dramatically in a region of the fault that has two active strands producing two separate bands of seismicity. Synthetic waveform calculations indicate that geological observations of the thickness and rock-type of the layer between the two strands are valid also for the subsurface structure of the fault. The results show that joint analysis of FZHWs and direct P arrivals can resolve important small-scale elements of the FZ structure at seismogenic depths. Detailed characterization of material contrasts across faults and their relation to earthquake ruptures is necessary for evaluating theoretical predictions of the effects that these structures have on rupture propagation.
    JM was supported by the Hoch Fund for innovative research
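    As an illustration of how differential FZHW-to-direct-P times constrain the contrast, here is a hypothetical back-of-the-envelope Python sketch (not the authors' code) using the standard first-order moveout relation, in which the differential time grows with along-fault propagation distance r as dt ≈ r(1/v_slow − 1/v_fast); all numbers below are assumed example values.

```python
def fractional_velocity_contrast(dt, r, v_mean=6.0):
    """First-order estimate of the fractional P-velocity contrast across a
    fault from a head-wave-to-direct-P differential arrival time.

    dt     : differential time (s) between the direct P wave and the
             fault-zone head wave at a near-fault station on the slow side
    r      : along-fault propagation distance (km)
    v_mean : assumed average P velocity (km/s)

    Uses dt ~ (r / v_mean) * (dv / v_mean), i.e. dv / v_mean ~ dt * v_mean / r.
    """
    return dt * v_mean / r

# Example with assumed values: a 0.5 s differential time accrued over a
# 10 km along-fault path with a 6 km/s average P velocity implies roughly
# a 30 per cent contrast, of the order reported in the study.
print(f"{fractional_velocity_contrast(dt=0.5, r=10.0):.0%}")  # -> 30%
```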

    Growth, Greening, and Phytochrome in Etiolated Spirodela


    Alzheimer's disease research: past approaches and future directions

    Background: Three decades after the amyloid cascade hypothesis was first proposed, research into the discovery of effective treatments for Alzheimer's disease has not yet produced any disease-modifying treatments.
    Aims: This review outlines the progress made by dementia research thus far and provides a brief overview of the therapeutic approaches resulting from the amyloid cascade hypothesis. It then describes the shift in research focus to the early stages of the condition, the challenges this presents, and the potential consequences for care.
    Methods: A literature overview was undertaken by reviewing research papers, published protocols, and policy guidelines.
    Findings: Past research has failed to produce effective treatments for dementia, yet the causes of this failure remain debated. Discovery of affordable, early biomarkers has emerged as a key target of investigation as the focus has shifted from treatment to prevention of the condition.
    Conclusions: Failures in identifying effective treatments for dementia have highlighted the importance of early identification and intervention in patients as a way to prevent neurodegeneration and progression to dementia. Discovery of biomarkers is a key focus of current research. In the future, regular screening for dementia may be recommended for all older people in an effort to assess individual risk. Care may reflect a combination of early pharmacological interventions and lifestyle modification programmes based on risk