    Imaging Evaluation of Complications of Hip Arthroplasty: Review of Current Concepts and Imaging Findings

    Total hip arthroplasty has evolved along with improvements in component materials and design. The radiologist must accurately diagnose associated complications with imaging methods and stay informed about newer complications associated with innovations in surgical technique, prosthetic design, and novel materials. This pictorial essay presents clinical and imaging correlation of modern hip arthroplasty complications, with an emphasis on the most common complications of instability, aseptic loosening, and infection, as well as those complications associated with contemporary metal-on-metal arthroplasty.

    Supervised classification for object identification in urban areas using satellite imagery

    This paper presents a useful method for classification in satellite imagery. The approach is based on a pixel-level study employing textural features such as correlation, homogeneity, energy, and contrast. In this study, gray-scale images are used to train the classification model. Two supervised classification techniques are employed, namely the Support Vector Machine (SVM) and Naive Bayes. With textural features computed on gray-scale images, Naive Bayes performs better, with an overall accuracy of 76% compared to 68% achieved by the SVM. Computational time is evaluated while performing the experiment with two different window sizes, i.e., 50x50 and 70x70. The required computational time on a single image is found to be 27 seconds for a window size of 70x70 and 45 seconds for a window size of 50x50. Comment: 2018 International Conference on Computing, Mathematics and Engineering Technologies (iCoMET).
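The four textural features named in the abstract (correlation, homogeneity, energy, contrast) are the classic gray-level co-occurrence matrix (GLCM) statistics. The paper does not publish code, so the following is only an illustrative sketch of how such features might be computed for one gray-scale window, assuming a horizontal one-pixel co-occurrence offset and the standard Haralick formulas; the quantisation level and offset are assumptions, not the authors' settings.

```python
import numpy as np

def glcm_features(window, levels=8):
    """Compute contrast, energy, homogeneity and correlation from a
    gray-level co-occurrence matrix (horizontal offset of one pixel).
    `window` is a 2-D array of gray values in [0, 255]."""
    # Quantise gray values into `levels` bins.
    q = (window.astype(np.float64) * levels / 256.0).astype(int)
    glcm = np.zeros((levels, levels))
    # Count horizontally adjacent gray-level pairs.
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()                      # normalise to probabilities
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return {
        "contrast":    ((i - j) ** 2 * p).sum(),
        "energy":      (p ** 2).sum(),
        "homogeneity": (p / (1.0 + np.abs(i - j))).sum(),
        # Perfectly uniform windows have zero variance; define
        # correlation as 1.0 in that degenerate case.
        "correlation": (((i - mu_i) * (j - mu_j) * p).sum()
                        / (sd_i * sd_j)) if sd_i and sd_j else 1.0,
    }

# Example: a uniform 50x50 window has maximal energy and zero contrast.
flat = np.full((50, 50), 128)
feats = glcm_features(flat)
```

A vector of such features per window would then be fed to the SVM or Naive Bayes classifier.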

    Identifying cyber risk hotspots: A framework for measuring temporal variance in computer network risk

    Modern computer networks generate a significant volume of behavioural system logs on a daily basis. Such networks comprise many computers with Internet connectivity, and many users who access the Web, utilise Cloud services, and connect numerous devices to the network on an ad-hoc basis. Measuring the risk of cyber attacks and identifying the most recent modus operandi of cyber criminals on large computer networks is difficult due to the wide range of services and applications running within the network, the multiple vulnerabilities associated with each application, the severity associated with each vulnerability, and the ever-changing attack vectors of cyber criminals. In this paper we propose a framework to represent these features, enabling real-time network enumeration and traffic analysis to be carried out in order to produce quantified measures of risk at specific points in time. We validate the approach using data from a University network, with a data collection consisting of 462,787 instances representing threats measured over a 144-hour period. Our analysis can be generalised to a variety of other contexts.
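The abstract does not state the framework's formulas, but a "quantified measure of risk at specific points in time" can be sketched as a severity-weighted aggregation of threat events into time windows, with hotspots being the highest-scoring windows. Everything below (the weighting scheme, the window size, the function names) is an illustrative assumption, not the paper's method.

```python
from collections import defaultdict

def risk_per_window(events, window_hours=1):
    """Aggregate severity-weighted threat events into time windows.
    `events` is a list of (timestamp_in_hours, severity) pairs; summing
    raw severities per window is an illustrative choice only."""
    buckets = defaultdict(float)
    for ts, severity in events:
        buckets[int(ts // window_hours)] += severity
    return dict(buckets)

def hotspots(scores, top_n=3):
    """Return the `top_n` window indices with the highest aggregate risk."""
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical threat log: (hour observed, severity on a 0-10 scale).
events = [(0.5, 7.0), (0.9, 3.0), (1.2, 2.0), (5.7, 9.8), (5.8, 9.8)]
scores = risk_per_window(events)      # {0: 10.0, 1: 2.0, 5: 19.6}
top = hotspots(scores, top_n=2)       # windows 5 and 0 dominate
```

Tracking how `scores` shifts between successive windows is one simple way to expose the temporal variance the title refers to.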

    Assessing data breach risk in cloud systems

    The emerging cloud market introduces a multitude of cloud service providers, making it difficult for consumers to select providers who are likely to present a low risk from a security perspective. Recently, significant emphasis has been placed on the need to specify Service Level Agreements that address the security concerns of consumers (referred to as SecSLAs); these are intended to clarify security support in addition to the Quality of Service characteristics associated with services. It has been found that such SecSLAs are not consistent among providers, even though they offer services with similar functionality. However, measuring security service levels and the associated risk plays an important role when choosing a cloud provider. Data breaches have been identified as a high-priority threat influencing the adoption of cloud computing. This paper proposes a general analysis framework which can compute the risk associated with data breaches based on pre-agreed SecSLAs for different cloud providers. The framework exploits a tree-based structure to identify possible attack scenarios that can lead to data breaches in the cloud, and a means of assessing the use of potential mitigation strategies to reduce such breaches.
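The "tree-based structure" for attack scenarios is in the style of a classic attack tree, where leaf probabilities combine upward through AND/OR nodes. The sketch below is a generic attack-tree evaluator, not the paper's framework; the independence assumption and the example probabilities are illustrative only.

```python
from math import prod

def attack_prob(node):
    """Evaluate the success probability of an attack tree.
    Leaves are probabilities (floats); internal nodes are tuples
    ("AND" | "OR", [children]). Sub-attacks are assumed independent,
    purely for illustration."""
    if isinstance(node, (int, float)):
        return float(node)
    kind, children = node
    probs = [attack_prob(c) for c in children]
    if kind == "AND":                      # all sub-attacks must succeed
        return prod(probs)
    # OR: at least one sub-attack succeeds
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical breach scenario: (steal credentials AND exfiltrate data)
# OR exploit a storage misconfiguration. Probabilities are made up.
tree = ("OR", [("AND", [0.3, 0.5]), 0.1])
p_breach = attack_prob(tree)   # 1 - (1 - 0.15) * (1 - 0.1) = 0.235
```

A mitigation strategy could then be modelled by scaling down the leaf probability it targets and re-evaluating the tree to quantify the reduction in breach risk.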

    Technical Challenges in the Clinical Application of Radiomics.

    Radiomics is a quantitative approach to medical image analysis aimed at deciphering the morphologic and functional features of a lesion. Radiomic methods can be applied across various malignant conditions to identify tumor phenotype characteristics in the images that correlate with likelihood of survival, as well as with the underlying biology. Identifying this set of characteristic features, called the tumor signature, holds tremendous value in predicting the behavior and progression of cancer, which in turn has the potential to predict its response to various therapeutic options. We discuss the technical challenges encountered in the application of radiomics, in terms of methodology, workflow integration, and user experience, that need to be addressed to harness its true potential.

    Translational Radiomics: Defining the Strategy Pipeline and Considerations for Application-Part 1: From Methodology to Clinical Implementation.

    Enterprise imaging has channeled various technological innovations into the field of clinical radiology, ranging from advanced imaging equipment and post-acquisition iterative reconstruction tools to image analysis and computer-aided detection tools. More recently, advancements in the field of quantitative image analysis, coupled with machine learning-based data analytics, classification, and integration, have ushered in the era of radiomics, which has tremendous potential in clinical decision support as well as drug discovery. There are important issues to consider in incorporating radiomics as a clinically applicable system and a commercially viable solution. In this two-part series, we offer insights into the development of the translational pipeline for radiomics, from methodology to clinical implementation (Part 1) and from clinical implementation to enterprise development (Part 2).

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors, and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
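The methodology described, converting logistic-regression predictors into an additive risk score and validating it with the area under the ROC curve, can be sketched as follows. The point weights and the three binary risk factors shown are hypothetical stand-ins, not the ten published predictors or their coefficients; only the AUC computation (the standard Mann-Whitney rank formulation) is generic.

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney)
    statistic: the probability that a randomly chosen long operation
    scores higher than a randomly chosen short one (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def risk_score(patient):
    """Toy additive score: integer points per risk factor, loosely
    mimicking how logistic coefficients are rounded into a score.
    These weights are illustrative, NOT the published tool."""
    return (2 * patient["high_bmi"]
            + patient["thick_wall"]
            + patient["dilated_cbd"])

# Hypothetical validation cohort: (risk factors, operation > 90 min?).
patients = [
    ({"high_bmi": 1, "thick_wall": 1, "dilated_cbd": 1}, 1),
    ({"high_bmi": 1, "thick_wall": 0, "dilated_cbd": 0}, 1),
    ({"high_bmi": 0, "thick_wall": 1, "dilated_cbd": 0}, 0),
    ({"high_bmi": 0, "thick_wall": 0, "dilated_cbd": 0}, 0),
]
scores = [risk_score(p) for p, _ in patients]
labels = [y for _, y in patients]
validation_auc = auc(scores, labels)
```

Applied to the real validation cohort, the same AUC computation would yield the 0.708 figure reported above.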