
    Portable TPM based user Attestation Architecture for Cloud Environments

    Cloud computing is causing a major shift in the IT industry. Research indicates that the cloud computing segment is substantial and growing rapidly. New technologies have been developed, and there are now various ways to virtualize IT systems and to access the needed applications on the Internet through web-based applications. Users can now access their data at any time and from any place via cloud storage services. With all these benefits, security remains a concern. Although the cloud provides flexible and scalable access to data held in cloud storage, its main challenge lies in security. Users may consider it insecure because the encryption keys are managed by software, so there is no attestation of client software integrity. A cloud user who needs to deploy into a reliable and secure environment requires confirmation from the Infrastructure as a Service (IaaS) provider that it has not been corrupted by malicious acts. User identification consisting of a user ID and password can also be easily compromised. Beyond traditional network security solutions, trusted computing technology is being combined into more and more aspects of the cloud computing environment to guarantee platform integrity and provide an attestation mechanism for trustworthy services, thereby enhancing confidence in the IaaS provider. A cryptographic protocol adopted by the Trusted Computing Group enables remote authentication that preserves user privacy based on the trusted platform. We therefore propose a framework built on the Trusted Platform Module (TPM), a Trusted Computing Group specification, which provides secure data access control in cloud storage by adding a further layer of security. In this paper, we define TPM-based key management, remote client attestation and a secure key-share protocol across multiple users. We then consider some of the challenges with current TPM-based att…
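
    As a rough illustration of the quote-verification step that remote attestation relies on, the Python sketch below replays a measurement log through TPM-style PCR extends and compares a quoted PCR value against the expected "golden" state. It is a toy model under stated assumptions, not the protocol proposed in the paper: the TPM's asymmetric quote signature is stood in for by an HMAC with a shared attestation key, and all measurement values are invented.

        # Toy illustration of PCR extension and quote verification in remote
        # attestation. Not the paper's protocol: the TPM's asymmetric quote signature
        # is stood in for by an HMAC over (PCR || nonce), and measurements are invented.
        import hashlib, hmac, os

        def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
            """TPM-style extend: new PCR = SHA-256(old PCR || SHA-256(measurement))."""
            return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

        def expected_pcr(measurements):
            """Replay the golden measurement log to get the PCR a clean client reports."""
            pcr = b"\x00" * 32                      # PCRs start zeroed after reset
            for m in measurements:
                pcr = pcr_extend(pcr, m)
            return pcr

        def verify_quote(quoted_pcr, nonce, signature, attestation_key, golden_measurements):
            """Check quote authenticity and freshness, then compare against the golden PCR."""
            expected_sig = hmac.new(attestation_key, quoted_pcr + nonce, hashlib.sha256).digest()
            if not hmac.compare_digest(signature, expected_sig):
                return False                        # quote was not produced by the trusted module
            return quoted_pcr == expected_pcr(golden_measurements)

        # Toy usage: an honest client quoting an untampered boot chain.
        ak = os.urandom(32)                         # stand-in for the TPM attestation key
        golden = [b"bootloader-v1", b"kernel-v5.4", b"client-agent-v2"]
        nonce = os.urandom(16)                      # verifier-supplied nonce for freshness
        pcr = expected_pcr(golden)                  # value an untampered client measures
        sig = hmac.new(ak, pcr + nonce, hashlib.sha256).digest()
        print(verify_quote(pcr, nonce, sig, ak, golden))   # True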

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
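
    To illustrate the kind of validation described (not the study's code or data), the sketch below scores an ordinal difficulty grade as a predictor of a dichotomous outcome such as conversion to open surgery, computing Kendall's tau and an AUROC on synthetic data; the grade distribution and the risk model are invented purely for demonstration.

        # Illustrative only: synthetic data standing in for the CholeS/Nassar datasets.
        # Scores an ordinal difficulty grade as a predictor of a dichotomous outcome
        # (conversion to open surgery) via Kendall's tau and ROC analysis.
        import numpy as np
        from scipy.stats import kendalltau
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 5000
        grade = rng.integers(1, 6, size=n)              # difficulty grade 1-5 (hypothetical)
        p_convert = 1 / (1 + np.exp(-(grade - 4.5)))    # invented risk model: risk rises with grade
        converted = rng.binomial(1, p_convert)

        tau, p_value = kendalltau(grade, converted)     # ordinal association
        auroc = roc_auc_score(converted, grade)         # the grade itself is used as the score
        print(f"Kendall's tau = {tau:.2f} (p = {p_value:.1e}), AUROC = {auroc:.3f}")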

    MR fluoroscopy in vascular and cardiac interventions (review)

    Vascular and cardiac disease remains a leading cause of morbidity and mortality in developed and emerging countries. Vascular and cardiac interventions require extensive fluoroscopic guidance to navigate endovascular catheters. X-ray fluoroscopy is considered the current modality for real-time imaging. It provides excellent spatial and temporal resolution, but is limited by exposure of patients and staff to ionizing radiation, poor soft tissue characterization and lack of quantitative physiologic information. MR fluoroscopy has been introduced and has made substantial progress during the last decade. Clinical and experimental studies performed under MR fluoroscopy have indicated the suitability of this modality for the delivery of ASD closure devices, aortic valves, and endovascular stents (aortic, carotid, iliac, renal arteries, inferior vena cava). It also aids in performing ablation, creating hepatic shunts and delivering local therapies. Development of more MR-compatible equipment and devices will widen the applications of MR-guided procedures. Post-intervention, MR imaging aids in assessing the efficacy of therapies and the success of interventions. It also provides information on vascular flow and cardiac morphology, function, perfusion and viability. MR fluoroscopy has the potential to form the basis for minimally invasive image-guided surgeries that offer improved patient management and cost effectiveness.

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background The aim was to describe the management of benign gallbladder disease and identify characteristics associated with all‐cause 30‐day readmissions and complications in a prospective population‐based cohort. Methods Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all‐cause 30‐day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling using a two‐level hierarchical structure with patients (level 1) nested within hospitals (level 2). Results Data were collected on 8909 patients undergoing cholecystectomy from 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency, 4165 (46·8 per cent) as elective operations, and 3293 patients (37·0 per cent) had had at least one previous emergency admission, but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions and complications. Conclusion Readmissions and complications following cholecystectomy are common and associated with patient and disease characteristics
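
    The study's model nests patients within hospitals in a two-level logistic regression. As a simplified stand-in (not the authors' model), the sketch below fits a single-level logistic regression for 30-day readmission on synthetic data with statsmodels, entering hospital as a fixed effect rather than a random intercept; all variable names, coefficients, and the hospital count are hypothetical.

        # Simplified stand-in for the study's two-level model: synthetic data, and
        # hospital entered as a fixed effect (the paper used a random hospital
        # intercept in a multilevel logistic regression). All values are invented.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 2000
        df = pd.DataFrame({
            "asa": rng.integers(1, 5, size=n),                    # ASA fitness grade
            "duration_min": rng.normal(60, 20, size=n).clip(15),  # operative duration (min)
            "prior_emergencies": rng.poisson(0.5, size=n),        # emergency admissions pre-surgery
            "hospital": rng.integers(0, 20, size=n),              # 20 hypothetical hospitals
        })
        logit_p = -4 + 0.5 * df["asa"] + 0.01 * df["duration_min"] + 0.4 * df["prior_emergencies"]
        df["readmit30"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        model = smf.logit(
            "readmit30 ~ asa + duration_min + prior_emergencies + C(hospital)", data=df
        ).fit(disp=False)
        print(np.exp(model.params[["asa", "duration_min", "prior_emergencies"]]))  # odds ratios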

    Mechanisms and management of loss of response to anti-TNF therapy for patients with Crohn's disease: 3-year data from the prospective, multicentre PANTS cohort study

    Background We sought to report the effectiveness of infliximab and adalimumab over the first 3 years of treatment and to define the factors that predict anti-TNF treatment failure and the strategies that prevent or mitigate loss of response. Methods Personalised Anti-TNF therapy in Crohn’s disease (PANTS) is a UK-wide, multicentre, prospective observational cohort study reporting the rates of effectiveness of infliximab and adalimumab in anti-TNF-naive patients with active luminal Crohn’s disease aged 6 years and older. At the end of the first year, sites were invited to enrol participants still receiving study drug into the 2-year PANTS-extension study. We estimated rates of remission across the whole cohort at the end of years 1, 2, and 3 of the study using a modified survival technique with permutation testing. Multivariable regression and survival analyses were used to identify factors associated with loss of response in patients who had initially responded to anti-TNF therapy and with immunogenicity. Loss of response was defined in patients who initially responded to anti-TNF therapy at the end of induction and who subsequently developed symptomatic activity that warranted an escalation of steroid, immunomodulatory, or anti-TNF therapy, resectional surgery, or exit from study due to treatment failure. This study was registered with ClinicalTrials.gov, NCT03088449, and is now complete. Findings Between March 19, 2014, and Sept 21, 2017, 389 (41%) of 955 patients treated with infliximab and 209 (32%) of 655 treated with adalimumab in the PANTS study entered the PANTS-extension study (median age 32·5 years [IQR 22·1–46·8], 307 [51%] of 598 were female, and 291 [49%] were male). The estimated proportions of patients in remission at the end of years 1, 2, and 3 were, for infliximab, 40·2% (95% CI 36·7–43·7), 34·4% (29·9–39·0), and 34·7% (29·8–39·5), and for adalimumab, 35·9% (95% CI 31·2–40·5), 32·9% (26·8–39·2), and 28·9% (21·9–36·3), respectively. Optimal drug concentrations at week 14 to predict remission at any later timepoints were 6·1–10·0 mg/L for infliximab and 10·1–12·0 mg/L for adalimumab. After excluding patients who had primary non-response, the estimated proportions of patients who had loss of response by years 1, 2, and 3 were, for infliximab, 34·4% (95% CI 30·4–38·2), 54·5% (49·4–59·0), and 60·0% (54·1–65·2), and for adalimumab, 32·1% (26·7–37·1), 47·2% (40·2–53·4), and 68·4% (50·9–79·7), respectively. In multivariable analysis, loss of response at years 2 and 3 for patients treated with infliximab and adalimumab was predicted by low anti-TNF drug concentrations at week 14 (infliximab: hazard ratio [HR] for each ten-fold increase in drug concentration 0·45 [95% CI 0·30–0·67]; adalimumab: 0·39 [0·22–0·70]). For patients treated with infliximab, loss of response was also associated with female sex (vs male sex; HR 1·47 [95% CI 1·11–1·95]), obesity (vs not obese; 1·62 [1·08–2·42]), baseline white cell count (1·06 [1·02–1·11] per 1 × 10⁹ cells per L increase), and thiopurine dose quartile. Among patients treated with adalimumab, carriage of the HLA-DQA1*05 risk variant was associated with loss of response (HR 1·95 [95% CI 1·17–3·25]). By the end of year 3, the estimated proportion of patients who developed anti-drug antibodies associated with undetectable drug concentrations was 44·0% (95% CI 38·1–49·4) among patients treated with infliximab and 20·3% (13·8–26·2) among those treated with adalimumab.
The development of anti-drug antibodies associated with undetectable drug concentrations was significantly associated with treatment without concomitant immunomodulator use for both groups (HR for immunomodulator use: infliximab 0·40 [95% CI 0·31–0·52]; adalimumab 0·42 [95% CI 0·24–0·75]), and with carriage of the HLA-DQA1*05 risk variant for infliximab (HR for carriage of risk variant: infliximab 1·46 [1·13–1·88]) but not for adalimumab (HR 1·60 [0·92–2·77]). Concomitant use of an immunomodulator before or on the day of starting infliximab was associated with increased time without the development of anti-drug antibodies associated with undetectable drug concentrations compared with use of infliximab alone (HR 2·87 [95% CI 2·20–3·74]) or introduction of an immunomodulator after anti-TNF initiation (1·70 [1·11–2·59]). In years 2 and 3, 16 (4%) of 389 patients treated with infliximab and 11 (5%) of 209 treated with adalimumab had adverse events leading to treatment withdrawal. Nine (2%) patients treated with infliximab and two (1%) of those treated with adalimumab had serious infections in years 2 and 3. Interpretation Only around a third of patients with active luminal Crohn’s disease treated with an anti-TNF drug were in remission at the end of 3 years of treatment. Low drug concentrations at the end of the induction period predict loss of response by year 3 of treatment, suggesting that higher drug concentrations during the first year of treatment, particularly during induction, might lead to better long-term outcomes. Anti-drug antibodies associated with undetectable drug concentrations of infliximab, but not adalimumab, can be predicted by carriage of HLA-DQA1*05 and mitigated by concomitant immunomodulator use for both drugs. Funding: Guts UK, Crohn’s and Colitis UK, Cure Crohn’s Colitis, AbbVie, Merck Sharp and Dohme, Napp Pharmaceuticals, Pfizer, Celltrion Healthcare.
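
    The hazard ratios quoted for loss of response come from survival analyses. The sketch below is a minimal illustration of that idea, assuming synthetic data and the lifelines library rather than the PANTS analysis code: a Cox proportional-hazards model relating time to loss of response to week-14 drug concentration and HLA-DQA1*05 carriage, with invented effect sizes.

        # Illustrative Cox proportional-hazards fit on synthetic data; the covariates
        # mirror the abstract (week-14 drug concentration, HLA-DQA1*05 carriage) but
        # the data and effect sizes are invented for demonstration.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(2)
        n = 1000
        log10_week14_level = rng.normal(0.8, 0.3, size=n)   # log10 drug level (mg/L) at week 14
        hla_dqa1_05 = rng.binomial(1, 0.4, size=n)          # carriage of the risk variant
        # Invented truth: higher drug levels protect, HLA carriage increases the hazard.
        hazard = np.exp(-0.9 * log10_week14_level + 0.6 * hla_dqa1_05)
        time_to_loss = rng.exponential(3.0 / hazard)        # years to loss of response
        event = (time_to_loss < 3.0).astype(int)            # administrative censoring at 3 years

        df = pd.DataFrame({
            "duration": np.minimum(time_to_loss, 3.0),
            "event": event,
            "log10_week14_level": log10_week14_level,
            "hla_dqa1_05": hla_dqa1_05,
        })
        cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
        print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])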

    Burden of disease scenarios for 204 countries and territories, 2022–2050: a forecasting analysis for the Global Burden of Disease Study 2021

    Background: Future trends in disease burden and drivers of health are of great interest to policy makers and the public at large. This information can be used for policy and long-term health investment, planning, and prioritisation. We have expanded and improved upon previous forecasts produced as part of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) and provide a reference forecast (the most likely future), and alternative scenarios assessing disease burden trajectories if selected sets of risk factors were eliminated from current levels by 2050. Methods: Using forecasts of major drivers of health such as the Socio-demographic Index (SDI; a composite measure of lag-distributed income per capita, mean years of education, and total fertility under 25 years of age) and the full set of risk factor exposures captured by GBD, we provide cause-specific forecasts of mortality, years of life lost (YLLs), years lived with disability (YLDs), and disability-adjusted life-years (DALYs) by age and sex from 2022 to 2050 for 204 countries and territories, 21 GBD regions, seven super-regions, and the world. All analyses were done at the cause-specific level so that only risk factors deemed causal by the GBD comparative risk assessment influenced future trajectories of mortality for each disease. Cause-specific mortality was modelled using mixed-effects models with SDI and time as the main covariates, and the combined impact of causal risk factors as an offset in the model. At the all-cause mortality level, we captured unexplained variation by modelling residuals with an autoregressive integrated moving average model with drift attenuation. These all-cause forecasts constrained the cause-specific forecasts at successively deeper levels of the GBD cause hierarchy using cascading mortality models, thus ensuring a robust estimate of cause-specific mortality. For non-fatal measures (eg, low back pain), incidence and prevalence were forecasted from mixed-effects models with SDI as the main covariate, and YLDs were computed from the resulting prevalence forecasts and average disability weights from GBD. Alternative future scenarios were constructed by replacing appropriate reference trajectories for risk factors with hypothetical trajectories of gradual elimination of risk factor exposure from current levels to 2050. The scenarios were constructed from various sets of risk factors: environmental risks (Safer Environment scenario), risks associated with communicable, maternal, neonatal, and nutritional diseases (CMNNs; Improved Childhood Nutrition and Vaccination scenario), risks associated with major non-communicable diseases (NCDs; Improved Behavioural and Metabolic Risks scenario), and the combined effects of these three scenarios. Using the Shared Socioeconomic Pathways climate scenarios SSP2-4.5 as reference and SSP1-1.9 as an optimistic alternative in the Safer Environment scenario, we accounted for climate change impact on health by using the most recent Intergovernmental Panel on Climate Change temperature forecasts and published trajectories of ambient air pollution for the same two scenarios. Life expectancy and healthy life expectancy were computed using standard methods. The forecasting framework includes computing the age-sex-specific future population for each location and separately for each scenario. 
95% uncertainty intervals (UIs) for each individual future estimate were derived from the 2·5th and 97·5th percentiles of distributions generated from propagating 500 draws through the multistage computational pipeline. Findings: In the reference scenario forecast, global and super-regional life expectancy increased from 2022 to 2050, but improvement was at a slower pace than in the three decades preceding the COVID-19 pandemic (beginning in 2020). Gains in future life expectancy were forecasted to be greatest in super-regions with comparatively low life expectancies (such as sub-Saharan Africa) compared with super-regions with higher life expectancies (such as the high-income super-region), leading to a trend towards convergence in life expectancy across locations between now and 2050. At the super-region level, forecasted healthy life expectancy patterns were similar to those of life expectancies. Forecasts for the reference scenario found that health will improve in the coming decades, with all-cause age-standardised DALY rates decreasing in every GBD super-region. The total DALY burden measured in counts, however, will increase in every super-region, largely a function of population ageing and growth. We also forecasted that both DALY counts and age-standardised DALY rates will continue to shift from CMNNs to NCDs, with the most pronounced shifts occurring in sub-Saharan Africa (60·1% [95% UI 56·8–63·1] of DALYs were from CMNNs in 2022 compared with 35·8% [31·0–45·0] in 2050) and south Asia (31·7% [29·2–34·1] to 15·5% [13·7–17·5]). This shift is reflected in the leading global causes of DALYs, with the top four causes in 2050 being ischaemic heart disease, stroke, diabetes, and chronic obstructive pulmonary disease, compared with 2022, with ischaemic heart disease, neonatal disorders, stroke, and lower respiratory infections at the top. The global proportion of DALYs due to YLDs likewise increased from 33·8% (27·4–40·3) to 41·1% (33·9–48·1) from 2022 to 2050, demonstrating an important shift in overall disease burden towards morbidity and away from premature death. The largest shift of this kind was forecasted for sub-Saharan Africa, from 20·1% (15·6–25·3) of DALYs due to YLDs in 2022 to 35·6% (26·5–43·0) in 2050. In the assessment of alternative future scenarios, the combined effects of the scenarios (Safer Environment, Improved Childhood Nutrition and Vaccination, and Improved Behavioural and Metabolic Risks scenarios) demonstrated an important decrease in the global burden of DALYs in 2050 of 15·4% (13·5–17·5) compared with the reference scenario, with decreases across super-regions ranging from 10·4% (9·7–11·3) in the high-income super-region to 23·9% (20·7–27·3) in north Africa and the Middle East. The Safer Environment scenario had its largest decrease in sub-Saharan Africa (5·2% [3·5–6·8]), the Improved Behavioural and Metabolic Risks scenario in north Africa and the Middle East (23·2% [20·2–26·5]), and the Improved Nutrition and Vaccination scenario in sub-Saharan Africa (2·0% [–0·6 to 3·6]). Interpretation: Globally, life expectancy and age-standardised disease burden were forecasted to improve between 2022 and 2050, with the majority of the burden continuing to shift from CMNNs to NCDs. That said, continued progress on reducing the CMNN disease burden will be dependent on maintaining investment in and policy emphasis on CMNN disease prevention and treatment. 
Mostly due to growth and ageing of populations, the number of deaths and DALYs due to all causes combined will generally increase. By constructing alternative future scenarios wherein certain risk exposures are eliminated by 2050, we have shown that opportunities exist to substantially improve health outcomes in the future through concerted efforts to prevent exposure to well established risk factors and to expand access to key health interventions
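
    One concrete piece of the pipeline described above is the modelling of all-cause residuals with an autoregressive integrated moving average model with drift. The sketch below is a minimal illustration on an invented residual series using statsmodels; it omits the drift-attenuation and cascading cause-hierarchy steps and is not the GBD forecasting code.

        # Minimal illustration of forecasting a residual series with an ARIMA model
        # that includes drift (statsmodels); the series is invented and the
        # drift-attenuation and cascading cause-hierarchy steps are not reproduced.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(3)
        years = np.arange(1990, 2022)
        residuals = np.cumsum(rng.normal(0.02, 0.1, size=years.size))  # toy all-cause residuals

        model = ARIMA(residuals, order=(1, 1, 0), trend="t")  # ARIMA(1,1,0) with drift
        fit = model.fit()
        forecast = fit.get_forecast(steps=2050 - years[-1])   # forecast out to 2050
        mean = forecast.predicted_mean
        ci = forecast.conf_int(alpha=0.05)                    # 95% interval per horizon
        print(f"2050 residual forecast: {mean[-1]:.2f} ({ci[-1, 0]:.2f} to {ci[-1, 1]:.2f})")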

    Identification of SNP and SSR Markers in Finger Millet Using Next Generation Sequencing Technologies

    Finger millet is an important cereal crop in eastern Africa and southern India with excellent grain storage quality and a unique ability to thrive in extreme environmental conditions. Since negligible attention has been paid to improving this crop to date, the current study used Next Generation Sequencing (NGS) technologies to develop both Simple Sequence Repeat (SSR) and Single Nucleotide Polymorphism (SNP) markers. Genomic DNA from cultivated finger millet genotypes KNE755 and KNE796 was sequenced using both Roche 454 and Illumina technologies. Non-organelle sequencing reads were assembled into 207 Mbp representing approximately 13% of the finger millet genome. We identified 10,327 SSRs and 23,285 non-homeologous SNPs and tested 101 of each for polymorphism across a diverse set of wild and cultivated finger millet germplasm. For the 49 polymorphic SSRs, the mean polymorphism information content (PIC) was 0.42, ranging from 0.16 to 0.77. We also validated 92 SNP markers, 80 of which were polymorphic with a mean PIC of 0.29 across 30 wild and 59 cultivated accessions. Seventy-six of the 80 SNPs were polymorphic across the 30 wild accessions with a mean PIC of 0.30, while only 22 of the SNP markers showed polymorphism among the 59 cultivated accessions with an average PIC value of 0.15. Genetic diversity analysis using the polymorphic SNP markers revealed two major clusters: one of wild and another of cultivated accessions. Detailed STRUCTURE analysis confirmed this grouping pattern and further revealed two sub-populations within wild E. coracana subsp. africana. Both STRUCTURE and genetic diversity analysis assisted with the correct identification of the new germplasm collections. These polymorphic SSR and SNP markers are a significant addition to the existing 82 published SSRs, especially with regard to the previously reported low polymorphism levels in finger millet. Our results also reveal an unexploited finger millet genetic resource that can be included in the regional breeding programs in order to efficiently optimize productivity.
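
    The marker summary statistic used throughout, polymorphism information content (PIC), can be computed directly from allele frequencies with the standard Botstein formula. The short function below is an illustrative implementation, not the authors' pipeline, and the example frequencies are hypothetical.

        # Polymorphism information content (PIC) from allele frequencies, using the
        # standard Botstein et al. (1980) formula; illustrative, not the study pipeline.
        from itertools import combinations

        def pic(allele_freqs):
            """PIC = 1 - sum_i p_i^2 - sum_{i<j} 2 * p_i^2 * p_j^2."""
            hom = sum(p ** 2 for p in allele_freqs)
            cross = sum(2 * (pi ** 2) * (pj ** 2) for pi, pj in combinations(allele_freqs, 2))
            return 1.0 - hom - cross

        # A biallelic SNP tops out at PIC = 0.375 (alleles at 0.5/0.5), consistent with
        # the lower SNP PIC values reported in the abstract relative to the SSRs.
        print(round(pic([0.5, 0.5]), 3))                 # 0.375
        print(round(pic([0.25, 0.25, 0.25, 0.25]), 3))   # 0.703 for an even four-allele SSR locus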

    Item-Based Collaborative Filtering Recommender System (Journal of International Academic Research for Multidisciplinary)

    ABSTRACT: Recommendation algorithms are best known for their use on e-commerce Web sites, where they use input about a customer's interests to generate a list of recommended items. Many applications use only the items that customers purchase and explicitly rate to represent their interests, but they can also use other attributes, including items viewed, demographic data, subject interests, and favourite artists. Collaborative filtering is one of the most important technologies in electronic commerce. As recommender systems have developed, the numbers of users and items have grown rapidly, resulting in extreme sparsity of the user rating data set. Traditional similarity measures perform poorly in this situation, causing the quality of the recommender system to degrade dramatically. Poor recommendation quality is a major challenge in collaborative filtering recommender systems, and the sparsity of users' ratings is its main cause. To address these issues we have explored item-based collaborative filtering techniques. Item-based techniques first analyze the user-item matrix to identify relationships between different items, and then use these relationships to indirectly compute recommendations for users. This approach predicts ratings for items that users have not rated: it uses a Pearson correlation similarity measure to find each target item's neighbors and then produces the recommendations. The experiments are made on a common data set using different recommender algorithms. The results show that the proposed approach can improve the accuracy of the collaborative filtering recommender system.
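
    The pipeline the abstract outlines (build item-item similarities from the user-item matrix with Pearson correlation, then predict unrated items from the most similar neighbors) can be sketched compactly. The following snippet is a minimal, self-contained illustration on a toy ratings matrix, not the paper's implementation; the matrix values and the neighborhood size k are invented.

        # Minimal item-based collaborative filtering sketch: Pearson item-item
        # similarity over co-rating users, then a weighted-average prediction for an
        # unrated (user, item) pair. Toy data; not the paper's implementation.
        import numpy as np

        # rows = users, columns = items; 0 marks "not rated"
        R = np.array([
            [5, 3, 4, 1],
            [4, 0, 5, 1],
            [1, 1, 2, 5],
            [1, 0, 1, 4],
            [0, 1, 5, 4],
            [0, 2, 4, 1],
        ], dtype=float)

        def pearson_item_sim(R, i, j):
            """Pearson correlation between items i and j over users who rated both."""
            both = (R[:, i] > 0) & (R[:, j] > 0)
            if both.sum() < 2:
                return 0.0
            x, y = R[both, i], R[both, j]
            if x.std() == 0 or y.std() == 0:
                return 0.0
            return float(np.corrcoef(x, y)[0, 1])

        def predict(R, user, item, k=2):
            """Weighted average of the user's ratings on the k most similar co-rated items."""
            rated = [j for j in range(R.shape[1]) if R[user, j] > 0 and j != item]
            neighbours = sorted(((pearson_item_sim(R, item, j), j) for j in rated), reverse=True)[:k]
            num = sum(s * R[user, j] for s, j in neighbours if s > 0)
            den = sum(s for s, j in neighbours if s > 0)
            # Fall back to the item's mean rating if no positively similar neighbour exists.
            return num / den if den else R[R[:, item] > 0, item].mean()

        print(round(predict(R, user=5, item=0), 2))   # predicted rating for user 5 on item 0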