
    Threat modeling in smart firefighting systems: aligning MITRE ATT&CK Matrix and NIST security controls

    Industrial automation technologies are envisioned as multi-device systems that constantly interact with one another and with enterprise systems. In these industrial systems, the industrial internet of things (IIoT) significantly improves system efficiency, scalability, ease of control, and monitoring. These benefits, however, come at the cost of greater security risk, leaving the system vulnerable to cyberattacks. Historically, industrial networks and systems lacked security features such as authentication and encryption because they were intended to be isolated from the Internet. Lately, remote access to these IIoT systems has made holistic security alarmingly critical. In this research paper, a threat modeling framework for smart cyber–physical systems (CPS) is proposed to gain insight into potential security risks. To carry out this research, a smart firefighting use case based on the MITRE ATT&CK matrix was investigated. The matrix analysis provided structure for attack detection and mitigation, while system requirement collection (SRC) was applied to gather generic asset information related to hardware, software, and network. With the help of SRC and MITRE ATT&CK, a threat list for the smart firefighting system was generated. Finally, the generated threat list was mapped onto the National Institute of Standards and Technology (NIST) security and privacy controls. The results show that these mapped controls can be well utilized for protection against and mitigation of threats in a smart firefighting system. In the future, critical cyber–physical systems can be modeled upon use-case-specific threats and secured using the presented framework.
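    The mapping pipeline described above (SRC assets, ATT&CK techniques, threat list, NIST controls) can be sketched as a simple lookup chain. The technique IDs, threat descriptions, and control selections below are illustrative assumptions, not the paper's actual threat list.

```python
# Illustrative sketch only: hypothetical technique-to-threat and
# threat-to-control mappings, not the paper's generated threat list.

# Assets gathered via system requirement collection (SRC)
assets = {
    "hardware": ["smoke sensor", "fire alarm panel", "PLC"],
    "software": ["SCADA HMI", "device firmware"],
    "network":  ["fieldbus link", "remote-access gateway"],
}

# MITRE ATT&CK for ICS techniques mapped to threats (hypothetical subset)
attack_to_threat = {
    "T0886": "remote services abused for unauthorized access",
    "T0812": "default credentials used on field devices",
}

# Threats mapped onto NIST SP 800-53 control families (hypothetical subset)
threat_to_controls = {
    "remote services abused for unauthorized access": ["AC-17", "IA-2"],
    "default credentials used on field devices": ["IA-5", "CM-6"],
}

def build_threat_list(techniques):
    """Join ATT&CK techniques to threats and their mitigating NIST controls."""
    rows = []
    for tid in techniques:
        threat = attack_to_threat[tid]
        rows.append((tid, threat, threat_to_controls[threat]))
    return rows

for tid, threat, controls in build_threat_list(["T0886", "T0812"]):
    print(f"{tid}: {threat} -> mitigations: {', '.join(controls)}")
```

    In practice each row of such a table would also carry the affected asset and a severity rating; the point here is only the join structure the framework implies.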

    Host genetic factors associated with hepatocellular carcinoma in patients with hepatitis C virus infection: a systematic review

    Hepatitis C virus (HCV)-infected patients are at risk of developing hepatocellular carcinoma (HCC). Individuals at heightened risk could be targeted by intensive follow-up surveillance. We conducted a systematic review of the literature to identify host genetic predisposition to HCC in HCV-infected patients. A comprehensive search of the Medline and Embase databases was performed, and the strength of evidence of association for each gene with development of HCC was evaluated. We identified 166 relevant studies, relating to 137 different genes or combinations thereof. Seventeen genes were classified as having “good” evidence of an association; for 37 genes a significant association was observed but had not yet been replicated; 56 genes had mixed or limited evidence of an association; and 27 genes showed no association. The IFNL3/4, TNF-α, and PNPLA3 genes had the most evidence of an association. There was, however, considerable heterogeneity in study design and data quality. In conclusion, we identified a number of genes with evidence of association with HCC, but also a need for more standardised approaches to address this clinically critical question. It is important to consider the underlying mechanism of these relationships and which are confounded by the presence of other HCC risk factors and response to therapy. We also identified many genes where the evidence of association is contradictory or requires replication, as well as a number where associations have been studied but no evidence found. These findings should help to direct future studies on host genetic predisposition to HCC in patients with HCV infection.

    Platelet kinetics after slow versus standard transfusions: A pilot study

    Background. Platelet transfusion is required in the acute phase of some thrombocytopenic disorders in order to prevent potentially dangerous hemorrhages. The purpose of this study was to assess the increase in platelet count following a slow platelet transfusion. Methods. Patients suffering from thrombocytopenia due to various underlying diseases were enrolled in this prospective pilot feasibility trial and were randomly divided into two groups. Standard platelet transfusion was administered in one group, while slow transfusion was used in the other. The platelet count was examined at 1 hour, 24 hours, and 1 week following the transfusions. Results. Although the platelet count was higher 1 hour after transfusion via the standard method, the count tended to be higher 1 week after transfusion in the slow transfusion group. This difference, however, was statistically significant only among females. Conclusion. Slow platelet transfusion might be more effective for the prevention of platelet loss. Further studies will be required to strengthen this hypothesis.

    Rectal Transmission of Transmitted/Founder HIV-1 Is Efficiently Prevented by Topical 1% Tenofovir in BLT Humanized Mice

    Rectal microbicides are being developed to prevent new HIV infections in both men and women. We focused our in vivo preclinical efficacy study on rectally applied tenofovir. BLT humanized mice (n = 43) were rectally inoculated with either the primary isolate HIV-1(JRCSF) or the MSM-derived transmitted/founder (T/F) virus HIV-1(THRO) within 30 minutes following treatment with topical 1% tenofovir or vehicle. Under our experimental conditions, in the absence of drug treatment, we observed 50% and 60% rectal transmission by HIV-1(JRCSF) and HIV-1(THRO), respectively. Topical tenofovir reduced rectal transmission to 8% (1/12; log rank p = 0.03) for HIV-1(JRCSF) and 0% (0/6; log rank p = 0.02) for HIV-1(THRO). This is the first demonstration that any human T/F HIV-1 rectally infects humanized mice and that transmission of the T/F virus can be efficiently blocked by rectally applied 1% tenofovir. These results obtained in BLT mice, along with recent ex vivo, Phase 1 trial, and non-human primate reports, provide a critically important step forward in the development of tenofovir-based rectal microbicides.

    Prospective randomized comparison of open versus laparoscopic management of splenic artery aneurysms: a 10-year study

    BACKGROUND: The literature provides little support for choosing between open and laparoscopic management of splenic artery aneurysms (SAA). METHODS: We designed a prospective, randomized comparison between open and laparoscopic surgery for SAA. Primary end points were the types of surgical procedures performed and clinical outcomes. Analysis was performed on an intention-to-treat basis. RESULTS: Fourteen patients were allocated to laparotomy (group A) and 15 to laparoscopy (group B). The groups displayed similar patient- and aneurysm-related characteristics. The conversion rate to open surgery was 13.3%. The type of surgical procedure performed on the splenic artery was similar in the two groups: aneurysmectomy with splenic artery ligature or direct anastomosis was performed in 51% and 21% of patients in group A and in 60% and 20% in group B, respectively. The splenectomy rate was similar (14% vs. 20%). Postoperative splenic infarction was observed in one case in each group. Laparoscopy was associated with shorter procedures (p = 0.0003) and lower morbidity (25% vs. 64%, p = 0.045). Major morbidity requiring interventional procedures and blood transfusion was observed only in group A. Laparoscopy was associated with quicker resumption of oral diet (p < 0.001), earlier drain removal (p = 0.046), and shorter hospital stay (p < 0.01). During a mean follow-up of 50 months, two patients in group A required hospital readmission. In group B, two patients developed a late thrombosis of arterial anastomoses. CONCLUSIONS: Our study demonstrates that laparoscopy permits multiple technical options, does not increase the splenectomy rate, and reduces postoperative complications. It confirms the supposed clinical benefits of laparoscopy when ablative procedures are required, but laparoscopic anastomoses show poor long-term results.

    Factors Influencing the Emergence and Spread of HIV Drug Resistance Arising from Rollout of Antiretroviral Pre-Exposure Prophylaxis (PrEP)

    Background: The potential for emergence and spread of HIV drug resistance from rollout of antiretroviral (ARV) pre-exposure prophylaxis (PrEP) is an important public health concern. We investigated determinants of HIV drug resistance prevalence after PrEP implementation through mathematical modeling. Methodology: A model incorporating heterogeneity in age, gender, sexual activity, HIV infection status, stage of disease, PrEP coverage/discontinuation, and HIV drug susceptibility, was designed to simulate the impact of PrEP on HIV prevention and drug resistance in a sub-Saharan epidemic. Principal Findings: Analyses suggest that the prevalence of HIV drug resistance is influenced most by the extent and duration of inadvertent PrEP use in individuals already infected with HIV. Other key factors affecting drug resistance prevalence include the persistence time of transmitted resistance and the duration of inadvertent PrEP use in individuals who become infected on PrEP. From uncertainty analysis, the median overall prevalence of drug resistance at 10 years was predicted to be 9.2% (interquartile range 6.9%-12.2%). An optimistic scenario of 75% PrEP efficacy, 60% coverage of the susceptible population, and 5% inadvertent PrEP use predicts a rise in HIV drug resistance prevalence to only 2.5% after 10 years. By contrast, in a pessimistic scenario of 25% PrEP efficacy, 15% population coverage, and 25% inadvertent PrEP use, resistance prevalence increased to over 40%. Conclusions: Inadvertent PrEP use in previously-infected individuals is the major determinant of HIV drug resistance prevalence arising from PrEP. Both the rate and duration of inadvertent PrEP use are key factors. PrEP rollout programs should include routine monitoring of HIV infection status to limit the spread of drug resistance. © 2011 Abbas et al
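    The dynamic highlighted above (resistance prevalence driven by the rate and duration of inadvertent PrEP use in already-infected individuals) can be illustrated with a toy three-compartment model. All rates below are invented placeholders; the published model is far richer, with structure by age, gender, sexual activity, disease stage, and coverage.

```python
# Toy difference-equation model of drug-resistance emergence from
# inadvertent PrEP use. All per-year rates are illustrative assumptions,
# not the parameters of the published model.

def simulate(years=10, dt=0.01,
             inadvertent_rate=0.05,   # infected individuals starting PrEP, per year
             resist_acquisition=2.0,  # resistance emergence while on PrEP, per year
             stop_rate=1.0,           # detection/discontinuation of inadvertent use
             revert_rate=0.5):        # reversion of resistant virus off drug
    infected_wt = 1.0    # infected, wild-type virus, not on PrEP
    infected_prep = 0.0  # infected, inadvertently taking PrEP
    resistant = 0.0      # infected with drug-resistant virus
    for _ in range(int(years / dt)):
        new_prep = inadvertent_rate * infected_wt * dt
        new_resist = resist_acquisition * infected_prep * dt
        stopped = stop_rate * infected_prep * dt
        reverted = revert_rate * resistant * dt
        infected_wt += stopped + reverted - new_prep
        infected_prep += new_prep - new_resist - stopped
        resistant += new_resist - reverted
    total = infected_wt + infected_prep + resistant
    return resistant / total  # resistance prevalence among the infected

print(f"resistance prevalence after 10 years: {simulate():.1%}")
```

    Lowering `stop_rate` (longer inadvertent use before detection) raises the simulated resistance prevalence, mirroring the paper's conclusion that routine monitoring of HIV infection status limits the spread of resistance.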

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. 
All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation Declines in some key environmental risks have contributed to declines in critical infectious diseases. 
Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding Bill & Melinda Gates Foundation.
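    In the simplest categorical case, the counterfactual logic of the comparative risk assessment (attributable burden relative to the theoretical minimum risk exposure) reduces to the population attributable fraction, PAF = (Σ p_i·RR_i − 1) / Σ p_i·RR_i, since everyone in the counterfactual sits in the reference category with RR = 1. A minimal sketch with invented exposure shares and relative risks:

```python
# Population attributable fraction (PAF) for a categorical risk factor
# under the theoretical-minimum counterfactual (entire population in the
# reference exposure category, RR = 1). Numbers below are made up; the
# actual GBD machinery handles continuous exposures and mediation.

def paf(prevalence, relative_risk):
    """PAF = (sum(p_i * RR_i) - 1) / sum(p_i * RR_i)."""
    weighted = sum(p * rr for p, rr in zip(prevalence, relative_risk))
    return (weighted - 1.0) / weighted

# Three exposure levels: reference, moderate, high (illustrative shares/RRs)
p  = [0.5, 0.3, 0.2]   # population exposure distribution (sums to 1)
rr = [1.0, 1.5, 3.0]   # relative risk of the outcome at each level

fraction = paf(p, rr)
attributable = fraction * 100e6  # of a hypothetical 100 million total DALYs
print(f"PAF = {fraction:.3f}, attributable DALYs = {attributable/1e6:.1f} million")
```

    With these invented numbers the exposure-weighted relative risk is 1.55, giving a PAF of about 0.355, so roughly 35.5 million of a hypothetical 100 million DALYs would be attributed to the risk.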