
    Physical-Layer Security: Wide-band Communications & Role of Known Interference

    Data security is of such paramount importance that security measures have been implemented across all layers of a communication network. One layer at which security has not been fully developed and studied is the physical layer, the lowest layer of the protocol stack. Towards establishing fundamental limits of secure communication at the physical layer, this dissertation addresses two main problems: first, secure communication in the wide-band regime, and second, the role of known interference in secure communication. The concept of channel capacity per unit cost was introduced by Verdú in 1990 to study the limits of cost-efficient wide-band communication. It was shown that orthogonal signaling can achieve the channel capacity per unit cost of memoryless stationary channels with a zero-cost input letter. The first part of this dissertation introduces the concept of secrecy capacity per unit cost to study cost-efficient wide-band secrecy communication. For degraded memoryless stationary wiretap channels, it is shown that an orthogonal coding scheme with randomized pulse position and constant pulse shape achieves the secrecy capacity per unit cost with a zero-cost input letter. For general memoryless stationary wiretap channels, the performance of orthogonal codes is studied, and the benefit of further randomizing the pulse shape is demonstrated via a simple example. Furthermore, the problem of secure communication in a MIMO setting is considered, and a single-letter expression for the secrecy capacity per unit cost is obtained for the MIMO wiretap channel. Recently, the deterministic approach has been used with considerable success to provide approximate characterizations of Gaussian network capacity. The second part of this dissertation takes a deterministic view and revisits the problem of the wiretap channel with side information. A precise characterization of the secrecy capacity is obtained for a linear deterministic model, which naturally suggests a coding scheme that we show achieves the secrecy capacity of the degraded Gaussian model (dubbed "secret writing on dirty paper") to within half a bit. The success of this approach allows its application to the problem of "secret key agreement via dirty paper coding", where the suggested coding scheme likewise achieves the secret-key capacity to within half a bit.
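    For context, the following is a minimal sketch of Verdú's capacity-per-unit-cost definitions referenced above; the notation (cost function b, zero-cost letter x_0) is standard but assumed here rather than taken from the dissertation:

    % Capacity per unit cost of a memoryless stationary channel P_{Y|X},
    % where b(x) >= 0 is the cost of input letter x (Verdu, 1990):
    \[
      \mathsf{C} \;=\; \sup_{P_X} \frac{I(X;Y)}{\mathbb{E}\,[\,b(X)\,]}.
    \]
    % When a zero-cost input letter x_0 exists (b(x_0) = 0), this admits a
    % single-letter divergence form, achievable by orthogonal signaling:
    \[
      \mathsf{C} \;=\; \sup_{x \neq x_0} \frac{D\big(P_{Y|X=x}\,\big\|\,P_{Y|X=x_0}\big)}{b(x)}.
    \]

    The secrecy capacity per unit cost introduced in the first part of the dissertation plays the analogous role for the wiretap channel, with the eavesdropper's observation entering the tradeoff.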

    Educational Intervention Improves Proton Pump Inhibitor Stewardship in Outpatient Gastroenterology Clinics

    Background Improper chronic proton pump inhibitor (PPI) use has risen significantly in the last few decades. In our gastroenterology trainees' clinics, we aimed to optimize PPI usage. Methods We collected baseline data on patients' PPI use for 8 weeks. Based on gastroenterology society guidelines, we determined the conditions for appropriate PPI use; if the indication could not be determined, it was categorized as "unknown". Interventions were then developed to correct each of the three most frequent causes of inappropriate PPI use. Following a brief educational session, trainees implemented these interventions over a subsequent 8-week interval. Results During the pre-intervention period, trainees evaluated 263 patients who were prescribed a PPI. In 49% of cases, PPI use was deemed inappropriate. The most common reasons were gastroesophageal reflux disease (GERD) never titrated to the lowest effective dose, twice-daily dosing for Barrett's esophagus (BE) chemoprevention, and unknown indication. During the intervention period, trainees evaluated 145 patients prescribed a PPI for GERD, with well-controlled symptoms in 101 cases. The PPI had not been titrated to the lowest effective dose in 37 cases, prompting an intervention that was successful in 23 cases. The PPI indication was unknown in 17 cases, prompting a message to the prescribing provider to review appropriateness. Two cases of BE chemoprevention with twice-daily dosing were appropriately reduced to daily dosing. Ultimately, PPI use was deemed appropriate after intervention in 172 (77%) cases. Conclusions Improper chronic PPI use was significant. Focusing intervention efforts on PPI use for GERD, BE, and unknown indications substantially increased the appropriateness of PPI use.

    Should We Measure Adenoma Detection Rate for Gastroenterology Fellows in Training?

    Background: Adenoma detection rate (ADR) is a proven quality metric for colonoscopy. The value of ADR for the evaluation of gastroenterology fellows is not well established. The aim of this study was to calculate and evaluate the utility of ADR as a measure of competency for gastroenterology fellows. Methods: Screening and surveillance colonoscopies on which gastroenterology fellows participated at the Richard L. Roudebush VAMC (one of the primary training sites at Indiana University) during a 9-month period were included. ADR, cecal intubation rate, and indirect withdrawal time were measured and compared across levels of training. Results: A total of 591 screening and surveillance colonoscopies were performed by 14 fellows: six, four, and four fellows in the first, second, and third years of clinical training, respectively. Fellows were on rotation at the VAMC for a mean of 1.9 months (range 1 to 3 months) during the study period. The average ADR was 68.8% (95% CI 65.37-72.24). The average withdrawal time was 27.59 min (95% CI 23.45-31.73). The average cecal intubation rate was 99% (95% CI 98-100%). There were no significant differences in ADR, cecal intubation rate, or withdrawal time across levels of training, although a trend toward shorter withdrawal times with advancing training was noted. Conclusions: ADR does not appear to be a useful measure of competency for gastroenterology fellows. Consideration should be given to alternative metrics that could avoid bias and confounders.
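    For reference, ADR is simply the proportion of colonoscopies with at least one adenoma detected. Below is a minimal sketch of that computation with a normal-approximation 95% CI; the adenoma-positive count is back-calculated from the reported averages and is illustrative only (the study may have averaged per-fellow ADRs rather than pooling procedures).

    import math

    def adr_with_ci(n_adenoma_pos: int, n_colonoscopies: int, z: float = 1.96):
        """Adenoma detection rate with a normal-approximation 95% CI."""
        p = n_adenoma_pos / n_colonoscopies
        se = math.sqrt(p * (1 - p) / n_colonoscopies)
        return p, (p - z * se, p + z * se)

    # Hypothetical example: ~407 of the 591 procedures with >= 1 adenoma
    # (back-calculated from the reported 68.8% ADR; illustrative only).
    adr, (lo, hi) = adr_with_ci(407, 591)
    print(f"ADR = {adr:.1%} (95% CI {lo:.1%} - {hi:.1%})")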

    Defining adenoma detection rate benchmarks in average-risk male veterans

    Background and Aims Veterans have a higher prevalence of colorectal neoplasia than non-veterans; however, it is not known whether specific Veterans Affairs (VA) adenoma detection rate (ADR) benchmarks are required. We compared the ADRs of a group of endoscopists for colonoscopies performed at a VA medical center with their ADRs at a non-VA academic medical center. Methods This was a retrospective review of screening colonoscopies performed by endoscopists who practice at both the Indianapolis VA and Indiana University (IU). Patients were average-risk males aged 50 years or older. ADR, proximal adenoma detection rate, advanced adenoma detection rate, and adenomas per colonoscopy were compared between the IU and VA groups. Results Six endoscopists performed screening colonoscopies at both locations during the study period (470 at IU vs 608 at the VA). The overall ADR was not significantly different between IU and the VA (58% vs 61%; p = 0.21). The advanced neoplasia detection rate (13% vs 17%; p = 0.46), proximal adenoma detection rate (46% vs 47%; p = 0.31), and adenomas per colonoscopy (1.59 vs 1.84; p = 0.24) were also not significantly different, nor were the cecal intubation rate (100% vs 99%; p = 0.13) or withdrawal time (10.9 vs 11.1 min; p = 0.28). In regression analysis, there was a significant correlation between the attending-specific ADRs at IU and the VA (p = 0.041, R-squared = 0.69). Conclusions In this study of average-risk males undergoing screening colonoscopies by the same group of endoscopists, the ADRs of VA and non-VA colonoscopies were not significantly different. This suggests that a VA-specific ADR target is not required for endoscopists with high ADRs.
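    A comparison like the 58% vs 61% ADR above is commonly tested with a two-proportion z-test. The sketch below uses adenoma-positive counts back-calculated from the reported rates and sample sizes, so it is illustrative only; the study's actual counts and test may differ.

    import numpy as np
    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical adenoma-positive counts back-calculated from the reported
    # ADRs (58% of 470 at IU, 61% of 608 at the VA); illustrative only.
    count = np.array([273, 371])  # colonoscopies with >= 1 adenoma
    nobs = np.array([470, 608])   # screening colonoscopies at IU, the VA
    zstat, pval = proportions_ztest(count, nobs)
    print(f"z = {zstat:.2f}, p = {pval:.2f}")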

    Improving Colorectal Cancer Screening Rates in Patients Referred to a Gastroenterology Clinic

    Colorectal cancer (CRC) is the third most common cancer and the second leading cause of cancer-related death in the United States. Colonoscopy and fecal immunochemical testing (FIT) are the primary recommended CRC screening modalities. The purpose of this study was to improve rates of CRC screening among Veterans and county hospital patients referred to gastroenterology fellows' clinics. A total of 717 patients between the ages of 49 and 75 years were seen. Previous CRC screening had not been performed in 109 patients (15.2%), because screening was either not offered (73.4%) or declined (26.6%). Patients with previous CRC screening, compared with those without, were older (mean age 62.3 vs 60.3 years, p < .003), more often white (88.6% vs 78.3%, p < .027), and more likely to be Veterans patients (90.8% vs 77.5%, p < .001). After systematically discussing screening options with 78 of the 109 unscreened patients, 56 of them (71.8%) underwent screening with either colonoscopy (32) or FIT (24). Patients seen by fellows in their last year of training agreed to undergo screening more often than those seen by other fellows (100% vs 66.2%, p < .033). Systematic discussion of both colonoscopy and FIT can improve overall rates of CRC screening.

    Arabidopsis thaliana cells: a model to evaluate the virulence of Pectobacterium carotovorum.

    Pectobacterium carotovorum strains are economically important plant pathogens that cause soft rot. These enterobacteria display high diversity worldwide. Their pathogenesis depends on the production and secretion of virulence factors such as plant cell wall-degrading enzymes, type III effectors, a necrosis-inducing protein, and a secreted virulence factor homologous to one from Xanthomonas spp., all of which are tightly regulated by quorum sensing. Pectobacterium carotovorum also presents pathogen-associated molecular patterns that could participate in its pathogenicity. In this study, using suspension cells of Arabidopsis thaliana, we correlated plant cell death with pectate lyase activities during infection by different P. carotovorum strains. When the soft rot symptoms induced on potato slices were compared with the pectate lyase activities and plant cell death observed during coculture with Arabidopsis thaliana cells, the order of strain virulence was found to be the same. Therefore, Arabidopsis thaliana cells could be an alternative tool for rapidly and efficiently evaluating the virulence of different P. carotovorum strains.

    Fecal Microbiota Transplant Decreases Mortality in Patients with Refractory Severe or Fulminant Clostridioides difficile Infection

    Background & Aims Fecal microbiota transplantation (FMT) is recommended for recurrent Clostridioides difficile infection (CDI). FMT cures nearly 80% of patients with severe or fulminant CDI (SFCDI) when utilized in a sequential manner. We compared outcomes of hospitalized patients before and after implementation of an FMT program for SFCDI and investigated whether the changes could be directly attributed to the FMT program. Methods We performed a retrospective analysis of characteristics and outcomes of patients hospitalized for SFCDI (430 hospitalizations) at a single center, from January 2009 through December 2016. We performed subgroup analyses of 199 patients with fulminant CDI and 110 patients with refractory SFCDI (no improvement after 5 or more days of maximal anti-CDI antibiotic therapy). We compared CDI-related mortality within 30 days of hospitalization, CDI-related colectomy, length of hospital stay, and readmission to the hospital within 30 days before (2009–2012) vs after (2013–2016) implementation of the inpatient FMT program. Results CDI-related mortality and colectomy were lower after implementation of the FMT program. Overall, CDI-related mortality was 10.2% before the FMT program was implemented vs 4.4% after (P = .02). For patients with fulminant CDI, CDI-related mortality was 21.3% before the FMT program was implemented vs 9.1% after (P = .015). For patients with refractory SFCDI, CDI-related mortality was 43.2% before the FMT program vs 12.1% after (P < .001). The FMT program significantly reduced CDI-related colectomy in patients with SFCDI (6.8% before vs 2.7% after; P = .041), in patients with fulminant CDI (15.7% before vs 5.5% after; P = .017), and patients with refractory SFCDI (31.8% vs 7.6%; P = .001). The effect of FMT program implementation on CDI-related mortality remained significant for patients with refractory SFCDI after we accounted for the underlying secular trend (odds ratio, 0.09 for level change; P = .023). Conclusions An FMT program significantly decreased CDI-related mortality among patients hospitalized with refractory SFCDI