
    Over 1200 drugs-related deaths and 190,000 opiate-user-years of follow-up: relative risks by sex and age-group

    Heroin users'/injectors' risk of drugs-related death by sex and current age is weakly estimated both in individual cohorts of under 1000 clients, 5000 person-years or 50 drugs-related deaths, and when using cross-sectional data. A workshop in Cambridge analysed six cohorts recruited according to a common European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) protocol from drug treatment agencies in Barcelona, Denmark, Dublin, Lisbon, Rome and Vienna in the 1990s; and, as external reference, opiate-user arrestees in France and hepatitis C-diagnosed ever-injectors in Scotland in 1993-2001, both followed by database linkage to December 2001. EMCDDA cohorts recorded approximately equal numbers of drugs-related deaths (864) and deaths from other non-HIV causes (865) during 106,152 person-years of follow-up. External cohorts contributed 376 drugs-related deaths (Scotland 195, France 181) and 418 deaths from non-HIV causes (Scotland 221, France 197) during 86,417 person-years of follow-up (Scotland 22,670, France 63,747). EMCDDA cohorts reported 707 drugs-related deaths in 81,367 male person-years (8.7 per 1000 person-years, 95% CI: 8.1 to 9.4) but only 157 in 24,785 female person-years (6.3 per 1000 person-years, 95% CI: 5.4 to 7.4). Except in external cohorts, relative risks by current age-group were not particularly strong, and were more modest in Poisson regression than in cross-sectional analyses: relative risk was 1.2 (95% CI: 1.0-1.4) for 35-44 year olds compared to 15-24 year olds, but 1.4 for males (95% CI: 1.2-1.6), and dramatically lower at 0.44 after the first year of follow-up (95% CI: 0.37-0.52).
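    The crude male and female rates quoted above follow directly from the death and person-year totals. As a minimal illustrative sketch (not the workshop's pooled Poisson regression, which additionally adjusts for age-group, cohort and follow-up period), the Python snippet below reproduces those rates per 1000 person-years with exact Poisson confidence intervals; the function name is ours.

```python
# Minimal sketch: crude drugs-related death rates per 1000 person-years with
# exact (Garwood) Poisson confidence intervals, using the totals quoted above.
from scipy.stats import chi2

def poisson_rate_ci(deaths, person_years, per=1000, alpha=0.05):
    """Crude rate and exact Poisson CI, expressed per `per` person-years."""
    rate = deaths / person_years * per
    lower = chi2.ppf(alpha / 2, 2 * deaths) / (2 * person_years) * per
    upper = chi2.ppf(1 - alpha / 2, 2 * (deaths + 1)) / (2 * person_years) * per
    return rate, lower, upper

males = poisson_rate_ci(707, 81_367)    # ~8.7 (8.1-9.4) per 1000 person-years
females = poisson_rate_ci(157, 24_785)  # ~6.3 (5.4-7.4) per 1000 person-years
print("males:   %.1f (%.1f-%.1f)" % males)
print("females: %.1f (%.1f-%.1f)" % females)
print("crude male/female rate ratio: %.2f" % (males[0] / females[0]))
```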

    Default Risk and Equity Returns: A Comparison of the Bank-Based German and the U.S. Financial System

    In this paper, we address the question of whether the impact of default risk on equity returns depends on the financial system in which firms operate. Using an implementation of Merton's option-pricing model for the value of equity to estimate firms' default risk, we construct a factor that measures the excess return of firms with low default risk over firms with high default risk. We then compare results from asset pricing tests for the German and the U.S. stock markets. Since Germany is the prime example of a bank-based financial system, where debt is supposedly a major instrument of corporate governance, we expect that a systematic default risk effect on equity returns should be more pronounced for German than for U.S. firms. Our evidence suggests that higher firm default risk systematically leads to lower returns in both capital markets. This contradicts some previous results for the U.S. by Vassalou/Xing (2004), but we show that their default risk factor loses its explanatory power if one includes a default risk factor measured as a factor-mimicking portfolio. It further turns out that the composition of corporate debt affects equity returns in Germany: firms' default risk sensitivities are attenuated the more a firm depends on bank debt financing.
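    For readers unfamiliar with the estimation step, the following minimal Python sketch shows one common way a Merton-style default measure is backed out from observable equity data: solve the option-pricing and volatility equations for asset value and asset volatility, then report the risk-neutral default probability N(-d2). It is an illustration under simplified assumptions (one-year horizon, a single debt face value), not the paper's exact procedure, and the function name and example inputs are ours.

```python
# Illustrative Merton-model sketch: infer asset value V and asset volatility
# sigma_V from equity value E, equity volatility sigma_E and debt face value D,
# then return the risk-neutral default probability N(-d2).
import numpy as np
from scipy.optimize import fsolve
from scipy.stats import norm

def merton_default_probability(E, sigma_E, D, r, T=1.0):
    def equations(x):
        V, sigma_V = x
        d1 = (np.log(V / D) + (r + 0.5 * sigma_V**2) * T) / (sigma_V * np.sqrt(T))
        d2 = d1 - sigma_V * np.sqrt(T)
        eq1 = V * norm.cdf(d1) - D * np.exp(-r * T) * norm.cdf(d2) - E  # equity as a call on assets
        eq2 = norm.cdf(d1) * sigma_V * V - sigma_E * E                  # link between equity and asset vol
        return [eq1, eq2]

    V, sigma_V = fsolve(equations, x0=[E + D, sigma_E * E / (E + D)])
    d2 = (np.log(V / D) + (r - 0.5 * sigma_V**2) * T) / (sigma_V * np.sqrt(T))
    return norm.cdf(-d2)

# Hypothetical inputs: equity 80, equity volatility 40%, debt face value 60, rate 2%
print(merton_default_probability(E=80.0, sigma_E=0.40, D=60.0, r=0.02))
```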

    Modelling credit spreads with time-varying volatility, skewness, and kurtosis

    This paper seeks to identify the macroeconomic and financial factors that drive credit spreads on bond indices in the US credit market. To accommodate the idiosyncratic nature of credit spread data, reflected in time-varying volatility, skewness and thick tails, it proposes asymmetric GARCH models with alternative probability density functions. The results show that credit spread changes are mainly explained by the interest rate and interest rate volatility, the slope of the yield curve, stock market returns and volatility, the state of liquidity in the corporate bond market and, a heretofore overlooked variable, the foreign exchange rate. They also confirm that the asymmetric GARCH models and Student-t distributions are systematically superior to the conventional GARCH model and the normal distribution in both in-sample and out-of-sample testing.
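    As a rough indication of how such a specification can be estimated, the sketch below fits a regression of simulated, placeholder credit-spread changes on a few macro-financial factors with GJR-GARCH(1,1) errors and a Student-t density, using the third-party Python `arch` package. The regressor names and data are invented for illustration; the paper's exact factor set, asymmetric GARCH variant and densities may differ.

```python
# Illustrative sketch: credit-spread changes (simulated placeholders) regressed
# on macro-financial factors with GJR-GARCH(1,1) errors and a Student-t density.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(0)
n = 1000
X = pd.DataFrame({
    "d_rate":    rng.normal(0, 0.05, n),  # change in interest rate (assumed regressor)
    "d_slope":   rng.normal(0, 0.03, n),  # change in yield-curve slope (assumed regressor)
    "stock_ret": rng.normal(0, 1.00, n),  # stock market return (assumed regressor)
})
# Simulated spread changes with fat-tailed noise, standing in for real data
y = 0.02 - 0.3 * X["d_rate"] - 0.2 * X["stock_ret"] + 0.1 * rng.standard_t(5, n)

# mean="LS": linear regression on X; o=1 adds the asymmetric (GJR) term; dist="t": Student-t errors
model = arch_model(y, x=X, mean="LS", vol="GARCH", p=1, o=1, q=1, dist="t")
res = model.fit(disp="off")
print(res.summary())
```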

    Viral Load Levels Measured at Set-Point Have Risen Over the Last Decade of the HIV Epidemic in the Netherlands

    HIV-1 RNA plasma concentration at viral set-point is associated not only with disease outcome but also with the transmission dynamics of HIV-1. We investigated whether plasma HIV-1 RNA concentration and CD4 cell count at viral set-point have changed over time in the HIV epidemic in the Netherlands. We selected 906 therapy-naïve patients with at least one plasma HIV-1 RNA concentration measured 9 to 27 months after estimated seroconversion. Changes in HIV-1 RNA and CD4 cell count at viral set-point over time were analysed using linear regression models. The ATHENA national observational cohort contributed all patients who seroconverted in or after 1996; the Amsterdam Cohort Studies (ACS) contributed seroconverters before 1996. The mean of the first HIV-1 RNA concentration measured 9-27 months after seroconversion was 4.30 log10 copies/ml (95% CI 4.17-4.42) for seroconverters from 1984 through 1995 (n = 163); 4.27 (4.16-4.37) for seroconverters in 1996-2002 (n = 232); and 4.59 (4.52-4.66) for seroconverters in 2003-2007 (n = 511). Compared to patients seroconverting in 2003-2007, the adjusted mean HIV-1 RNA concentration at set-point was 0.28 log10 copies/ml (95% CI 0.16-0.40; p<0.0001) and 0.26 (0.11-0.41; p = 0.0006) lower for those seroconverting in 1996-2002 and 1984-1995, respectively. Results were robust regardless of type of HIV-1 RNA assay, HIV-1 subtype, and interval between measurement and seroconversion. CD4 cell count at viral set-point declined over calendar time by approximately 5 cells/mm3/year. The HIV-1 RNA plasma concentration at viral set-point has increased over the last decade of the HIV epidemic in the Netherlands. This is accompanied by a decreasing CD4 cell count over the period 1984-2007 and may have implications for both the course of HIV infection and the epidemic.
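    The period comparison described above is, in essence, a linear model of log10 set-point RNA on seroconversion period with covariate adjustment. The sketch below shows one way such a model could be fit in Python with statsmodels; the file name, column names and adjustment set are hypothetical placeholders, not the study's actual variables.

```python
# Illustrative sketch: OLS model of log10 HIV-1 RNA at set-point on
# seroconversion period, adjusted for hypothetical covariates.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("setpoint_data.csv")  # one row per seroconverter (placeholder file)

model = smf.ols(
    "log10_rna ~ C(period, Treatment(reference='2003-2007'))"
    " + age + C(sex) + C(subtype) + months_since_seroconversion",
    data=df,
).fit()

# The period coefficients estimate the adjusted differences in set-point
# (in log10 copies/ml) relative to the 2003-2007 seroconverters.
print(model.summary())
```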

    Study design and participant characteristics of a randomized controlled trial of directly administered antiretroviral therapy in opioid treatment programs

    Background: HIV-infected drug users are at higher risk of non-adherence and poor treatment outcomes than HIV-infected non-drug users. Prior work from our group and others suggests that directly administered antiretroviral therapy (DAART) delivered in opioid treatment programs (OTPs) may increase rates of viral suppression. Methods/Design: We are conducting a randomized trial comparing DAART to self-administered therapy (SAT) in 5 OTPs in Baltimore, Maryland. Participants and investigators are aware of treatment assignments. The DAART intervention is 12 months. The primary outcome is HIV RNA < 50 copies/mL at 3, 6, and 12 months. To assess persistence of any study arm differences that emerge during the active intervention, we are conducting an 18-month visit (6 months after the intervention concludes). We are collecting electronic adherence data for 2 months in both study arms. Of 457 individuals screened, a total of 107 participants were enrolled, with 56 and 51 randomly assigned to DAART and SAT, respectively. Participants were predominantly African American, approximately half were women, and the median age was 47 years. Active use of cocaine and other drugs was common at baseline. HIV disease stage was advanced in most participants. The median CD4 count at enrollment was 207 cells/mm3, 66 (62%) had a history of an AIDS-defining opportunistic condition, and 21 (20%) were antiretroviral naïve. Conclusions: This paper describes the rationale, methods, and baseline characteristics of subjects enrolled in a randomized clinical trial comparing DAART to SAT in opioid treatment programs. Trial Registration: ClinicalTrials.gov NCT00279110 (http://www.clinicaltrials.gov/ct2/show/NCT00279110)

    A Dose-Dependent Relationship between Exposure to a Street-Based Drug Scene and Health-Related Harms among People Who Use Injection Drugs

    While the community impacts of drug-related street disorder have been well described, less attention has been given to the potential health and social implications of drug scene exposure for street-involved people who use illicit drugs. We therefore sought to assess the impacts of exposure to a street-based drug scene among injection drug users (IDU) in a Canadian setting. Data were derived from a prospective cohort study known as the Vancouver Injection Drug Users Study. Four categories of drug scene exposure were defined based on the number of hours spent on the street each day. Three generalized estimating equation (GEE) logistic regression models were constructed to identify factors associated with varying levels of drug scene exposure (2–6, 6–15, and over 15 hours) during the period December 2005 to March 2009. Among our sample of 1,486 IDU, a total of 314 (21%) fit the criteria for high drug scene exposure (>15 hours per day) at baseline. In multivariate GEE analysis, factors significantly and independently associated with high exposure included: unstable housing (adjusted odds ratio [AOR] = 9.50; 95% confidence interval [CI], 6.36–14.20); daily crack use (AOR = 2.70; 95% CI, 2.07–3.52); encounters with police (AOR = 2.11; 95% CI, 1.62–2.75); and being a victim of violence (AOR = 1.49; 95% CI, 1.14–1.95). Regular employment (AOR = 0.50; 95% CI, 0.38–0.65) and engagement with addiction treatment (AOR = 0.58; 95% CI, 0.45–0.75) were negatively associated with high exposure. Our findings indicate that drug scene exposure is associated with markers of vulnerability and higher-intensity addiction, and that intensity of exposure was related to indicators of vulnerability to harm in a dose-dependent fashion. These findings highlight opportunities for policy interventions to address exposure to street disorder in the areas of employment, housing, and addiction treatment.
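    To make the analytic approach concrete, the sketch below fits a GEE logistic regression of high drug-scene exposure on a set of covariates with an exchangeable working correlation across repeated study visits, using Python's statsmodels. The input file and column names are hypothetical placeholders, and the covariate set is only indicative of the variables reported above.

```python
# Illustrative GEE logistic regression: high drug-scene exposure (>15 h/day)
# modelled on selected covariates, with repeated visits clustered within
# participants via an exchangeable working correlation.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("vidus_visits.csv")  # one row per participant-visit (placeholder)

model = smf.gee(
    "high_exposure ~ unstable_housing + daily_crack + police_encounter"
    " + violence_victim + employed + in_treatment",
    groups="participant_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

print(np.exp(model.params))  # coefficients exponentiated to adjusted odds ratios
```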

    Decolorization and partial mineralization of a polyazo dye by Bacillus firmus immobilized within tubular polymeric gel

    The degradation of C.I. Direct Red 80, a polyazo dye, was investigated using Bacillus firmus immobilized by entrapment in a tubular polymeric gel. This bacterial strain was able to completely decolorize 50 mg/L of C.I. Direct Red 80 under anoxic conditions within 12 h and to degrade the reaction intermediates (aromatic amines) during the subsequent 12 h under aerobic conditions. The tubular gel harboring the immobilized cells consisted of anoxic and aerobic regions integrated in a single unit, which was ideal for azo dye degradation studies. The results show that effective dye decolorization (97.8%), chemical oxygen demand (COD) reduction (91.7%) and total aromatic amine removal were achieved in 15 h with the immobilized-cell system, whereas the free cells required a hydraulic residence time of 24 h for equivalent performance in a sequential anoxic and aerobic process. Repeated-batch experiments indicate that the immobilized cells could decolorize C.I. Direct Red 80 and reduce medium COD in five successive batch runs, with enhanced activity after each consecutive run, suggesting the system's stability and potential for repeated use in wastewater treatment. UV–visible spectrophotometry and HPLC analysis were used to confirm the partial mineralization of the dye. Data from this study could serve as a reference for the development of an effective industrial-scale biotechnological process for the removal of dyes and their metabolites from textile wastewater.

    Transcription-replication conflicts: How they occur and how they are resolved

    The frequent occurrence of transcription and DNA replication in cells results in many encounters, and thus conflicts, between the transcription and replication machineries. These conflicts constitute a major intrinsic source of genome instability, which is a hallmark of cancer cells. How the replication machinery progresses along a DNA molecule occupied by an RNA polymerase is an old question. Here we review recent data on the biological relevance of transcription-replication conflicts, and the factors and mechanisms that are involved in either preventing or resolving them, mainly in eukaryotes. On the basis of these data, we provide our current view of how transcription can generate obstacles to replication, including torsional stress and non-B DNA structures, and of the different cellular processes that have evolved to solve them.