
    Efficient design and analysis of extended case-control studies

    The nested case-control design is widely used in epidemiology for its efficiency, as it combines the advantages of both cohort and case-control designs. This design is an extension of the matched case-control design, where the matching variable is the time of occurrence of the outcome. Consequently, nested case-control data are usually analysed with conditional logistic regression; however, this analysis suffers from various limitations. Several authors have developed novel statistical methods for alternative analyses of nested case-control data using basic information from the underlying cohort. Among these methods, one approach consists of ignoring the matching, weighting the sampled individuals to recover a representation of the underlying cohort, and analysing the data by maximising a weighted partial likelihood. This method can be considered when two conditions are fulfilled: 1) the sampling was performed in a well-defined underlying cohort for which basic information is available, and 2) the exact sampling procedure is known. This thesis aimed to refine and extend the scope of the weighted likelihood approach in nested case-control data analysis by investigating its advantages as an alternative to traditional conditional logistic regression in several situations: the reuse of nested case-control data to address a research question regarding a new outcome, the calculation of absolute risk, the mitigation of the problem of overmatching, the maximisation of data exploitation in the case of clustered data, and the analysis of subgroups of nested case-control data. While Studies I and III were motivated by an actual epidemiological question for which data were available, simulation studies were the main approach used in Studies II and IV.
    Reusing nested case-control data to address a research question regarding another outcome was the central point of interest in Study I. To address an epidemiological question regarding risk factors for contralateral breast cancer, for which data on contralateral breast cancer case patients were available, we studied the feasibility of reusing nested case-control data from a previous study as the control dataset. Practical aspects of the approach were highlighted, such as the consequences of reusing data with narrow inclusion criteria, the restriction on the type of weights that can be calculated, and the importance of having information on censoring dates for controls. In addition, we found that an imperfect reconstruction of the study base led to estimates similar to those obtained with an appropriate study base reconstruction; moreover, we confirmed that using unstratified weights (in cases of stratified sampling) provided exposure estimates similar to those from stratified weights, provided that adjustments were made for the confounder variables that drove the sampling. We also confirmed that using a naïve unweighted method instead of an appropriate method led to biased estimates.
    Absolute risk estimation was studied in Study II. Two methods were compared with both simulation studies and a real data application. The ability of each method to provide valid absolute risk estimates was investigated, in particular in the case of matched study designs. Both the Langholz-Borgan and weighted methods provided valid estimates in most situations, the latter showing slightly higher precision than the former. In the case of fine matching, the Langholz-Borgan method was more prone to bias than the weighted method and had larger standard errors.
    In Study III, we handled nested case-control data which had been collected to address an epidemiological question regarding how radiation therapy and smoking interact in their association with lung cancer in female breast cancer patients. Data on paired organs (breasts and lungs) were collected for exposure and outcome variables, which provided clustered data at the individual level. The collected data were also characterised by a problem of overmatching which arose at the design stage. Compared with conditional logistic regression, the weighted partial likelihood mitigated the problem of overmatching and made better use of the collected data. In addition, a further advantage of the weighted approach was that it enabled calculation of the absolute risk of a lung developing cancer, given the radiation therapy dose received for breast cancer treatment and the smoking habits of the patient.
    In Study IV, we compared the conditional logistic regression and weighted likelihood methods in terms of the validity and efficiency of nested case-control subgroup analyses, with subgroups defined by different variables measured at baseline. All investigated subgroup analyses provided valid estimates with both methods. The advantage of the weighted likelihood over conditional logistic regression was highlighted in terms of the precision of the estimates. In addition, we showed that the weighting system enabled, on average, the reconstruction of the correct number of individuals at risk over time, for the whole cohort and in subgroups.
    In conclusion, the weighted likelihood approach showed several advantages over traditional conditional logistic regression in nested case-control data analysis, which reinforces, refines and extends what has been previously shown in the literature.
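    A minimal sketch of the weighted approach described above, assuming Samuelsen-type inverse inclusion-probability weights and a weighted Cox regression fitted with the lifelines package; the file name, the column names (entry, exit, event, x, sampled) and the choice of one control per case are illustrative assumptions, not taken from the thesis.

```python
# Hedged sketch: inverse inclusion-probability weights for a nested case-control
# sample, followed by a weighted Cox regression (weighted partial likelihood).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def inclusion_probabilities(cohort: pd.DataFrame, m_controls: int) -> np.ndarray:
    """P(ever sampled as a control), assuming m_controls controls were drawn
    without replacement from the risk set at each case's event time."""
    case_times = np.sort(cohort.loc[cohort["event"] == 1, "exit"].values)
    entry, exit_ = cohort["entry"].values, cohort["exit"].values
    p = np.zeros(len(cohort))
    for i in range(len(cohort)):
        # case event times at which subject i was at risk
        at_risk_times = case_times[(case_times > entry[i]) & (case_times <= exit_[i])]
        if len(at_risk_times) == 0:
            continue  # never eligible as a control (weight undefined, but never sampled)
        # size of the risk set minus the case itself at each of those times
        n_minus_1 = np.array([np.sum((entry < t) & (exit_ >= t)) - 1 for t in at_risk_times])
        p[i] = 1.0 - np.prod(1.0 - m_controls / n_minus_1)
    p[cohort["event"].values == 1] = 1.0  # cases enter the sample with certainty
    return p

cohort = pd.read_csv("cohort.csv")                   # full cohort, illustrative file
cohort["p"] = inclusion_probabilities(cohort, m_controls=1)
ncc = cohort[cohort["sampled"] == 1].copy()          # the nested case-control sample
ncc["w"] = 1.0 / ncc["p"]

# Weighted partial likelihood: Cox regression on the sample, ignoring the matching
cph = CoxPHFitter()
cph.fit(ncc[["entry", "exit", "event", "x", "w"]], duration_col="exit",
        event_col="event", entry_col="entry", weights_col="w", robust=True)
cph.print_summary()
```

    Each sampled subject is weighted by the inverse of its probability of ever being included, so that the sample stands in for the underlying cohort and the matching can be ignored in the analysis.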

    Thermodynamic Limit and Propagation of Chaos in Polling Networks

    $\{\mathcal{P}_N,\ N \geq 1\}$ is a sequence of standard polling networks, consisting of $N$ nodes attended by $V_N$ mobile servers. When a server arrives at a node $i$, he serves one of the waiting customers, if any, and then moves to node $j$ with probability $p_{ij}$. Customers arrive according to a Poisson process. Service requirements and switch-over times between nodes are independent exponentially distributed random variables. The behavior of $\mathcal{P}_N$ is analyzed in the thermodynamic limit, i.e. when both $N$ and $V_N$ tend to infinity, with $U \stackrel{\mathrm{def}}{=} \lim_{N\to\infty} V_N/N$ …

    Integration of streaming services and TCP data transmission in the Internet

    We study in this paper the integration of elastic and streaming traffic on the same link of an IP network. We are specifically interested in the computation of the mean bit rate obtained by a data transfer. For this purpose, we consider that the bit rate offered by streaming traffic is low, of the order of magnitude of a small parameter $\varepsilon \ll 1$, and related to an auxiliary stationary Markovian process $(X(t))$. Under the assumption that data transfers are exponentially distributed, arrive according to a Poisson process, and share the available bandwidth according to the ideal processor sharing discipline, we derive the mean bit rate of a data transfer as a power series expansion in $\varepsilon$. Since the system can be described by means of an M/M/1 queue with a time-varying server rate, which depends upon the parameter $\varepsilon$ and the process $(X(t))$, the key issue is to compute an expansion of the area swept under the occupation process of this queue in a busy period. We obtain closed formulas for the power series expansion in $\varepsilon$ of the mean bit rate, which allow us to verify the validity of the so-called reduced service rate approximation at first order. The second-order term yields more insight into the negative impact of the variability of streaming flows.

    Large Deviations Problems for Star Networks: the Min Policy Part I: Finite Time

    In this paper, we prove a sample path large deviation principle for a rescaled process $n^{-1}Q_{nt}$, where $Q_t$ represents the joint number of connections at time $t$ in a star network where the bandwidth is shared between customers according to the so-called min policy. The rate function is computed explicitly. One of the main steps consists in deriving large deviation bounds for an empirical generator constructed from the joint number of customers and arrivals on each route. The rest of the analysis relies on a suitable change of measure together with a localization procedure.

    Large deviations for a class of Markov processes modelling communication networks

    In this paper, we prove a sample path large deviation principle (LDP) for a rescaled process $n^{-1}Q_{nt}$, where $Q_t$ is a multi-dimensional birth and death process describing the evolution of a communication network. In this setting, $Q_t$ is the joint number of documents on the set of routes at time $t$. Documents to be transferred arrive on route $r$ as a Poisson process with rate $\lambda_r$ and are transferred at rate $\mu_r \phi_r(x)$, where $x$ represents the state of the network, $\mu_r^{-1}$ is the mean size of documents on route $r$ and $\phi_r(x)$ is the bandwidth allocated to route $r$. We describe a set of assumptions on the allocation under which the LDP holds. Since we want the "classical" allocations to satisfy these assumptions, the difficulty is to deal with weak properties. For example, $\phi_r(x)$ is assumed to be continuous on the set $\{x : x_r > 0\}$ but may be discontinuous elsewhere. Several examples are provided, including the max-min fairness allocation, a classical one in the context of data networks. Since the main object to work with is the local rate function, great care has been devoted to its expression and its properties. It is expressed as the solution of a convex program, from which many useful properties are derived. We believe that this kind of expression allows numerical computations.
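    The max-min fairness allocation mentioned above can be computed by progressive filling: all routes' rates grow at the same pace until some link saturates, which freezes the routes through it. A minimal sketch, with illustrative link capacities and routes that are not taken from the paper:

```python
# Hedged sketch: max-min fair bandwidth allocation by progressive filling.
def max_min_fair(capacity, routes):
    """capacity: link -> bandwidth; routes: route -> list of links it traverses.
    Returns the max-min fair rate of each route."""
    rate = {r: 0.0 for r in routes}
    frozen = set()                                  # routes whose rate is fixed
    residual = dict(capacity)                       # remaining capacity per link
    while len(frozen) < len(routes):
        # number of unfrozen routes using each link
        load = {l: sum(1 for r in routes if r not in frozen and l in routes[r])
                for l in residual}
        candidates = {l: residual[l] / n for l, n in load.items() if n > 0}
        if not candidates:
            break
        delta = min(candidates.values())            # common rate increment
        for r in routes:
            if r not in frozen:
                rate[r] += delta
        for l, n in load.items():
            residual[l] -= delta * n
        # freeze every route that traverses a now-saturated link
        for l, c in residual.items():
            if load[l] > 0 and c <= 1e-12:
                for r in routes:
                    if l in routes[r]:
                        frozen.add(r)
    return rate

# Example: two links shared by three routes
caps = {"l1": 10.0, "l2": 6.0}
rts = {"r1": ["l1"], "r2": ["l1", "l2"], "r3": ["l2"]}
print(max_min_fair(caps, rts))   # r2 and r3 share l2 (3 each); r1 gets the rest of l1 (7)
```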

    On Polling Systems where Servers wait for Customers

    In this paper, a particular polling system with $N$ queues and $V$ servers is analyzed. Whenever a server visits an empty queue, it waits for the next customer to come to this queue. A customer chooses his destination according to a routing matrix $P$. The model originates from specific problems arising in transportation networks. A global classification of the process describing the system is given under general assumptions. It is shown that this process can only be transient or null recurrent. In addition, a detailed classification of each node, together with limit laws (after proper time-scaling), are obtained. The method of analysis relies on the central limit theorem and a coupling with a reference system in which transportation times are identically zero.

    Does autoimmune thyroid disease affect rheumatoid arthritis disease activity or response to methotrexate?

    Objective: To investigate whether autoimmune thyroid disease (AITD) impacts rheumatoid arthritis (RA) disease activity or response to methotrexate. Methods: A nationwide register-based cohort study of 9 004 patients with new-onset RA from the Swedish Rheumatology Quality Register, 2006-2016, with linkage to other nationwide registers to identify comorbidity with AITD, defined as thyroxine prescription before RA diagnosis, excluding non-autoimmune causes. We compared RA disease activity using the 28-joint Disease Activity Score (DAS28) and its components, and EULAR response, between patients with and without AITD, using logistic regression. Results: At diagnosis, patient-reported outcome measures (PROMs; patient global assessment, Health Assessment Questionnaire Disability Index and pain) but not objective disease activity measures (erythrocyte sedimentation rate and swollen joint count) were significantly higher (p<0.05 for all PROMs) among RA patients with AITD compared with those without. The level of DAS28 was 5.2 vs 5.1. By contrast, AITD had little influence on EULAR response to methotrexate at 3 months (OR of non/moderate response 0.95, 95% CI 0.8 to 1.1) or at 6 months. When stratified by age, however, AITD was more common among EULAR non/moderate responders at 3 and 6 months in patients below 45 years, resulting in ORs of non/moderate response of 1.44 (95% CI 0.76 to 2.76) and 2.75 (95% CI 1.04 to 7.28), respectively. Conclusion: At diagnosis, RA patients with concomitant AITD score worse on patient-reported but not on objective RA disease activity measures, while DAS28 was only marginally elevated. The overall chance of achieving a EULAR good response at 3 or 6 months remains unaffected, although among a limited subgroup of younger patients, AITD may be a predictor of an inferior primary response. This work was supported by research grants from the Swedish Research Council, the Swedish Cancer Society, the Swedish Heart-Lung Foundation, Nordforsk, Vinnova and FOREUM. Dr Askling has acted or acts as PI in agreements between Karolinska Institutet and the following entities, mainly related to the safety monitoring of immunomodulators in rheumatology: Abbvie, BMS, Eli Lilly, Merck, Pfizer, Roche, Samsung Bioepis, Sanofi.
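    A minimal sketch of the kind of comparison described in the Methods (logistic regression of EULAR non/moderate response on AITD status, adjusted for baseline covariates), using statsmodels; the variable names, adjustment set and file are illustrative assumptions, not the registers' actual variables.

```python
# Hedged sketch: OR of non/moderate EULAR response for AITD vs no AITD.
# Columns (nonresponse_3m, aitd, age, sex) are assumptions; aitd is assumed coded 0/1.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

ra = pd.read_csv("ra_cohort.csv")                  # illustrative file name

# Adjusted logistic regression at 3 months
fit = smf.logit("nonresponse_3m ~ aitd + age + C(sex)", data=ra).fit()
print(fit.summary())
print("OR (AITD):", np.exp(fit.params["aitd"]))

# Age-stratified version, e.g. patients below 45 years at diagnosis
fit_young = smf.logit("nonresponse_3m ~ aitd + C(sex)", data=ra[ra["age"] < 45]).fit()
print("OR (AITD, <45 y):", np.exp(fit_young.params["aitd"]))
```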

    Short-term, intermediate-term and long-term risks of acute coronary syndrome in cohorts of patients with RA starting biologic DMARDs: results from four Nordic countries

    Objectives: To compare the 1-year, 2-year and 5-year incidences of acute coronary syndrome (ACS) in patients with rheumatoid arthritis (RA) starting any of the biologic disease-modifying antirheumatic drugs (bDMARDs) currently available in clinical practice, and to anchor these results with a general population comparator. Methods: Observational cohort study of patients from Denmark, Finland, Norway and Sweden starting a bDMARD during 2008-2017. Time to first ACS was identified through register linkages. We calculated the 1-year, 2-year and 5-year incidence rates (IRs) (on drug and ever since treatment start) and used Cox regression (hazard ratios, HRs) to compare ACS incidences across treatments, taking ACS risk factors into account. Analyses were further performed separately in subgroups defined by age, number of previous bDMARDs and history of cardiovascular disease. We also compared ACS incidences with an individually matched general population cohort. Results: 24 083 patients (75% women, mean age 56 years) contributing 40 850 treatment courses were included. During the maximum (5-year) follow-up (141 257 person-years (pyrs)), 780 ACS events occurred (crude IR 5.5 per 1000 pyrs). Overall, the incidence of ACS in RA was 80% higher than that in the general population. For all bDMARDs and follow-up definitions, HRs were close to 1 (etanercept as reference), with the exception of the 5-year risk window, where signals for abatacept, infliximab and rituximab were noted. Conclusion: The rate of ACS among patients with RA initiating bDMARDs remains elevated compared with the general population. As used in routine care, the short-term, intermediate-term and longer-term risks of ACS vary little across individual bDMARDs.
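    A minimal sketch of the crude incidence-rate calculation reported above (ACS events per 1000 person-years, overall and per bDMARD); the column names and file are illustrative assumptions, not the actual register data.

```python
# Hedged sketch: crude incidence rates per 1000 person-years.
# Columns (followup_years, acs_event, drug) are assumptions.
import pandas as pd

courses = pd.read_csv("treatment_courses.csv")    # one row per treatment course

def crude_ir(df: pd.DataFrame) -> pd.Series:
    """Events, person-years and crude incidence rate per 1000 person-years."""
    pyrs = df["followup_years"].sum()
    events = df["acs_event"].sum()
    return pd.Series({"events": events,
                      "pyrs": pyrs,
                      "ir_per_1000_pyrs": 1000.0 * events / pyrs})

print(crude_ir(courses))                          # overall
print(courses.groupby("drug").apply(crude_ir))    # per bDMARD
```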