
    Efficient estimation of the distribution of time to composite endpoint when some endpoints are only partially observed.

    Two common features of clinical trials, and of other longitudinal studies, are (1) a primary interest in composite endpoints and (2) the problem of subjects withdrawing prematurely from the study. In some settings, withdrawal may affect observation of only some components of the composite endpoint, for example when another component is death, information on which may be available from a national registry. In this paper, we use the theory of augmented inverse probability weighted estimating equations to show how such partial information on the composite endpoint for subjects who withdraw from the study can be incorporated in a principled way into the estimation of the distribution of time to composite endpoint, typically leading to increased efficiency without relying on assumptions beyond those made by standard approaches. We describe our proposed approach theoretically and demonstrate its properties in a simulation study.
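    The estimating-equation machinery is easiest to see in its unaugmented form. Below is a minimal sketch, in Python, of the inverse-probability-weighted core of such an estimator: each subject observed to reach the composite endpoint is up-weighted by the inverse of their estimated probability of still being under observation at that time, which corrects for premature withdrawal. The function name and the assumption that the staying probabilities come from a Kaplan-Meier fit to the withdrawal times are ours for illustration; the paper's full estimator adds an augmentation term that folds in the partial registry information, which is where the efficiency gain comes from.

```python
import numpy as np

def ipw_cdf(event_time, observed, stay_prob, grid):
    """IPW estimate of F(t) = P(T <= t) for the composite endpoint.

    event_time : observed event or withdrawal times
    observed   : 1 if the composite endpoint was fully observed, else 0
    stay_prob  : estimated P(still under observation at event_time),
                 e.g. from a Kaplan-Meier fit to the withdrawal times
    grid       : time points at which to evaluate F
    """
    event_time = np.asarray(event_time)
    observed = np.asarray(observed)
    stay_prob = np.asarray(stay_prob)
    n = len(event_time)
    est = []
    for t in grid:
        hit = (event_time <= t) & (observed == 1)
        # Each fully observed event stands in for itself plus the similar
        # subjects lost to withdrawal, hence the 1/stay_prob weight.
        est.append(np.sum(1.0 / stay_prob[hit]) / n)
    return np.array(est)
```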

    On the Interpretation of Top Partners Searches

    Relatively light Top Partners are unmistakable signatures of reasonably Natural Composite Higgs models, and as such they are worth searching for at the LHC. Their phenomenology is characterized by a certain amount of model dependence, which makes the interpretation of Top Partner experimental searches not completely straightforward, especially if one also wishes to take single production into account. We describe a model-independent strategy by which the interpretation is provided on the parameter space of a Simplified Model that captures the relevant features of all the explicit constructions. The Simplified Model limits are easy to interpret within explicit models, in a way that requires no recasting and no knowledge of the experimental details of the analyses. We illustrate the method with concrete examples, among them the search for a charge-5/3 Partner in same-sign dileptons and the search for a charge-2/3 singlet. In each case we perform a theory recasting of the available 8 TeV Run-1 results and an estimate of the 13 TeV Run-2 reach, also including the effect of single production, for which dedicated experimental analyses are not yet available. A rough assessment of the reach of a hypothetical 100 TeV collider is also provided.
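    To see what interpreting a search on a Simplified Model parameter space amounts to in practice, here is a hedged Python sketch. The structural point it encodes is real: QCD pair production is fixed once the Partner mass is given, while single production scales with the square of the Partner coupling, so the excluded region in the (mass, coupling) plane widens as the coupling grows. The cross-section parametrizations and the limit value below are placeholder numbers of our own, not the Run-1 or Run-2 results recast in the paper.

```python
import numpy as np

def excluded(mass_gev, coupling, sigma_limit_pb):
    """Toy exclusion test for one point of the Simplified Model plane.

    Pair production depends on the mass only; single production grows
    like coupling**2. The point is excluded when the total rate exceeds
    the experimental cross-section upper limit. All numbers below are
    illustrative placeholders, not real cross sections or limits.
    """
    sigma_pair = 50.0 * np.exp(-mass_gev / 150.0)                  # pb, mass only
    sigma_single = coupling**2 * 5.0 * np.exp(-mass_gev / 400.0)   # pb, coupling-driven
    return sigma_pair + sigma_single > sigma_limit_pb

# A Partner that pair production alone cannot exclude may still be
# excluded once a sizeable single-production coupling is switched on.
print(excluded(1200.0, 0.0, 0.05))  # False: pair production only
print(excluded(1200.0, 1.5, 0.05))  # True: single production added
```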

    Using grounded theory to understand software process improvement: A study of Irish software product companies

    Software Process Improvement (SPI) aims to understand the software process as it is used within an organisation and thus drive the implementation of changes to that process to achieve specific goals, such as increasing development speed, achieving higher product quality, or reducing costs. Accordingly, SPI researchers must be equipped with the methodologies and tools that enable them to look within organisations and understand the state of practice with respect to software process and process improvement initiatives, in addition to investigating the relevant literature. Having examined a number of potentially suitable research methodologies, we chose Grounded Theory as a suitable approach for determining what happens in actual practice in relation to software process and SPI, using the indigenous Irish software product industry as a test-bed. The outcome of this study is a theory, grounded in the field data, that explains when and why SPI is undertaken by the software industry. The objective of this paper is to describe both the selection and usage of Grounded Theory in this study and to evaluate its effectiveness as a research methodology for software process researchers. Accordingly, this paper focuses on the selection and usage of Grounded Theory, rather than on the results of the SPI study itself.

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, so that every new need required a new protocol built from scratch. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture in which expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and a survey of its applications to networking.
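    As a taste of what the surveyed verification techniques check, here is a minimal Python sketch of one of the simplest network properties: that a set of per-destination forwarding entries delivers packets without loops or black holes. This toy is ours for illustration, not an implementation of any tool covered in the survey; real verifiers check such properties symbolically over all packet headers rather than by walking one concrete path.

```python
def check_delivery(fwd, src, dst):
    """Walk the forwarding graph for destination dst starting at src.

    fwd maps (node, destination) -> next hop. Returns 'delivered' if
    the walk reaches dst, 'loop' if a node repeats, and 'black hole'
    if some node has no matching forwarding entry.
    """
    seen = set()
    node = src
    while node != dst:
        if node in seen:
            return "loop"
        seen.add(node)
        node = fwd.get((node, dst))
        if node is None:
            return "black hole"
    return "delivered"

# Three-switch example: s1 -> s2 -> s3 delivers. A misconfigured entry
# sending s3's traffic back to s1 would make the same query report a loop.
fwd = {("s1", "s3"): "s2", ("s2", "s3"): "s3"}
print(check_delivery(fwd, "s1", "s3"))  # delivered
```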