
    Focus on sharing individual patient data distracts from other ways of improving trial transparency

    The International Committee of Medical Journal Editors (ICMJE) recently reiterated its commitment to improving trial transparency by sharing individual patient data from randomised trials.1 2 But, although sharing individual patient data contributes to transparency, it is not sufficient by itself. Trial transparency requires a data sharing package, which begins with trial registration and contains other elements such as protocols, summary results, and other trial materials. Valuable as sharing individual patient data can be,3 discussion about it has hijacked the broader conversation about data sharing and trial transparency.4-6 For example, we identified 76 articles published in the six leading general medical journals that had “data” and “sharing” in their title and were about clinical trials. In 64 (84%) articles, the content was focused on individual patient data and did not mention any of the other components of trial transparency (see appendix on bmj.com). Much of the discussion has focused on the complexities and practical problems associated with sharing individual patient data and on the processes and systems needed for responsible data sharing.6-9 However, many of the data sharing activities that are needed for trial transparency are not complex. We believe that trying to solve the complex issues around availability of individual patient data should not eclipse or distract from a more pressing problem: the unavailability of even summary data and protocols from all controlled trials. Current estimates are that around 85% of research is avoidably “wasted” because of design flaws, poor conduct, non-publication, and poor reporting.10 Focusing efforts and attention on making individual patient data accessible might paradoxically exacerbate this waste in research. We argue that simpler and more cost-efficient activities should be prioritised.

    A Taxonomy of Data Grids for Distributed Data Sharing, Management and Processing

    Data Grids have been adopted as the platform for scientific communities that need to share, access, transport, process, and manage large data collections distributed worldwide. They combine high-end computing technologies with high-performance networking and wide-area storage management techniques. In this paper, we discuss the key concepts behind Data Grids and compare them with other data sharing and distribution paradigms such as content delivery networks, peer-to-peer networks, and distributed databases. We then provide comprehensive taxonomies that cover various aspects of architecture, data transportation, data replication, and resource allocation and scheduling. Finally, we map the proposed taxonomy to various Data Grid systems, not only to validate the taxonomy but also to identify areas for future exploration. Through this taxonomy, we aim to categorise existing systems to better understand their goals and their methodology, which helps evaluate their applicability to similar problems. The taxonomy also provides a "gap analysis" of this area, through which researchers can identify new issues for investigation. Finally, we hope that the proposed taxonomy and mapping give new practitioners an easy way to understand this complex area of research. Comment: 46 pages, 16 figures, Technical Report

    Using the TIDieR checklist to standardize the description of a functional strength training intervention for the upper limb after stroke

    Background and Purpose: Published reports of interventions in randomized controlled trials are often poorly described. The Template for Intervention Description and Replication (TIDieR) checklist was recently developed to improve the reporting of interventions. The aim of this article is to describe a therapy intervention used in the stroke rehabilitation trial "Clinical Efficacy of Functional Strength Training for Upper Limb Motor Recovery Early After Stroke: Neural Correlates and Prognostic Indicators" (FAST-INdICATE), using TIDieR. Methods: The functional strength training intervention used in the FAST-INdICATE trial was described using TIDieR so that the intervention can be replicated both by clinicians, who may implement it in practice, and by researchers, who may deliver it in future research. The usefulness of TIDieR in the context of a complex stroke rehabilitation intervention is then discussed. Results and Discussion: The TIDieR checklist provided a systematic way of describing a treatment intervention used in a clinical trial of stroke rehabilitation. Clarification is needed on several aspects of the TIDieR checklist, including in which section to report the development of the intervention in pilot studies and the results of feasibility studies; the overlap between training and procedures for assessing fidelity; and where to publish supplementary material so that it remains in the public domain. Summary: TIDieR is a systematic way of reporting the intervention delivered in a clinical trial of a complex intervention such as stroke rehabilitation. This approach may also have value for standardizing interventions in clinical practice. A video abstract is available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A131).

    Connected by 25: Effective Policy Solutions for Vulnerable Youth Issue Brief

    In an effort to strengthen philanthropic investments among its membership, the Youth Transition Funders Group (YTFG) asked a group of policy experts to provide recommendations on how foundations can work to encourage effective policy solutions on issues affecting youth in transition to adulthood. The primary challenge was to think beyond the systemic silos that so deeply shape the services and expectations of youth and move towards an overall framework that could produce improved outcomes. YTFG's work is based on the Connected by 25 framework, in which all youth reach the following outcomes by age 25:
    - Educational achievement in preparation for career and community participation, including a high school diploma, post-secondary degree, and/or vocational certificate training
    - Gainful employment and/or access to career training to achieve life-long economic success
    - Connections to a positive support system -- namely, guidance from family members and caring adults, as well as access to health, counseling, and mental health services
    - The ability to be a responsible and nurturing parent
    - The capacity to be actively engaged in the civic life of one's community
    This issue brief offers a summary of those recommendations, focusing on four primary transition points that often threaten the ability of youth to be connected by age 25 to the institutions and support systems that help them succeed throughout life.

    Datacenter Traffic Control: Understanding Techniques and Trade-offs

    Datacenters provide cost-effective and flexible access to the scalable compute and storage resources necessary for today's cloud computing needs. A typical datacenter is made up of thousands of servers connected with a large network and usually managed by one operator. To provide quality access to the variety of applications and services hosted on datacenters and to maximize performance, it is necessary to use datacenter networks effectively and efficiently. Datacenter traffic is often a mix of several classes with different priorities and requirements, including user-generated interactive traffic, traffic with deadlines, and long-running traffic. To this end, custom transport protocols and traffic management techniques have been developed to improve datacenter network performance. In this tutorial paper, we review the general architecture of datacenter networks, various topologies proposed for them, their traffic properties, general traffic control challenges in datacenters, and general traffic control objectives. The purpose of this paper is to bring out the important characteristics of traffic control in datacenters, not to survey all existing solutions (which is virtually impossible given the massive body of existing research). We hope to provide readers with a wide range of options and factors to consider when evaluating traffic control mechanisms. We discuss various characteristics of datacenter traffic control, including management schemes, transmission control, traffic shaping, prioritization, load balancing, multipathing, and traffic scheduling. Next, we point to several open challenges as well as new and interesting networking paradigms. At the end of this paper, we briefly review inter-datacenter networks, which connect geographically dispersed datacenters, have been receiving increasing attention recently, and pose interesting and novel research problems. Comment: Accepted for publication in IEEE Communications Surveys and Tutorials
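    Traffic shaping, one of the mechanisms the survey covers, is commonly implemented with a token bucket, which caps a flow's average rate while permitting short bursts. The sketch below is illustrative only (it is not code from the paper; the class name, units, and parameters are assumptions):

```python
# Hypothetical token-bucket traffic shaper (illustration only, not from the
# survey): tokens accumulate at `rate` per second up to a `burst` ceiling,
# and a packet may be sent only if enough tokens are available to cover it.
class TokenBucket:
    def __init__(self, rate: float, burst: float):
        self.rate = rate      # average rate: tokens (e.g. bytes) added per second
        self.burst = burst    # bucket capacity: maximum burst size
        self.tokens = burst   # start full so an initial burst is allowed
        self.last = 0.0       # timestamp of the last refill

    def allow(self, size: float, now: float) -> bool:
        # Refill tokens for the elapsed time, capped at the bucket capacity.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        # Spend tokens if the packet fits; otherwise it must wait or be dropped.
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False
```

    Sustained throughput is bounded by `rate`, while bursts up to `burst` bytes pass immediately; this is the classic trade-off traffic shaping makes between smoothing and latency.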

    DCCast: Efficient Point to Multipoint Transfers Across Datacenters

    Using multiple datacenters allows for higher availability, load balancing, and reduced latency for customers of cloud services. To distribute multiple copies of data, cloud providers depend on inter-datacenter WANs that ought to be used efficiently considering their limited capacity and ever-increasing data demands. In this paper, we focus on applications that transfer objects from one datacenter to several datacenters over dedicated inter-datacenter networks. We present DCCast, a centralized Point to Multi-Point (P2MP) algorithm that uses forwarding trees to efficiently deliver an object from a source datacenter to the required destination datacenters. With low computational overhead, DCCast selects forwarding trees that minimize bandwidth usage and balance load across all links. In simulation experiments on Google's GScale network, we show that DCCast can reduce total bandwidth usage and tail Transfer Completion Times (TCT) by up to 50% compared to delivering the same objects via independent point-to-point (P2P) transfers. Comment: 9th USENIX Workshop on Hot Topics in Cloud Computing, https://www.usenix.org/conference/hotcloud17/program/presentation/noormohammadpou
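    The idea behind load-aware forwarding trees can be sketched as follows: weight each link by its current load, take a least-loaded path from the source to each destination, and union the paths into one tree whose edges are then charged with the transfer volume. This is a hypothetical simplification of the approach the abstract describes, not the authors' algorithm; the graph format and function names are assumptions:

```python
# Hypothetical sketch of load-aware P2MP tree selection (not DCCast's actual
# algorithm): links are undirected edges keyed by frozenset, and link weight
# is current load + 1 so that idle links are preferred but still cost > 0.
import heapq
from collections import defaultdict

def shortest_path(graph, load, src, dst):
    """Dijkstra over link weights = current load + 1; returns the edge set
    of a least-loaded path from src to dst."""
    dist, prev = {src: 0}, {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v in graph[u]:
            nd = d + load[frozenset((u, v))] + 1
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:  # walk predecessors back to the source
        path.append(frozenset((node, prev[node])))
        node = prev[node]
    return path

def select_tree(graph, load, src, dests, volume):
    """Union per-destination least-loaded paths into one forwarding tree,
    then charge the transfer volume to every tree edge."""
    tree = set()
    for d in dests:
        tree.update(shortest_path(graph, load, src, d))
    for e in tree:
        load[e] += volume
    return tree
```

    Because selected edges accumulate load, a second transfer between the same endpoints is steered onto different links, which is the load-balancing effect the abstract attributes to DCCast's tree selection.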

    The Pretoria Statement on the Future of African Agriculture

    "On December 1–3, 2003, the New Partnership for Africa's Development (NEPAD), Capacity Building International, Germany (InWent), the Technical Center for Agricultural and Rural Cooperation (CTA), and the International Food Policy Research Institute (IFPRI) assembled a group of experienced agricultural, trade, and finance specialists from government and the private sector and from across Africa to help review, summarize, and distill conclusions from the case studies of African successes. Together, these 70 specialists produced a shared statement of findings identifying priorities for future policy action necessary to trigger sustained agricultural growth in Africa. That shared statement, the Pretoria Statement, provides the best available summary of key lessons learned on how to scale up agricultural successes for the future" (from text). In this brief, we learn that a series of successful episodes in African agriculture suggests two fundamental prerequisites for sustained agricultural growth (good governance, and sustained funding for agricultural research and extension) as well as a number of promising specific opportunities: soil and water conservation; replication of proven commodity-specific breeding and processing successes; marketing and information systems; vertical supply chains; and regional cooperation in trade and agricultural technology.