
    Disaggregating non-volatile memory for throughput-oriented genomics workloads

    Massive exploitation of next-generation sequencing technologies requires dealing with both huge amounts of data and complex bioinformatics pipelines. Computing architectures have evolved to address these problems, enabling approaches that were infeasible years ago: accelerators and Non-Volatile Memories (NVM) are becoming widely used to enhance the most demanding workloads. However, bioinformatics workloads are usually part of bigger pipelines with different and dynamic needs in terms of resources. The introduction of Software Defined Infrastructures (SDI) for data centers provides the foundations for dramatically increasing the efficiency of infrastructure management. SDI enables new ways to structure hardware resources through disaggregation, and provides new hardware composability and sharing mechanisms to deploy workloads in more flexible ways. In this paper we study a state-of-the-art genomics application, SMUFIN, aiming to address the challenges of future HPC facilities. This work is partially supported by the European Research Council (ERC) under the EU Horizon 2020 programme (GA 639595), the Spanish Ministry of Economy, Industry and Competitiveness (TIN2015-65316-P) and the Generalitat de Catalunya (2014-SGR-1051). Peer Reviewed. Postprint (author's final draft).

    Harnessing Collaborative Technologies: Helping Funders Work Together Better

    This report was produced through a joint research project of the Monitor Institute and the Foundation Center. The research included an extensive literature review on collaboration in philanthropy, detailed analysis of trends from a recent Foundation Center survey of the largest U.S. foundations, interviews with 37 leading philanthropy professionals and technology experts, and a review of over 170 online tools. The report is a story about how new tools are changing the way funders collaborate. It includes three primary sections: an introduction to emerging technologies and the changing context for philanthropic collaboration; an overview of collaborative needs and tools; and recommendations for improving the collaborative technology landscape. A "Key Findings" executive summary serves as a companion piece to this full report.

    Datacenter Traffic Control: Understanding Techniques and Trade-offs

    Datacenters provide cost-effective and flexible access to scalable compute and storage resources necessary for today's cloud computing needs. A typical datacenter is made up of thousands of servers connected with a large network and usually managed by one operator. To provide quality access to the variety of applications and services hosted on datacenters and maximize performance, it is necessary to use datacenter networks effectively and efficiently. Datacenter traffic is often a mix of several classes with different priorities and requirements. This includes user-generated interactive traffic, traffic with deadlines, and long-running traffic. To this end, custom transport protocols and traffic management techniques have been developed to improve datacenter network performance. In this tutorial paper, we review the general architecture of datacenter networks, various topologies proposed for them, their traffic properties, general traffic control challenges in datacenters and general traffic control objectives. The purpose of this paper is to bring out the important characteristics of traffic control in datacenters, not to survey all existing solutions (which is virtually impossible given the massive body of existing research). We hope to provide readers with a wide range of options and factors to weigh when considering a variety of traffic control mechanisms. We discuss various characteristics of datacenter traffic control including management schemes, transmission control, traffic shaping, prioritization, load balancing, multipathing, and traffic scheduling. Next, we point to several open challenges as well as new and interesting networking paradigms. At the end of this paper, we briefly review inter-datacenter networks that connect geographically dispersed datacenters, which have been receiving increasing attention recently and pose interesting and novel research problems. Comment: Accepted for publication in IEEE Communications Surveys and Tutorials.
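    As a minimal, illustrative sketch of the prioritization and scheduling ideas surveyed above (not a mechanism from the paper), the following Python models strict-priority service across the traffic classes the abstract names; the class names and priority values are assumptions for illustration.

```python
import heapq
from dataclasses import dataclass, field
from itertools import count

# Hypothetical priority map for the traffic classes named in the abstract;
# lower value = served first. Values are illustrative assumptions.
PRIORITY = {"interactive": 0, "deadline": 1, "long_running": 2}

@dataclass(order=True)
class Packet:
    priority: int
    seq: int                      # FIFO tie-break within a class
    flow: str = field(compare=False)

class StrictPriorityScheduler:
    """Toy scheduler: always serve the highest-priority queued packet."""
    def __init__(self):
        self._heap = []
        self._seq = count()

    def enqueue(self, flow: str, traffic_class: str) -> None:
        heapq.heappush(self._heap, Packet(PRIORITY[traffic_class], next(self._seq), flow))

    def dequeue(self):
        return heapq.heappop(self._heap).flow if self._heap else None

sched = StrictPriorityScheduler()
sched.enqueue("bulk-backup", "long_running")
sched.enqueue("user-rpc", "interactive")
sched.enqueue("batch-analytics", "deadline")
print(sched.dequeue())  # user-rpc: interactive traffic drains first
```

    A real datacenter would combine this with shaping and load balancing so that low-priority long-running flows are not starved; strict priority is shown only because it is the simplest of the techniques listed.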

    Social Media for Cities, Counties and Communities

    Social media (i.e., Twitter, Facebook, Flickr, YouTube) and other tools and services with user-generated content have made a staggering amount of information (and misinformation) available. Some government officials seek to leverage these resources to improve services and communication with citizens, especially during crises and emergencies. Yet the sheer volume of social data streams generates substantial noise that must be filtered. Potential exists to rapidly identify issues of concern for emergency management by detecting meaningful patterns or trends in the stream of messages and information flow. Similarly, monitoring these patterns and themes over time could provide officials with insights into the perceptions and mood of the community that cannot be collected through traditional methods (e.g., phone or mail surveys) due to their substantial costs, especially in light of shrinking government budgets at all levels. We conducted a pilot study in 2010 with government officials in Arlington, Virginia (and, to a lesser extent, representatives of groups from Alexandria and Fairfax, Virginia) with a view to contributing to a general understanding of the use of social media by government officials as well as community organizations, businesses and the public. We were especially interested in gaining greater insight into social media use in crisis situations (whether severe or fairly routine crises, such as traffic or weather disruptions).
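    As an illustrative sketch of the trend detection described above (not the study's method, which the abstract does not specify), the following compares keyword counts across consecutive windows of messages and flags sudden spikes; the window size and thresholds are arbitrary assumptions.

```python
from collections import Counter

# Illustrative only: flag words whose per-window count jumps sharply
# relative to the previous window. All parameters are assumptions.
def detect_spikes(messages, window=200, ratio=3.0, min_count=10):
    """Yield (word, count) whenever a word spikes versus the prior window."""
    previous, current = Counter(), Counter()
    for i, text in enumerate(messages, start=1):
        current.update(set(text.lower().split()))  # count each word once per message
        if i % window == 0:
            for word, n in current.items():
                # treating an unseen word's baseline as 1 keeps one-off noise from firing
                if n >= min_count and n > ratio * max(previous[word], 1):
                    yield word, n
            previous, current = current, Counter()

# Example: a surge of "flooding" mentions during a storm would be flagged
# in the first window where its count triples versus the window before.
```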

    Broadcasting to the masses or building communities: Polish political parties online communication during the 2011 election

    The professionalisation of political communication is an evolutionary process (Lilleker & Negrine, 2002), a process that adapts to trends in communication in order to better engage and persuade the public. One of the most dramatic developments in communication has been the move towards social communication via the Internet. It is argued to affect every area of public communication, from commercial advertising and public relations to education (Macnamara, 2010). It is no longer sufficient to have an online presence; we are now in an age of i-branding, with the 'i' standing for interactive. Yet trends in online political electoral campaigning over recent years indicate a shallow adoption of Web 2.0 tools, features and platforms; limited interactivity; and managed co-production. The Internet is now embedded as a campaigning tool; however, the technologies are largely adapted to the norms of political communication, rather than impacting upon internal organisational structures, party relationships to members and supporters, or the content and style of their communication. We examine these themes in more detail, developing them through a focus on the targeting and networking strategies of political parties, in the context of the Polish parliamentary election of 2011. Through a sophisticated content analysis and coding scheme, our paper examines the extent to which parties use features designed to inform, engage, mobilise or allow interaction, which audiences they seek to communicate with, and how these fit their communication strategies. Comparing these findings with maps built from webcrawler analysis, we build a picture of the strategies of the parties and the extent to which these link to short- and long-term political goals. This paper firstly develops our rationale for studying party and candidate use of the Internet during elections within the Polish context. Secondly, we develop a conceptual framework which contrasts the politics-as-usual thesis (Margolis & Resnick, 2000) with arguments surrounding the social shaping of technologies (Lievrouw, 2006), the impact on organisational adoption of communication technologies, and post-Obama trends in Internet usage (Lilleker & Jackson, 2011), and posit that, despite the threats from an interactive strategy (Stromer-Galley, 2000), one would be expected within the context of a networked society (van Dijk, 2006). Following an overview of our methodology and innovative analysis strategy, we present our data, which focuses on three key elements. Firstly, we focus on the extent to which party and candidate websites inform, engage, mobilise or permit interaction (Lilleker et al, 2011). Secondly, we assess the extent to which websites attract different visitor groups (Lilleker & Jackson, 2011) and build communities (Lilleker & Koc-Michalska, 2012). Thirdly, we assess the reach strategies of the websites using webcrawler technology which analyses the use of hyperlinks, asking whether parties lock themselves within cyberghettos (Sunstein, 2007) or attempt to harness the power of the network (Benkler, 2006).
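    As a hedged illustration of the webcrawler-based hyperlink analysis mentioned above (the paper's actual crawler and tooling are not described in the abstract), this standard-library sketch extracts a page's outbound links, the raw material for mapping whether party sites link beyond their own camp.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

# Illustrative only: gather hyperlinks from one page. A real crawl would
# add fetching, robots.txt handling, and rate limiting, omitted here.
class LinkExtractor(HTMLParser):
    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(urljoin(self.base_url, href))

def external_links(base_url: str, html: str) -> set:
    """Return links pointing outside the page's own domain."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    own = urlparse(base_url).netloc
    return {u for u in parser.links if urlparse(u).netloc not in ("", own)}

# Repeating this per party site and recording site -> site edges yields
# the link network used to ask whether sites cluster into closed camps.
```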