
    TrusNet: Peer-to-Peer Cryptographic Authentication

    Originally, the Internet was meant as a general-purpose communication protocol, transferring primarily text documents between interested parties. Over time, documents expanded to include pictures, videos, and even web pages. Increasingly, the Internet is being used to transfer a new kind of data for which it was never designed. In most ways, this new data type fits naturally into the Internet, taking advantage of the near-limitless reach of the protocol. Hardware protocols, unlike previous data types, present a unique set of security problems. Much like financial data, hardware protocols extended across the Internet must be protected with authentication. Currently, systems that do authenticate do so through a central server, using an authentication model similar to that of the HTTPS protocol. This hierarchical model is often at odds with the needs of hardware protocols, particularly in ad-hoc networks where peer-to-peer communication is prioritized over a hierarchy. Our project implements a peer-to-peer cryptographic authentication protocol intended to protect hardware protocols extending over the Internet. The TrusNet project uses public-key cryptography to authenticate nodes on a distributed network, with each node locally managing a record of the public keys of the nodes it has encountered. These keys are used to secure data transmission between nodes and to authenticate node identities. TrusNet is designed to run over multiple types of network interface, but currently only has explicit hooks for Internet Protocol connections. As of June 2016, TrusNet has successfully achieved a basic authentication and communication protocol on Windows 7, OS X, Linux 14, and the Intel Edison. TrusNet uses RC4 as its stream cipher and RSA as its public-key algorithm, although both are easily configurable.
Along with the library, TrusNet also includes a unit-testing suite, a simple UI application designed to visualize the basics of the system, and a build with hooks into the I/O pins of the Intel Edison, allowing for a basic demonstration of the system.
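The abstract describes each node keeping a local record of the public keys of peers it has encountered. The TrusNet code itself is not shown here, so the following is only a minimal stdlib sketch of that idea as a trust-on-first-use key store; the class and method names are hypothetical, and a real deployment would verify signatures with the actual RSA keys rather than just comparing key bytes.

```python
import hashlib

class PeerKeyStore:
    """Hypothetical sketch: trust-on-first-use record of peer public keys."""

    def __init__(self):
        self._keys = {}  # node_id -> public key bytes

    def observe(self, node_id, public_key):
        """Record a key on first contact; check it on later contacts.

        Returns True if the peer matches our record, False if the
        presented key differs from the stored one (a possible
        impersonation attempt).
        """
        if node_id not in self._keys:
            self._keys[node_id] = public_key  # first encounter: trust and store
            return True
        return self._keys[node_id] == public_key

    def fingerprint(self, node_id):
        """Short SHA-256 fingerprint, usable for out-of-band comparison."""
        return hashlib.sha256(self._keys[node_id]).hexdigest()[:16]
```

In this model a node that later presents a different key is rejected locally, with no central server consulted — which is the peer-to-peer property the abstract contrasts with the HTTPS-style hierarchy.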

    Scientific Computing Meets Big Data Technology: An Astronomy Use Case

    Scientific analyses commonly compose multiple single-process programs into a dataflow. An end-to-end dataflow of single-process programs is known as a many-task application. Typically, tools from the HPC software stack are used to parallelize these analyses. In this work, we investigate an alternate approach that uses Apache Spark -- a modern big data platform -- to parallelize many-task applications. We present Kira, a flexible and distributed astronomy image processing toolkit using Apache Spark. We then use the Kira toolkit to implement a Source Extractor application for astronomy images, called Kira SE. With Kira SE as the use case, we study the programming flexibility, dataflow richness, scheduling capacity and performance of Apache Spark running on the EC2 cloud. By exploiting data locality, Kira SE achieves a 2.5x speedup over an equivalent C program when analyzing a 1TB dataset using 512 cores on the Amazon EC2 cloud. Furthermore, we show that by leveraging software originally designed for big data infrastructure, Kira SE achieves competitive performance to the C implementation running on the NERSC Edison supercomputer. Our experience with Kira indicates that emerging Big Data platforms such as Apache Spark are a performant alternative for many-task scientific applications.
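Kira itself drives this pattern through Apache Spark, but the many-task shape the abstract describes — independent single-process analyses over data partitions, combined by a reduce step — can be illustrated with a stdlib-only sketch. Everything below is hypothetical: `extract_sources` is a stand-in for a real source extractor, not Kira SE's code.

```python
from concurrent.futures import ThreadPoolExecutor

def extract_sources(tile):
    # Stand-in for one single-process analysis task, e.g. a source
    # extractor run on one image tile: count "bright" pixels here.
    return sum(1 for px in tile if px > 200)

def run_many_task(tiles, workers=4):
    # Each tile is an independent task; a driver schedules them in
    # parallel, then a reduce step combines the per-task results.
    # Spark additionally handles data locality and fault tolerance,
    # which this sketch omits.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(extract_sources, tiles))
```

The point of the abstract is that a big-data scheduler can place each task near its data; in Spark the equivalent would be a `map` over an RDD of image tiles followed by an aggregation.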

    DPP-PMRF: Rethinking Optimization for a Probabilistic Graphical Model Using Data-Parallel Primitives

    We present a new parallel algorithm for probabilistic graphical model optimization. The algorithm relies on data-parallel primitives (DPPs), which provide portable performance across hardware architectures. We evaluate results on CPUs and GPUs for an image segmentation problem. Compared to a serial baseline, we observe runtime speedups of up to 13X (CPU) and 44X (GPU). We also compare our performance to a reference OpenMP-based algorithm, and find speedups of up to 7X (CPU).

    Comment: LDAV 2018, October 2018
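The paper's algorithm is built from data-parallel primitives; the specific composition used in DPP-PMRF is not reproduced here. As a hedged illustration of what "building from DPPs" means, the sketch below composes three classic primitives (map, scan, scatter) into stream compaction, a common DPP building block. It is written serially in Python; on a GPU each primitive would run over all elements in parallel.

```python
from itertools import accumulate

def dpp_map(f, xs):
    # Apply f independently to every element (embarrassingly parallel).
    return [f(x) for x in xs]

def dpp_scan(xs):
    # Inclusive prefix sum; parallelizable in O(log n) depth on a GPU.
    return list(accumulate(xs))

def dpp_compact(xs, keep):
    # Stream compaction composed from map + scan + scatter:
    flags = dpp_map(lambda x: 1 if keep(x) else 0, xs)
    offsets = dpp_scan(flags)
    out = [None] * (offsets[-1] if offsets else 0)
    for x, flag, off in zip(xs, flags, offsets):
        if flag:
            out[off - 1] = x  # scatter each kept element to its slot
    return out
```

The portability claim in the abstract comes from exactly this structure: only the primitives need hardware-specific implementations, while the algorithm expressed in terms of them stays the same on CPU and GPU.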

    Pace Energy & Climate Center 2016 Annual Report

    The Center staff and many allies are deeply involved in the business of electric utility transformation. We live and work in a remarkable time. Decades of steady, thoughtful leadership on clean energy issues are now bearing fruit. Clean energy is not just the right thing to do; it is increasingly recognized as the right choice economically, technically, and for all members of society. Our work, especially in 2016, has been about making sure that we seize the moment and secure the benefits of clean energy use for all communities in New York, the Northeast U.S., across the country, and throughout the world. Never has it been more important that we succeed in our work. The challenges of climate change, the changing path of policy, and the moral imperative of building a clean energy foundation for future generations drive us every day. While we don't work actively in Washington, D.C., changes there threaten our work. The Center focuses on waging a strong offense at the state and community level, and on effectively communicating the benefits of clean energy development and policy. In 2016, we answered the call for clear-eyed policy leadership in the many proceedings under way in the New York Public Service Commission's ("NYPSC") Reforming the Energy Vision ("REV") initiative. Our work multiplied as the Commission transitioned from vision to implementation and execution, and so did our impact. See the Appendix for the active NYPSC proceedings in which the Center is engaged! The Center continued its regional leadership as a champion of super-efficient combined heat and power, strong solar energy market policy, and interstate cooperation to reduce greenhouse gas emissions. We expand the reach of our ideas and support through formal regulatory interventions, thought leadership, and good old-fashioned research and writing.
The Pace Energy and Climate Center continues to operate as a small, agile, interdisciplinary team of talented and committed individuals, and continues to benefit from the support of the best law student interns anywhere. Our network of collaborators at other organizations has grown over the year, as has our reputation in the media.

    FogGIS: Fog Computing for Geospatial Big Data Analytics

    Cloud Geographic Information Systems (GIS) have emerged as a tool for the analysis, processing, and transmission of geospatial data. Fog computing is a paradigm in which fog devices help to increase throughput and reduce latency at the edge of the network, close to the client. This paper develops a fog-based framework named FogGIS for mining analytics from geospatial data. We built a prototype using the Intel Edison, an embedded microprocessor, and validated FogGIS through preliminary analysis, including compression and overlay analysis. Results showed that fog computing holds great promise for the analysis of geospatial data. We used several open-source compression techniques to reduce transmission to the cloud.

    Comment: 6 pages, 4 figures, 1 table, 3rd IEEE Uttar Pradesh Section International Conference on Electrical, Computer and Electronics (09-11 December, 2016), Indian Institute of Technology (Banaras Hindu University), Varanasi, India
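The abstract says FogGIS uses open-source compression to shrink what the fog device sends to the cloud, without naming the exact codecs used on the Edison. As a minimal sketch of that step under stdlib-only assumptions, the function below (a hypothetical name, not FogGIS's API) deflates a GeoJSON payload with zlib and reports the compression ratio; the cloud side would inflate it with `zlib.decompress()`.

```python
import json
import zlib

def compress_geojson(feature_collection, level=6):
    """Compress a GeoJSON dict before uploading it from a fog device.

    Returns (compressed_bytes, compression_ratio). Real GeoJSON is
    highly repetitive ("type", "coordinates", "properties", ...), so
    dictionary-based codecs like DEFLATE tend to do well on it.
    """
    raw = json.dumps(feature_collection).encode("utf-8")
    packed = zlib.compress(raw, level)
    return packed, len(raw) / len(packed)
```

A fog device would transmit `packed` upstream; trading a little edge CPU time for a smaller upload is exactly the throughput/latency bargain the paper evaluates.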

    Blockchain-based Decentralized Distribution Management in E-Journals

    The application of blockchain to E-Journal distribution to journalists aims to keep paper management properly controlled and to prevent misuse. The security of distribution and paper management in an open journal system is currently considered very weak, because a journal in an open journal system can easily be duplicated and then passed on to unauthorized parties. This research implements blockchain technology in an E-Journal within an Open Journal System. Implementing blockchain offers three benefits: (1) the distribution of E-Journals in the Open Journal System is better targeted and free of errors; (2) the reputation of the Open Journal System improves, along with trust in it; and (3) paper management in the open journal system follows procedure, so that the distribution of soft copies and hard copies of the journal is protected from hacker threats, with the blockchain used to guarantee its security.
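The abstract does not show the system's implementation, so the following is only a stdlib sketch of the core idea it relies on: a hash-chained log of distribution records, where tampering with any earlier record invalidates every later hash. Function and field names are hypothetical.

```python
import hashlib
import json

def add_record(chain, record):
    """Append a distribution record (e.g. which paper went to whom),
    linked to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    body["hash"] = digest
    chain.append(body)
    return chain

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True
```

A single node's log like this only gives tamper evidence; the decentralization the paper claims would come from replicating the chain across participants so no one party can rewrite it alone.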

    Designing plans for organizational development, lessons from three large-scale SME-initiatives

    This paper reflects upon the way we balanced design and development in three specific case projects in order to contribute to creating and accumulating knowledge that is relevant to both practitioners and academics. More specifically, it shows how learning within one project was used to improve the design of the next. The three projects were set up in the context of government-sponsored social science programs and aimed at improving innovation in SMEs. As the paper shows, these three projects demonstrate the contribution of seeing design and development as two sides of the same coin.

    Keywords: design-oriented research, organizational development, large-scale projects

    Benchmarking Utility Clean Energy Deployment: 2016

    Benchmarking Utility Clean Energy Deployment: 2016 provides a window into how the global transition toward clean energy is playing out in the U.S. electric power sector. Specifically, it reveals the extent to which 30 of the largest U.S. investor-owned electric utility holding companies are increasingly deploying clean energy resources to meet customer needs. Benchmarking these companies provides an opportunity for transparent reporting and analysis of important industry trends. It fills a knowledge gap by offering utilities, regulators, investors, policymakers and other stakeholders consistent and comparable information on which to base their decisions. And it provides perspective on which utilities are best positioned in a shifting policy landscape, including likely implementation of the U.S. EPA's Clean Power Plan aimed at reducing carbon pollution from power plants.

    Open Innovation Platform Design: The Case of Social Product Development

    Open Innovation as a new product development strategy has been used by businesses for decades. However, Social Product Development (SPD) has been recently introduced and popularized as an open innovation business model. The SPD model formalizes and monetizes the collaboration between an organization and creative communities through introducing new products and services. Either managed by intermediaries or directly by innovation sponsors, SPD platforms enable and support online innovative communities to ideate, collaborate, and network. Despite their abilities, many of these platforms do not provide fulfilling user experiences. To bridge this gap, the present study focuses on how SPD platform developers can offer more robust user interfaces (UI) and engaging user experiences (UX) alongside the six key SPD processes—social engagement, ideation, experiential communication, social validation, co-development, and co-commercialization. Building on experience and affordances theories, we offer a design framework that can more broadly inform the design and evaluation of open innovation platforms.