
    A Cost Benefit Model for Systematic Software Reuse

    Information systems development is typically acknowledged as an expensive and lengthy process, often producing code that is of uneven quality and difficult to maintain. Software reuse has been advocated as a means of revolutionizing this process. The claimed benefits from software reuse are reduction in development cost and time, improvement in software quality, increase in programmer productivity, and improvement in maintainability. Software reuse does incur undeniable costs of creating, populating, and maintaining a library of reusable components. There is anecdotal evidence to suggest that some organizations benefit from reuse. However, many software developers practicing reuse claim these benefits without formal demonstration thereof. There is little research to suggest when the benefits are expected and to what extent they will be realized. For example, does a larger library of reusable components lead to increased savings? What is the impact of search effectiveness when evaluating reuse? This research seeks to address these questions. It represents the first step in a series wherein the effects of software reuse on overall development effort and costs are modeled with a view to understanding when it is most effective.
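The trade-off this abstract describes can be illustrated with a toy cost-benefit sketch. The parameters below (per-component build cost, reuse cost, library hit rate, amortized library overhead) are hypothetical and do not reproduce the paper's actual model; they only show how reuse savings compete with the cost of creating, populating, and maintaining a component library.

```python
# Toy cost-benefit sketch of systematic software reuse.
# All parameter values are hypothetical illustrations, not the paper's model.

def net_reuse_benefit(n_components: int,
                      cost_new: float = 10.0,        # effort to build a component from scratch
                      cost_reuse: float = 2.0,       # effort to find and adapt a library component
                      hit_rate: float = 0.4,         # fraction of needs satisfied by the library
                      lib_overhead: float = 1.5      # amortized create/populate/maintain cost per hit
                      ) -> float:
    """Net saving for a project needing n_components components."""
    reused = n_components * hit_rate
    gross_savings = reused * (cost_new - cost_reuse)
    library_cost = reused * lib_overhead
    return gross_savings - library_cost

# Higher search effectiveness (hit_rate) raises gross savings, but the
# reuse program only pays off while per-hit savings exceed the overhead.
print(net_reuse_benefit(100))
```

Under these illustrative numbers, 40 of 100 components are reused, saving 8 effort units each against 1.5 units of amortized library overhead per hit.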

    Social Media and the Public Sector

    {Excerpt} Social media is revolutionizing the way we live, learn, work, and play. Elements of the private sector have begun to thrive on opportunities to forge, build, and deepen relationships. Some are transforming their organizational structures and opening their corporate ecosystems in consequence. The public sector is a relative newcomer. It too can drive stakeholder involvement and satisfaction. Global conversations, especially among Generation Y, were born circa 2004. From 1995 until then, the internet had hosted static, one-way websites. These were places to visit passively, retrieve information from, and perhaps post comments about by electronic mail. Sixteen years later, Web 2.0 enables many-to-many connections in numerous domains of interest and practice, powered by the increasing use of blogs, image and video sharing, mashups, podcasts, ratings, Really Simple Syndication, social bookmarking, tweets, widgets, and wikis, among others. Today, people expect the internet to be user-centric.

    Newfire's Higher Education Partner Program

    This paper announces the Newfire Higher Education Partners Program. It describes the program, participating institutions, and possible future development.

    Shopping For Privacy: How Technology in Brick-and-Mortar Retail Stores Poses Privacy Risks for Shoppers

    As technology continues to rapidly advance, the American legal system has failed to protect individual shoppers from the technology implemented in retail stores, which poses significant privacy risks but does not violate the law. In particular, I examine the technologies implemented in many brick-and-mortar stores today, many of which the average everyday shopper has no idea exist. This Article criticizes these technologies, suggesting that many, if not all, of them are questionable in their legality, taking advantage of their status in a legal gray zone. Because the American judicial system cannot adequately protect the individual shopper from these questionable privacy practices, I call upon the Federal Trade Commission, the de facto privacy regulator in the United States, to increase its policing of physical retail stores to protect the shopper from any further harm.

    Software Development in the Post-PC Era: Towards Software Development as a Service

    Software systems affect all aspects of our modern life and are revolutionizing the way we live. Over the years, software development has evolved to meet the needs of new types of applications and to embrace new technological disruptions. Today, we witness the rise of mobility, where the role of the conventional high-specification PC is declining. Some refer to this era as the Post-PC era. This technological shift, powered by a key enabling technology, cloud computing, has opened new opportunities for human advancement (e.g. the Internet of Things). Consequently, the evolving landscape of software systems drives the need for new methods for conceiving them. Such methods need to a) address the challenges and requirements of this era and b) embrace the benefits of new technological breakthroughs. In this paper, we list the characteristics of the Post-PC era from the software development perspective. In addition, we describe three motivating trends of software development processes. Then, we derive a list of requirements for the future software development approach from the characteristics of the Post-PC era and from the motivating trends. Finally, we propose a reference architecture for cloud-based software process enactment as an enabler for Software Development as a Service (SDaaS). The architecture is the first step to address the needs that we have identified.

    Accessing the Microscopic World

    The Exploratorium in San Francisco offers museum visitors the opportunity to use and manipulate state-of-the-art microscopes to visualize an array of living specimens.

    BATCH-GE : batch analysis of next-generation sequencing data for genome editing assessment

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome.
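The efficiency calculation the abstract refers to can be sketched minimally: given reads for one amplicon, count those carrying an indel at the target site and report the edited fraction. BATCH-GE's actual pipeline (alignment, variant detection, per-sample reporting) is far more involved; the function name, inputs, and the crude length-based indel proxy below are illustrative assumptions only.

```python
# Hypothetical sketch of a mutagenesis-efficiency calculation.
# Not BATCH-GE's implementation: a read whose length differs from the
# reference amplicon is taken as a crude proxy for an indel.

def mutagenesis_efficiency(reads: list[str], reference: str) -> float:
    """Fraction of reads presumed edited (length differs from reference)."""
    if not reads:
        return 0.0
    edited = sum(1 for r in reads if len(r) != len(reference))
    return edited / len(reads)

ref = "ACGTACGTACGT"
reads = [
    "ACGTACGTACGT",   # wild type
    "ACGTACGACGT",    # 1-bp deletion
    "ACGTAACGTACGT",  # 1-bp insertion
    "ACGTACGTACGT",   # wild type
]
print(mutagenesis_efficiency(reads, ref))  # 2 of 4 reads edited -> 0.5
```

A real assessment would align each read and inspect the target window, since substitutions leave length unchanged and sequencing errors can mimic indels; this sketch only conveys the per-sample efficiency ratio the tool reports.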

    Cyberscience and the Knowledge-Based Economy, Open Access and Trade Publishing: From Contradiction to Compatibility with Nonexclusive Copyright Licensing

    Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. Open source, open content and open access are also the most tangible result of the shift towards e-Science and digital networking. Yet, widespread misperceptions exist about the impact of this shift on knowledge distribution and scientific publishing. It is argued, on the one hand, that for the academy there principally is no digital dilemma surrounding copyright and there is no contradiction between open science and the knowledge-based economy if profits are made from nonexclusive rights. On the other hand, pressure for the ‘digital doubling’ of research articles in Open Access repositories (the ‘green road’) is misguided and the current model of Open Access publishing (the ‘gold road’) has not much future outside biomedicine. Commercial publishers must understand that business models based on the transfer of copyright have not much future either. Digital technology and its economics favour the severance of distribution from certification. What is required of universities and governments, scholars and publishers, is to clear the way for digital innovations in knowledge distribution and scholarly publishing by enabling the emergence of a competitive market that is based on nonexclusive rights. This requires no change in the law but merely an end to the praxis of copyright transfer and exclusive licensing. The best way forward for research organisations, universities and scientists is the adoption of standard copyright licenses that reserve some rights, namely Attribution and No Derivative Works, but otherwise will allow for the unlimited reproduction, dissemination and re-use of the research article, commercial uses included.