    The evolution of a tailored communications structure: the TOPICS system

    A computer-based human communication system should be designed for people's use, in response to their perceived needs and communications styles; no single system can meet the needs of all groups and individuals. It might seem that a general electronic mail or computerized conferencing system with a standard set of features should be able to meet most communications needs, in much the same way that the telephone system meets the needs of a wide range of users. However, there are many communications structures found in everyday life, ranging from one-to-many news broadcasts, to the many-to-many patterns of town meetings, from the unstructured and informal gatherings at the local pub, to highly structured meetings using Robert's Rules of Order. Each of these is an example of a specific communications structure appropriate in some circumstances and quite inappropriate in others. Within a flexible computerized conferencing system such as the Electronic Information Exchange System (EIES), it is possible to tailor the features of the system to the needs of the users, rather than forcing them to adapt their communications behaviors to the system and its limitations. Current concepts and structures such as electronic mail and conferencing will be supplemented in the next decade by an ever-increasing array of specially designed structures to meet specific needs. Hiltz and Turoff (1978) discuss some of the promises and potentials for how human communication via computer will transform the ways we work, play, learn, and govern ourselves. They also discuss in some detail a variety of communications structures designed for group problem-solving and decision-making. The major question addressed here is how these communications structures evolve. How are they initiated? Where do they lead? What forces govern their evolution? For a structure to be effective, it must meet the needs of the group using it.
However, the perceived needs of a group may (and probably will) change over time. This means that as a group's needs change, either as it learns more about the medium or as its situation changes, the communications structure must EVOLVE to match those needs. Thus, the process of designing and implementing a communications structure becomes an ongoing process. Since it is generally recognized that the microelectronics and telecommunications wave of change we are now beginning to experience (Toffler, 1980) will transform the very fabric of our society, and since the communications procedures and structures we use in this electronic medium are going to evolve very rapidly in the next two decades, an understanding of the process of this evolution seems critical for our successful transition to a post-industrial, communications-era society. A model of the ongoing process of design of these structures is introduced in Johnson-Lenz (1980c). Included there is the concept of GROUPWARE—the integrated, systemic whole made up of a group's processes and procedures, PLUS software to support those processes and procedures. Most specific software structures can be used in a variety of ways, depending on the characteristics of the group and its perceived needs for process. Thus, the system which evolves is not only the computer software but also the processes and procedures followed by the group to achieve its purposes, with or without software support; hence the term GROUPWARE. This paper traces the evolution of a particular communications structure, the TOPICS system, as well as the evolution of several groups using that system, each with its own unique and evolving groupware supported by the TOPICS software, and each contributing its own unique set of needs to the evolution of the software. The TOPICS system, resident on EIES, was designed and developed by the authors, in collaboration with the groups using it.

    Enforcing public data archiving policies in academic publishing: A study of ecology journals

    To improve the quality and efficiency of research, groups within the scientific community seek to exploit the value of data sharing. Funders, institutions, and specialist organizations are developing and implementing strategies to encourage or mandate data sharing within and across disciplines, with varying degrees of success. Academic journals in ecology and evolution have adopted several types of public data archiving policies requiring authors to make data underlying scholarly manuscripts freely available. Yet anecdotes from the community and studies evaluating data availability suggest that these policies have not had the desired effect on either the quantity or the quality of available datasets. We conducted a qualitative, interview-based study with journal editorial staff and other stakeholders in the academic publishing process to examine how journals enforce data archiving policies. We specifically sought to establish who editors and other stakeholders perceive as responsible for ensuring data completeness and quality in the peer review process. Our analysis revealed little consensus with regard to how data archiving policies should be enforced and who should hold authors accountable for dataset submissions. Themes in interviewee responses included hopefulness that reviewers would take the initiative to review datasets and trust in authors to ensure the completeness and quality of their datasets. We highlight problematic aspects of these thematic responses and offer potential starting points for improvement of the public data archiving process.

    Grand Challenges of Traceability: The Next Ten Years

    In 2007, the software and systems traceability community met at the first Natural Bridge symposium on the Grand Challenges of Traceability to establish and address research goals for achieving effective, trustworthy, and ubiquitous traceability. Ten years later, in 2017, the community came together to evaluate a decade of progress towards achieving these goals. These proceedings document some of that progress. They include a series of short position papers, representing current work in the community organized across four process axes of traceability practice. The sessions covered topics ranging from trace strategizing, trace link creation and evolution, and trace link usage, to real-world applications of traceability and traceability datasets and benchmarks. Two breakout groups focused on the importance of creating and sharing traceability datasets within the research community, and discussed challenges related to the adoption of tracing techniques in industrial practice. Members of the research community are engaged in many active, ongoing, and impactful research projects. Our hope is that ten years from now we will be able to look back at a productive decade of research and claim that we have achieved the overarching Grand Challenge of Traceability, which envisions traceability that is always present, built into the engineering process, and that has "effectively disappeared without a trace". We hope that others will see the potential that traceability has for empowering software and systems engineers to develop higher-quality products at increasing levels of complexity and scale, and that they will join the active community of software and systems traceability researchers as we move forward into the next decade of research.

    Web API Fragility: How Robust is Your Web API Client

    Web APIs provide a systematic and extensible approach for application-to-application interaction. A large number of mobile applications make use of web APIs to integrate services into apps. Each web API's evolution pace is determined by its provider, and mobile application developers are forced to keep pace with the API providers' software evolution. In this paper we investigate whether, and how, mobile application developers deal with the added distress of evolving web APIs. In particular, we studied how robust 48 high-profile mobile applications are when dealing with mutated web API responses. Additionally, we interviewed three mobile application developers to better understand their choices and trade-offs regarding web API integration.
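    The response-mutation idea described above can be illustrated with a small sketch (not the study's actual tooling): given a JSON-decoded API response, generate mutants that drop or null each field, mimicking common ways a provider's API can change underneath a client. The payload and field names here are hypothetical.

    ```python
    import copy
    import json

    def mutate_response(payload):
        """Yield mutated copies of a JSON-decoded API response: each
        mutation removes one top-level field or nulls its value,
        simulating two common breaking changes in an evolving web API."""
        for key in payload:
            dropped = copy.deepcopy(payload)
            del dropped[key]          # field removed by the provider
            yield ("remove:" + key, dropped)
            nulled = copy.deepcopy(payload)
            nulled[key] = None        # field kept but emptied
            yield ("null:" + key, nulled)

    # A hypothetical response a mobile client might consume.
    original = json.loads('{"id": 7, "name": "demo", "rating": 4.5}')
    mutants = list(mutate_response(original))
    for label, mutant in mutants:
        print(label, mutant)
    ```

    Each mutant would then be replayed against the client under test to observe whether it crashes, degrades gracefully, or silently misbehaves.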

    A Model-Based Approach to Impact Analysis Using Model Differencing

    Impact analysis is concerned with the identification of consequences of changes and is therefore an important activity for software evolution. In model-based software development, models are core artifacts, which are often used to generate essential parts of a software system. Changes to a model can thus substantially affect different artifacts of a software system. In this paper, we propose a model-based approach to impact analysis, in which explicit impact rules can be specified in a domain-specific language (DSL). These impact rules define consequences of designated UML class diagram changes on software artifacts and the need for dependent activities such as data evolution. The UML class diagram changes are identified automatically using model differencing. The advantage of using explicit impact rules is that they enable the formalization of knowledge about a product. By explicitly defining this knowledge, it is possible to create a checklist with hints about development steps that are (potentially) necessary to manage the evolution. To validate the feasibility of our approach, we provide results of a case study.
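    The rule-to-checklist flow described above can be sketched minimally (this is not the paper's DSL; the change kinds, rule texts, and element names are illustrative assumptions): detected model differences are matched against explicit impact rules, and each match contributes a hint to a developer checklist.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ModelChange:
        kind: str      # e.g. "attribute-added", "class-renamed"
        element: str   # affected UML element

    # Hypothetical impact rules: change kind -> consequence to check.
    IMPACT_RULES = {
        "attribute-added": "schema migration may be needed for {element}",
        "class-renamed": "update persistence mapping and APIs for {element}",
    }

    def impact_checklist(diff):
        """Apply each matching impact rule to the model diff,
        producing one checklist hint per matched change."""
        return [IMPACT_RULES[c.kind].format(element=c.element)
                for c in diff if c.kind in IMPACT_RULES]

    diff = [ModelChange("attribute-added", "Customer.email"),
            ModelChange("class-renamed", "Order")]
    print(impact_checklist(diff))
    ```

    In the approach itself, the diff would come from automated model differencing of two UML class diagram versions rather than being constructed by hand.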

    Robot life: simulation and participation in the study of evolution and social behavior.

    This paper explores the case of using robots to simulate evolution, in particular the case of Hamilton's Law. The use of robots raises several questions that this paper seeks to address. The first concerns the role of the robots in biological research: do they simulate something (life, evolution, sociality) or do they participate in something? The second concerns the physicality of the robots: what difference does embodiment make to the role of the robot in these experiments? Thirdly, how do life, embodiment, and social behavior relate in contemporary biology, and why is it possible for robots to illuminate this relation? These questions are provoked by a strange similarity that has not been noted before: between the problem of simulation in the philosophy of science and Deleuze's reading of Plato on the relationship of ideas, copies, and simulacra.
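    Hamilton's Law referenced above (commonly stated as Hamilton's rule) says an altruistic trait can spread when relatedness times benefit exceeds cost, i.e. rb > c. A minimal sketch of the condition, with illustrative numbers not taken from the paper:

    ```python
    def hamilton_favors_altruism(r, b, c):
        """Hamilton's rule: altruism is favored when the genetic
        relatedness r times the fitness benefit b to the recipient
        exceeds the fitness cost c to the actor (r * b > c)."""
        return r * b > c

    # Full siblings have r = 0.5: sacrificing 1 unit of fitness is
    # favored only if the sibling gains more than 2 units.
    print(hamilton_favors_altruism(0.5, 3.0, 1.0))   # benefit outweighs cost
    print(hamilton_favors_altruism(0.5, 1.5, 1.0))   # it does not
    ```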