
    Internal governance of the IETF, W3C and IEEE: structure, decision-making and internationalisation

    This is the author accepted manuscript. The final version is available from Oxford University Press via the ISBN and DOI in this record. NOTE: the title of the author accepted manuscript available in this record is slightly different from that of the chapter published online.
    Chapter 3 explains the internal organization of SDOs. It outlines the four main layers which make up the Internet and focuses on the principal SDOs associated with them: the IETF, the W3C, the Organization for the Advancement of Structured Information Standards (OASIS), and the IEEE, analysing their governance structures and most salient areas of work. Standards are important for interoperability and for preventing lock-in to a single company's technology. SDOs enable a wider technical community to scrutinize proposals for errors and security flaws. Nonetheless, procedures for decision-making are complex and often opaque. The chapter explores decision-making in a governance context. It charts the evolution of each SDO and its main purpose, functions, and central decision-making processes. It highlights the differences, as well as the degree of synergy and collaboration, between them, and then explains how procedures vary between the different fora and how moving goalposts and high barriers to entry make it difficult for civil society to participate.
    Economic and Social Research Council (ESRC)

    Adaptation of biotechnological processes for interactive implementation in a digitally supported laboratory environment: development and evaluation

    This thesis deals with the digital transformation of the biotechnological laboratory. The goal is to create benefits for the scientist or laboratory technician, both in the quality of the results produced and in the work processes. Special attention is paid to the interaction of the digitized laboratory system with the user. The thesis describes which measures are necessary to set up and operate such a laboratory system. In addition to the hardware requirements for operation, the standardized connection of laboratory devices is discussed in particular. Since no general communication standard currently exists in this area, ways and means are shown by which comprehensive integration is still possible. For this purpose, a hardware module for connecting old or legacy devices was developed and evaluated. To make human interaction with the laboratory system as uncomplicated as possible, different forms of user interaction (voice control, head-mounted displays, etc.) are presented and evaluated. The entire system is designed so that control is possible from a central location and all data are merged in a laboratory server. This architecture creates the conditions for a generic and uncomplicated connection of a wide variety of user interaction devices.
    Within the scope of the work, a process control system was also developed that enables the simple and intuitive formulation and execution of workflows in the digitalized laboratory. Implementation details and specific problems of device connectivity are abstracted from the processes themselves, so that they become as independent as possible from the specific models of devices necessary for their execution. The central device connection and data processing enable the creation of processes that comply with the FAIR (Findable, Accessible, Interoperable, Reusable) guidelines for handling scientific data. The digitized laboratory has been used for the execution of sample procedures and user interaction studies and is fully functional. The software and system architectures have been documented in various scientific publications, and many components are available under open licenses.
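The architecture the abstract describes (a central laboratory server that hides concrete device models from the workflow definitions) can be sketched roughly as follows. This is a minimal illustration under assumed names, not the thesis's actual software; the classes `LabDevice`, `MockPump`, `LabServer` and the "dispense" command are hypothetical:

```python
from abc import ABC, abstractmethod

class LabDevice(ABC):
    """Abstract interface that hides vendor-specific connectivity details."""
    @abstractmethod
    def execute(self, command: str, **params) -> dict:
        ...

class MockPump(LabDevice):
    """Stand-in for a legacy pump reached via a hardware adapter module."""
    def execute(self, command: str, **params) -> dict:
        if command == "dispense":
            return {"device": "pump", "dispensed_ml": params["volume_ml"]}
        raise ValueError(f"unsupported command: {command}")

class LabServer:
    """Central point that routes workflow steps to devices and merges all data."""
    def __init__(self):
        self.devices = {}
        self.records = []   # merged data store, one record per executed step

    def register(self, name: str, device: LabDevice):
        self.devices[name] = device

    def run_workflow(self, steps):
        """Each step names a device role, not a concrete device model."""
        for device_name, command, params in steps:
            result = self.devices[device_name].execute(command, **params)
            self.records.append(result)
        return self.records

server = LabServer()
server.register("pump", MockPump())
log = server.run_workflow([("pump", "dispense", {"volume_ml": 5.0})])
```

Because workflows only reference the role name "pump", swapping `MockPump` for another `LabDevice` implementation leaves the workflow definition untouched, which is the device-model independence the abstract describes.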

    Blown to Bits: Your Life, Liberty, and Happiness After the Digital Explosion

    382 p. Electronic book. Each of us has been in the computing field for more than 40 years. The book is the product of a lifetime of observing and participating in the changes it has brought. Each of us has been both a teacher and a learner in the field. This book emerged from a general education course we have taught at Harvard, but it is not a textbook. We wrote this book to share what wisdom we have with as many people as we can reach. We try to paint a big picture, with dozens of illuminating anecdotes as the brushstrokes. We aim to entertain you at the same time as we provoke your thinking.
    Contents:
    Preface
    Chapter 1. Digital Explosion: Why Is It Happening, and What Is at Stake? (The Explosion of Bits, and Everything Else / The Koans of Bits / Good and Ill, Promise and Peril)
    Chapter 2. Naked in the Sunlight: Privacy Lost, Privacy Abandoned (1984 Is Here, and We Like It / Footprints and Fingerprints / Why We Lost Our Privacy, or Gave It Away / Little Brother Is Watching / Big Brother, Abroad and in the U.S. / Technology Change and Lifestyle Change / Beyond Privacy)
    Chapter 3. Ghosts in the Machine: Secrets and Surprises of Electronic Documents (What You See Is Not What the Computer Knows / Representation, Reality, and Illusion / Hiding Information in Images / The Scary Secrets of Old Disks)
    Chapter 4. Needles in the Haystack: Google and Other Brokers in the Bits Bazaar (Found After Seventy Years / The Library and the Bazaar / The Fall of Hierarchy / It Matters How It Works / Who Pays, and for What? / Search Is Power / You Searched for WHAT? / Tracking Searches / Regulating or Replacing the Brokers)
    Chapter 5. Secret Bits: How Codes Became Unbreakable (Encryption in the Hands of Terrorists, and Everyone Else / Historical Cryptography / Lessons for the Internet Age / Secrecy Changes Forever / Cryptography for Everyone / Cryptography Unsettled)
    Chapter 6. Balance Toppled: Who Owns the Bits? (Automated Crimes—Automated Justice / NET Act Makes Sharing a Crime / The Peer-to-Peer Upheaval / Sharing Goes Decentralized / Authorized Use Only / Forbidden Technology / Copyright Koyaanisqatsi: Life Out of Balance / The Limits of Property)
    Chapter 7. You Can’t Say That on the Internet: Guarding the Frontiers of Digital Expression (Do You Know Where Your Child Is on the Web Tonight? / Metaphors for Something Unlike Anything Else / Publisher or Distributor? / Neither Liberty nor Security / The Nastiest Place on Earth / The Most Participatory Form of Mass Speech / Protecting Good Samaritans—and a Few Bad Ones / Laws of Unintended Consequences / Can the Internet Be Like a Magazine Store? / Let Your Fingers Do the Stalking / Like an Annoying Telephone Call? / Digital Protection, Digital Censorship—and Self-Censorship)
    Chapter 8. Bits in the Air: Old Metaphors, New Technologies, and Free Speech (Censoring the President / How Broadcasting Became Regulated / The Path to Spectrum Deregulation / What Does the Future Hold for Radio?)
    Conclusion. After the Explosion (Bits Lighting Up the World / A Few Bits in Conclusion)
    Appendix. The Internet as System and Spirit (The Internet as a Communication System / The Internet Spirit)
    Endnotes
    Index

    European Information Technology Observatory 1997


    Bandwidth management and monitoring for IP network traffic : an investigation

    Bandwidth management is a topic which is often discussed, but on which relatively little work has been done with regard to compiling a comprehensive set of techniques and methods for managing traffic on a network. What work has been done has concentrated on higher-end networks, rather than the low-bandwidth links which are commonly available in South Africa and other areas outside the United States. With more organisations increasingly making use of the Internet on a daily basis, the demand for bandwidth is outstripping the ability of providers to upgrade their infrastructure. This resource is therefore in need of management. In addition, for Internet access to become economically viable for widespread use by schools, NGOs and other academic institutions, the associated costs need to be controlled. Bandwidth management not only impacts on direct cost control, but encompasses the process of engineering a network and network resources in order to ensure the provision of as optimal a service as possible, including the provision of user education. Software has been developed for the implementation of traffic quotas, dynamic firewalling and visualisation. The research investigates various methods for monitoring and management of IP traffic with particular applicability to low-bandwidth links. Several forms of visualisation for the analysis of historical and near-real-time traffic data are also discussed, including the use of three-dimensional landscapes. A number of bandwidth management practices are proposed, and the advantages of their combined and complementary use are highlighted. By implementing these suggested policies, a holistic approach can be taken to the issue of bandwidth management on Internet links.
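The abstract mentions software for enforcing traffic quotas. One classic mechanism for capping throughput on a low-bandwidth link is the token bucket; the sketch below is a generic illustration of that idea, not the author's implementation, and all names and numbers are hypothetical:

```python
class TokenBucket:
    """Token-bucket rate limiter: tokens refill continuously at `rate_bps`
    bytes/second and each transmitted byte consumes one token, so average
    throughput is capped at the refill rate while short bursts of up to
    `burst_bytes` are still allowed through."""

    def __init__(self, rate_bps: float, burst_bytes: float, start: float = 0.0):
        self.rate = rate_bps          # refill rate, bytes per second
        self.capacity = burst_bytes   # maximum burst size, bytes
        self.tokens = burst_bytes     # bucket starts full
        self.last = start             # timestamp of the last decision

    def allow(self, packet_bytes: int, now: float) -> bool:
        # Refill according to elapsed time, capped at the bucket capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True
        return False  # over quota: the caller queues or drops the packet


# 1 KB/s average rate with a 2 KB burst allowance:
bucket = TokenBucket(rate_bps=1024, burst_bytes=2048)
first = bucket.allow(2048, now=0.0)   # the whole burst fits
second = bucket.allow(1024, now=0.0)  # bucket is empty at the same instant
third = bucket.allow(1024, now=1.0)   # one second later, 1024 tokens refilled
```

Tuning the rate caps a user's average consumption while the burst size keeps small interactive transfers responsive, which is the usual trade-off on constrained links.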

    Standards as interdependent artifacts : the case of the Internet

    Thesis (Ph.D.), Massachusetts Institute of Technology, Engineering Systems Division, 2008. Includes bibliographical references.
    This thesis has explored a new idea: viewing standards as interdependent artifacts and studying them with network analysis tools. Using the set of Internet standards as an example, the research of this thesis includes the citation network, the author affiliation network, and the co-author network of the Internet standards over the period of 1989 to 2004. The major network analysis tools used include cohesive subgroup decomposition (using the algorithm by Newman and Girvan), regular equivalence class decomposition (using the REGE algorithm and the method developed in this thesis), nodal prestige and acquaintance (both calculated with Kleinberg's technique), and other social network analysis tools. Qualitative analyses of the historical and technical context of the standards, as well as statistical analyses of various kinds, are also used in this research.
    A major finding of this thesis is that, for the understanding of the Internet, it is beneficial to consider its standards as interdependent artifacts. Because the basic mission of the Internet (i.e. to be an interoperable system that enables various services and applications) is enabled not by one or a few, but by a great number of standards developed upon each other, studying the standards only as stand-alone specifications cannot really produce meaningful understanding of a workable system. Therefore, the general approaches and methodologies introduced in this thesis, which we label a systems approach, are a necessary addition to the existing approaches. A key finding of this thesis is that the citation network of the Internet standards can be decomposed into functionally coherent subgroups by using the Newman-Girvan algorithm. This result shows that the (normative) citations among the standards can meaningfully be used to help us better manage and monitor the standards system. The results in this thesis indicate that organizing the development efforts of the Internet standards into (now) 121 Working Groups was done in a manner reasonably consistent with achieving a modular (and thus more evolvable) standards system.
    A second decomposition of the standards network was achieved by employing the REGE algorithm together with a new method developed in this thesis (see the Appendix) for identifying regular equivalence classes. Five meaningful subgroups of the Internet standards were identified, and each of them occupies a specific position and plays a specific role in the network. The five positions are reflected in the names we have assigned to them: the Foundations, the Established, the Transients, the Newcomers, and the Stand-alones. The life cycle among these positions was uncovered and is one of the insights that the systems approach to this standards system gives relative to the evolution of the overall standards system. Another insight concerning the evolution of the standards system is the development of a predictive model for the promotion of standards to a new status (i.e. Proposed, Draft, and Internet Standard as the three ascending statuses). This model also has practical potential for managers of standards setting organizations and for firms (and individuals) interested in efficiently participating in standards setting processes. The model prediction is based on assessing the implicit social influence of the standards (based upon the social network metric, betweenness centrality, of the standards' authors) and the apparent importance of the standard to the network (based upon calculating the standard's prestige from the citation network).
    A deeper understanding of the factors that go into this model was also developed through the analysis of the factors that can predict increased prestige over time for a standard. The overall systems approach and the tools developed and demonstrated in this thesis for the study of the Internet standards can be applied to other standards systems. Application (and extension) to the World Wide Web, electric power systems, mobile communication, and others would, we believe, lead to important improvements in our practical and scholarly understanding of these systems.
    by Mo-Han Hsieh, Ph.D.
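The Newman-Girvan algorithm named in the abstract decomposes a network by repeatedly removing the edge with the highest betweenness (the edge lying on the most shortest paths) until the network splits into cohesive subgroups. The thesis applied it to the real RFC citation network; the brute-force, pure-Python sketch below only illustrates the idea on a toy graph whose node names are hypothetical placeholders:

```python
from collections import defaultdict, deque
from itertools import combinations

def shortest_paths(graph, s, t):
    """All shortest paths from s to t in an undirected graph (BFS by levels)."""
    dist, parents, q = {s: 0}, defaultdict(list), deque([s])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
            if dist[v] == dist[u] + 1:
                parents[v].append(u)
    if t not in dist:
        return []
    paths = []
    def walk(node, acc):
        if node == s:
            paths.append([s] + acc)
            return
        for p in parents[node]:
            walk(p, [node] + acc)
    walk(t, [])
    return paths

def edge_betweenness(graph):
    """Credit each edge 1/k for every one of the k shortest s-t paths it lies on."""
    scores = defaultdict(float)
    for s, t in combinations(graph, 2):
        paths = shortest_paths(graph, s, t)
        for path in paths:
            for u, v in zip(path, path[1:]):
                scores[frozenset((u, v))] += 1.0 / len(paths)
    return scores

def components(g):
    """Connected components, sorted by their smallest member for stable output."""
    seen, comps = set(), []
    for start in g:
        if start in seen:
            continue
        comp, q = {start}, deque([start])
        seen.add(start)
        while q:
            u = q.popleft()
            for v in g[u]:
                if v not in seen:
                    seen.add(v); comp.add(v); q.append(v)
        comps.append(comp)
    return sorted(comps, key=min)

def girvan_newman_split(graph):
    """Remove highest-betweenness edges until the graph first disconnects."""
    g = {u: set(vs) for u, vs in graph.items()}
    while True:
        comps = components(g)
        if len(comps) > 1:
            return comps
        eb = edge_betweenness(g)
        u, v = max(eb, key=eb.get)
        g[u].discard(v); g[v].discard(u)

# Two tight clusters joined by a single bridge edge c-d; the bridge carries
# every cross-cluster shortest path, so it is removed first.
graph = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
    "d": {"c", "e", "f"}, "e": {"d", "f"}, "f": {"d", "e"},
}
groups = girvan_newman_split(graph)
```

On the thesis's scale (hundreds of RFCs), the same idea is normally run with Brandes' O(VE) betweenness algorithm rather than this all-pairs path enumeration, which is exponential in the worst case and suitable only for small examples.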