50 years of isolation
The traditional means of isolating applications from one another is the operating-system-provided “process” abstraction. However, as applications now consist of multiple fine-grained components, the traditional process model is proving insufficient to ensure this isolation. Statistics indicate that a high percentage of software failures occur through the propagation of component failures. These observations are further bolstered by the efforts of modern Internet browser developers, for example, to adopt multi-process architectures in order to increase robustness. A fresh look at the available options for isolating program components is therefore necessary, and this paper provides an overview of previous and current research in the area.
Rethinking the role of users in ICT design: Reflections for the internet
This paper reports on work from an interdisciplinary project exploring the design of future telecommunications
services, networks and applications, particularly focusing on the Internet. A starting point for this work is our
contention that users and technology are co-constructed in the design process and that in some cases it is difficult
to distinguish between designers and users since users may play a variety of roles. Some academics argue that
the future of the Internet depends on controlling and limiting what users do. Others argue that the future of the
Internet will depend on maintaining its openness and enabling users. In our project we seek to critically explore
these assertions and empirically examine the role of the user in internet design. This paper outlines some key
concepts and presents two case studies to support our approach.
Experimental investigation of an interior search method within a simple framework
A steepest gradient method for solving Linear Programming (LP) problems, followed by a procedure for purifying a non-basic solution to an improved extreme point solution, has been embedded within an otherwise simplex-based optimiser. The algorithm is hybrid in nature and exploits many aspects of sparse matrix and revised simplex technology. The interior search step terminates at a boundary point which is usually non-basic. This is then followed by a series of minor pivotal steps which lead to a basic feasible solution with a superior objective function value. It is concluded that the procedures discussed in this paper are likely to have three possible applications:
(i) improving a non-basic feasible solution to a superior extreme point solution,
(ii) an improved starting point for the revised simplex method, and
(iii) an efficient implementation of the multiple price strategy of the revised simplex method.
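The interior search step the abstract mentions can be sketched as follows (a toy reconstruction, not the paper's implementation): for a linear objective the steepest-ascent direction is simply the cost vector, and a ratio test over the constraints gives the step length at which the ray from an interior point first hits the boundary of the feasible region. All names and the example LP are illustrative assumptions.

```python
def interior_step(A, b, c, x0):
    """From an interior point x0 of {x : A x <= b}, move along the
    steepest-ascent direction of c'x (for a linear objective this is
    just c) until the first constraint becomes tight: a ratio test.
    Returns the boundary point reached and the index of the tight row."""
    d = c
    t, hit = None, None
    for i, (row, bi) in enumerate(zip(A, b)):
        ad = sum(r * dj for r, dj in zip(row, d))
        if ad > 1e-12:  # this constraint tightens as we move along d
            slack = bi - sum(r * xj for r, xj in zip(row, x0))
            ratio = slack / ad
            if t is None or ratio < t:
                t, hit = ratio, i
    x = [xj + t * dj for xj, dj in zip(x0, d)]
    return x, hit

# maximise x + y  s.t.  x <= 1, y <= 1, x >= 0, y >= 0
A = [[1, 0], [0, 1], [-1, 0], [0, -1]]
b = [1, 1, 0, 0]
c = [1, 1]
x, hit = interior_step(A, b, c, [0.25, 0.5])
print(x, hit)  # lands on the boundary y = 1 at (0.75, 1.0)
```

Note the point reached, (0.75, 1.0), lies on a single facet rather than at a vertex, matching the abstract's observation that the interior step usually terminates at a non-basic boundary point; the purification pivots would then carry it to an extreme point.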
The sociology of trusted systems: the episteme and judgment of a technology (NIRSA) Working Paper Series. No.46
The goal of this paper is to take a first step toward a socio-technical conceptualization of trusted systems. In our view this might help in overcoming interdisciplinary differences and in building a common vocabulary for discussing trust issues for the Future of the Internet. In particular, our main research question is to understand “to what extent and in which forms do existing trusted systems embody social assumptions?” To answer this question we propose a new definition of Trusted Systems as situated Episteme: an apparatus of devices that sets the conditions of possibility of certain practices while denying others. The conceptualization is augmented with the concept of technological mediation taken from the approach known as Actor-Network Theory (ANT). Our approach takes as its starting point the idea that sociological concepts (from ANT) can be used to analyse and investigate the basic elements of Trusted Systems. This analysis opens up new possibilities for the sociological enquiry of Trust at a more micro, socio-technical level. In particular, the paper puts forward the idea of Trust as a result of the system design.
A tree search approach for the solution of set problems using alternative relaxations
A number of alternative relaxations for the family of set problems (FSP) in general and set covering problems (SCP) in particular are introduced and discussed. These are (i) Network flow relaxation, (ii) Assignment relaxation, (iii) Shortest route relaxation, (iv) Minimum spanning tree relaxation. A unified tree search method is developed which makes use of these relaxations. Computational experience of processing a collection of test problems is reported
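To make the tree-search framing concrete, here is a minimal branch-and-bound sketch for a tiny set covering instance. The relaxation used for the lower bound is deliberately crude (a counting bound: remaining uncovered elements divided by the largest set size), standing in for the network-flow, assignment, shortest-route, or spanning-tree relaxations of the paper, which are more involved; all names and the instance are illustrative assumptions.

```python
from math import ceil

def cover(universe, sets):
    """Branch-and-bound for minimum-cardinality set covering.
    At each node, either take or skip set i; prune when the counting
    relaxation shows the node cannot beat the incumbent."""
    best = {"size": len(sets) + 1, "pick": None}
    max_card = max(len(s) for s in sets)

    def bound(uncovered):
        # relaxation: every remaining set covers at most max_card elements
        return ceil(len(uncovered) / max_card)

    def branch(i, chosen, uncovered):
        if not uncovered:
            if len(chosen) < best["size"]:
                best["size"], best["pick"] = len(chosen), list(chosen)
            return
        if i == len(sets) or len(chosen) + bound(uncovered) >= best["size"]:
            return  # exhausted, or relaxation proves no improvement possible
        branch(i + 1, chosen + [i], uncovered - sets[i])  # take set i
        branch(i + 1, chosen, uncovered)                  # skip set i

    branch(0, [], set(universe))
    return best["size"], best["pick"]

U = {1, 2, 3, 4, 5}
S = [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}, {1, 5}]
size, pick = cover(U, S)
print(size, pick)  # the optimum here uses 2 sets
```

A tighter relaxation (such as the LP or network-flow bounds the abstract lists) prunes more of the tree at the price of more work per node, which is exactly the trade-off a unified tree-search framework lets one study.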
Garbage Collection in a Very Large Address Space
This research was done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology and was supported by the Office of Naval Research under contract number N00014-75-C-0522.
The address space is broken into areas that can be garbage collected separately. An area is analogous to a file on current systems. Each process has a local computation area for its stack and temporary storage that is roughly analogous to a job core image. A mechanism is introduced for maintaining lists of inter-area links, the key to separate garbage collection. This mechanism is designed to be placed in hardware and does not create much overhead. It could be used in a practical computer system that uses the same address space for all users for the life of the system. It is necessary for the hardware to implement a reference count scheme that is adequate for handling stack frames. The hardware also facilitates implementation of protection by capabilities without the use of unique codes. This is due to the elimination of dangling references: areas can be deleted without creating dangling references.
MIT Artificial Intelligence Laboratory, Department of Defense Office of Naval Research
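The key idea, collecting one area on its own by treating its recorded incoming inter-area links as extra roots, can be sketched in a few lines. This is a hypothetical software model of the mechanism (the paper places it in hardware); the class and field names are illustrative assumptions.

```python
class Area:
    """A toy heap 'area'. Incoming inter-area links are kept in an
    explicit table (self.inlinks) so the area can be garbage collected
    separately: those links act as roots alongside the local ones."""
    def __init__(self):
        self.objects = {}   # object id -> ids it references (same area)
        self.roots = set()  # local roots, e.g. a process stack
        self.inlinks = set()  # ids referenced from OTHER areas

    def collect(self):
        """Mark from local roots plus inter-area links, then sweep."""
        live, stack = set(), list(self.roots | self.inlinks)
        while stack:
            o = stack.pop()
            if o not in live:
                live.add(o)
                stack.extend(self.objects.get(o, []))
        self.objects = {o: r for o, r in self.objects.items() if o in live}
        return live

a = Area()
a.objects = {"x": ["y"], "y": [], "z": [], "w": []}
a.roots = {"x"}
a.inlinks = {"z"}  # some other area still points at z
live = a.collect()
print(sorted(live))  # w is collected; z survives via its inter-area link
```

Without the inter-area link table, collecting this area alone would have to scan every other area for pointers into it, which is precisely the cost the mechanism avoids.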
Transcending POSIX: The End of an Era?
In this article, we provide a holistic view of the Portable Operating System Interface (POSIX) abstractions through a systematic review of their historical evolution. We discuss some of the key factors that drove the evolution and identify the pitfalls that make the abstractions infeasible when building modern applications.
Peer reviewed
Major Trends in Operating Systems Development
Operating systems have changed in nature in response to demands of users, and in response to advances in hardware and software technology. The purpose of this paper is to trace the development of major themes in operating system design from their beginnings through the present. This is not an exhaustive history of operating systems, but instead is intended to give the reader the flavor of the different periods in operating systems' development. To this end, the paper will be organized by topic in approximate order of development. Each chapter will start with an introduction to the factors behind the rise of the period. This will be followed by a survey of the state-of-the-art systems, and the conditions influencing them. The chapters close with a summation of the significant hardware and software contributions from the period.
Maclisp extensions
A common subset of selected facilities available in Maclisp and its derivatives (PDP-10 and Multics Maclisp, Lisp Machine Lisp (Zetalisp), and NIL) is described. The object is to aid in writing code which can run compatibly in more than one of these environments.