
    Software dependability in the Tandem GUARDIAN system

    Based on extensive field failure data for Tandem's GUARDIAN operating system, this paper discusses the evaluation of the dependability of operational software. The software faults considered are major defects that result in processor failures and invoke backup processes to take over. The paper categorizes the underlying causes of software failures and evaluates the effectiveness of the process-pair technique in tolerating software faults. A model describing the impact of software faults on the reliability of the overall system is proposed. The model is used to evaluate the significance of key factors that determine software dependability and to identify areas for improvement. An analysis of the data shows that about 77% of processor failures initially attributed to software are confirmed as software problems. The analysis also shows that the use of process pairs to provide checkpointing and restart (originally intended for tolerating hardware faults) allows the system to tolerate about 75% of reported software faults that result in processor failures. The loose coupling between processors, which results in the backup execution (the processor state and the sequence of events) being different from the original execution, is a major reason for the measured software fault tolerance. Over two-thirds (72%) of measured software failures are recurrences of previously reported faults. Modeling based on the data shows that, in addition to reducing the number of software faults, software dependability can be enhanced by reducing the recurrence rate.
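    The checkpoint-and-takeover idea behind process pairs can be sketched in a few lines of Python: a primary process periodically checkpoints its state to a backup, and on primary failure the backup resumes from the last checkpoint. This is an illustrative sketch only; the class and method names are invented and do not reflect Tandem's actual GUARDIAN mechanism. The abstract's key observation is that the backup's re-execution differs from the original, so many transient software faults do not recur.

```python
class Backup:
    """Holds the most recent checkpointed state of the primary."""
    def __init__(self):
        self.state = None

    def checkpoint(self, state):
        self.state = dict(state)  # store a copy of the primary's state

    def take_over(self):
        # On primary failure, resume from the last checkpoint.
        return Primary(initial=self.state)


class Primary:
    """Does the work and checkpoints to its backup after each step."""
    def __init__(self, backup=None, initial=None):
        self.backup = backup
        self.state = dict(initial) if initial else {"count": 0}

    def step(self):
        self.state["count"] += 1
        if self.backup:
            self.backup.checkpoint(self.state)


backup = Backup()
primary = Primary(backup)
for _ in range(3):
    primary.step()

# Suppose the primary's processor now fails: the backup takes over
# from the last checkpoint rather than from the beginning.
survivor = backup.take_over()
```

Because the survivor restarts from checkpointed state, with a different processor state and event ordering, a fault that crashed the original execution often does not reappear in the backup's execution.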

    Software fault tolerance in computer operating systems

    This chapter provides data and analysis of the dependability and fault tolerance of three operating systems: the Tandem/GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Based on measurements from these systems, basic software error characteristics are investigated. Fault tolerance in operating systems resulting from the use of process pairs and recovery routines is evaluated. Two levels of models are developed to analyze error and recovery processes inside an operating system and interactions among multiple instances of an operating system running in a distributed environment. The measurements show that the use of process pairs in Tandem systems, originally intended for tolerating hardware faults, allows the system to tolerate about 70% of defects in system software that result in processor failures. The loose coupling between processors, which results in the backup execution (the processor state and the sequence of events occurring) being different from the original execution, is a major reason for the measured software fault tolerance. The IBM/MVS system's fault tolerance almost doubles when recovery routines are provided, compared with the case in which no recovery routines are available. However, even when recovery routines are provided, there is almost a 50% chance of system failure when critical system jobs are involved.
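    The benefit of recovery routines can be illustrated with a minimal two-state (up/down) Markov availability model. This is a textbook-style sketch, not the chapter's actual two-level models: errors arrive at some rate, a recovery routine tolerates a fraction of them (its coverage), and only the uncovered remainder brings the system down. All rates below are made-up illustrative numbers.

```python
def availability(failure_rate, recovery_rate):
    """Steady-state availability of a two-state up/down Markov model.

    Balance equation: pi_up * failure_rate = pi_down * recovery_rate,
    with pi_up + pi_down = 1, giving pi_up below.
    """
    return recovery_rate / (failure_rate + recovery_rate)


def availability_with_coverage(failure_rate, recovery_rate, coverage):
    """Recovery routines that tolerate a fraction `coverage` of errors
    reduce the effective failure rate to (1 - coverage) * failure_rate."""
    return availability((1 - coverage) * failure_rate, recovery_rate)


# Illustrative rates (per hour): doubling coverage raises availability.
base = availability(0.1, 0.9)
with_recovery = availability_with_coverage(0.1, 0.9, 0.5)
```

The model makes the chapter's point quantitative: raising coverage shrinks the effective failure rate, so availability improves even when the recovery rate is unchanged.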

    Experimental analysis of computer system dependability

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: the design phase, the prototype phase, and the operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced, including the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools, including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
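    Importance sampling, mentioned above as a way to accelerate Monte Carlo simulation, works by drawing samples from a proposal distribution that hits the rare failure region far more often, then reweighting each hit by the likelihood ratio between the true and proposal densities. A minimal sketch for estimating a small tail probability follows; the function name, rates, and threshold are all illustrative choices, not from the paper.

```python
import math
import random


def importance_sample_tail(threshold, rate=1.0, proposal_rate=0.1,
                           n=100_000, seed=42):
    """Estimate P(X > threshold) for X ~ Exp(rate) by sampling from a
    heavier-tailed Exp(proposal_rate) and reweighting each sample that
    lands past the threshold by the likelihood ratio f(x) / g(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(proposal_rate)   # proposal draw, mean 1/proposal_rate
        if x > threshold:
            # f(x) = rate * exp(-rate * x); g(x) = proposal_rate * exp(-proposal_rate * x)
            weight = (rate * math.exp(-rate * x)) / \
                     (proposal_rate * math.exp(-proposal_rate * x))
            total += weight
    return total / n


# Naive sampling at rate 1.0 would see x > 10 about once per 22,000 draws;
# the proposal at rate 0.1 crosses the threshold on roughly a third of draws.
est = importance_sample_tail(10.0)
```

For this exponential example the true tail probability is exp(-10) ≈ 4.5e-5, so a naive estimator with the same sample budget would be dominated by noise, while the reweighted estimator concentrates its samples where failures occur.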

    Measurement and Analysis of Operating System Fault Tolerance

    Coordinated Science Laboratory was formerly known as Control Systems Laboratory. ONR / N00014-91-J-1116; NASA / NAG-1-61

    The Lawyer's Cryptionary: A Resource for Talking to Clients About Crypto-Transactions


    Castration causes an increase in lysosomal size and upregulation of cathepsin D expression in principal cells along with increased secretion of procathepsin D and prosaposin oligomers in adult rat epididymis

    In the epididymis, lysosomal proteins of the epithelial cells are normally targeted from the Golgi apparatus to lysosomes for degradation, although their secretion into the epididymal lumen has been documented and associated with sperm maturation. In this study, cathepsin D (CatD) and prosaposin (PSAP) were examined in the adult epididymis of control rats and 2-day castrated rats without (Ct) and with testosterone replacement (Ct+T) to evaluate their expression and regulation within epididymal epithelial cells. By light microscope immunocytochemistry, a quantitative increase in the size of lysosomes in principal cells of Ct animals was noted from the distal initial segment to the proximal cauda. Androgen replacement did not restore the size of lysosomes to control levels. Western blot analysis revealed a significant increase in CatD expression in the epididymis of Ct animals, which suggested an upregulation of its expression in principal cells; androgens restored levels of CatD to those of controls. In contrast, PSAP expression in Ct animals was not altered from controls. Additionally, an increase in procathepsin D levels was noted in samples of the epididymal fluid of Ct compared to control animals, accompanied by increased complex formation with PSAP. Moreover, an increased oligomerization of prosaposin was observed in the epididymal lumen of Ct rats, with changes reverted to controls in Ct+T animals. Taken together, these data suggest that castration causes an increased uptake of substrates that are acted upon by CatD in lysosomes of principal cells and in the lumen by procathepsin D. These substrates may be derived from apoptotic cells noted in the lumen of proximal regions and possibly from degenerating sperm in distal regions of the epididymis of Ct animals.
Exploring the mechanisms by which lysosomal enzymes are synthesized and secreted by the epididymis may help resolve some of the issues originating from epididymal dysfunctions with relevance to sperm maturation.

    Accelerated Culture: Exploring Time and Space in Cinema, Television and New Media in the Digital Age

    This dissertation seeks to understand the impact of speed on the interrelation and overlapping of the production and consumption of cinematic and televisual texts. It explores the immediacy of digital media and new economic processes, and how they inform structures of perception while lending themselves to new and different ways of seeing the moving image in the digital age. These visual expressions are evident in the changing perception of the long take; the increasing use of video-gaming aesthetics and database narratives; new and variant forms of narrative and visual style in television; and the effect of the speed of new media technology (such as the Internet, personal camcorders, mobile screens, and desktop editing) on new voices and avant-garde expressions in independent and DIY cinema. Conversely, VCR, DVD, and DVR devices (as well as online streaming and DVD and Blu-ray rental sites) have transformed the consumption of the moving image. Time-shifting devices allow for halting and controlling the flow of passing time, permitting greater textual analysis. And, reciprocally, these new perceptions of the moving image inform expressions of filmic time and space. The speed of digital media and new economic formations raises concerns about lived reality and the attenuation of time, place, and community. It brings forth questions of the waning of pastness and memory, the diminishing of critical distance, and the vanishing of slow time. I argue, however, that these shifts occurring in cinema and television illustrate that processes of speed are not the prime determinant in the production and consumption of moving images. Rather, they are based on a contingent and open-ended model of articulation: sites where disparate elements are temporarily combined, unified, and thus practiced and lived under the ever-changing conditions of existence.

    Unmanned Aircraft Systems in the Cyber Domain

    Unmanned Aircraft Systems are an integral part of the US national critical infrastructure. The authors have endeavored to bring a breadth and quality of information to the reader that is unparalleled in the unclassified sphere. This textbook fully immerses and engages the reader / student in the cyber-security considerations of the rapidly emerging technology known as unmanned aircraft systems (UAS). The first edition covered National Airspace (NAS) policy issues; information security (INFOSEC); UAS vulnerabilities in key systems (Sense and Avoid / SCADA); navigation and collision avoidance systems; stealth design; intelligence, surveillance and reconnaissance (ISR) platforms; weapons systems security; electronic warfare considerations; data links; jamming; operational vulnerabilities; and still-emerging political scenarios that affect US military / commercial decisions. This second edition discusses state-of-the-art technology issues facing US UAS designers. It focuses on counter unmanned aircraft systems (C-UAS), especially research designed to mitigate and terminate threats posed by swarms. Topics include high-altitude platforms (HAPS) for wireless communications; C-UAS and large-scale threats; acoustic countermeasures against swarms and building an Identify Friend or Foe (IFF) acoustic library; updates to the legal / regulatory landscape; UAS proliferation along the Chinese New Silk Road sea / land routes; and ethics in this new age of autonomous systems and artificial intelligence (AI).

    Approaching algorithmic power

    Contemporary power manifests in the algorithmic. Emerging quite recently as an object of study within media and communications, cultural research, gender and race studies, and urban geography, the algorithm often seems ungraspable. Framed as code, it becomes proprietary property, black-boxed and inaccessible. Framed as a totality, it becomes overwhelmingly complex, incomprehensible in its operations. Framed as a procedure, it becomes a technique to be optimised, bracketing out the political. In struggling to adequately grasp the algorithmic as an object of study, to unravel its mechanisms and materialities, these framings offer limited insight into how algorithmic power is initiated and maintained. This thesis instead argues for an alternative approach: firstly, that the algorithmic is coordinated by a coherent internal logic, a knowledge-structure that understands the world in particular ways; second, that the algorithmic is enacted through control, a material and therefore observable performance which purposively influences people and things towards a predetermined outcome; and third, that this complex totality of architectures and operations can be productively analysed as strategic sociotechnical clusters of machines. This method of inquiry is developed with and tested against four contemporary examples: Uber, Airbnb, Amazon Alexa, and Palantir Gotham. Highly profitable, widely adopted and globally operational, they exemplify the algorithmic shift from whiteboard to world. But if the world is productive, it is also precarious, consisting of frictional spaces and antagonistic subjects. Force cannot be assumed to be unilinear, but is incessantly negotiated: operations of parsing data and processing tasks form broader operations that strive to establish subjectivities and shape relations. These negotiations can fail, destabilised by inadequate logics and weak control.
A more generic understanding of logic and control enables a historiography of the algorithmic. The ability to index information, to structure the flow of labor, to exert force over subjects and spaces: these did not emerge with the microchip and the mainframe, but are part of a longer lineage of calculation. Two moments from this lineage are examined: house-numbering in the Habsburg Empire and punch-card machines in the Third Reich. Rather than a revolution, this genealogy suggests an evolutionary process, albeit an uneven one, linking the computation of past and present. The thesis makes a methodological contribution to the nascent field of algorithmic studies. But more importantly, it renders algorithmic power more intelligible as a material force. Structured and implemented in particular ways, the design of logic and control constructs different versions, or modalities, of algorithmic power. This power is political: it calibrates subjectivities towards certain ends, prioritises space in specific ways, and privileges particular practices whilst suppressing others. In apprehending operational logics, the practice of method thus foregrounds the sociopolitical dimensions of algorithmic power. As the algorithmic increasingly infiltrates and governs the everyday, the ability to understand, critique, and intervene in this new field of power becomes more urgent.