
    Operating the Lisp Machine

    This document is a draft copy of a portion of the Lisp Machine window system manual. It is being published in this form now to make it available, since the complete window system manual is unlikely to be finished in the near future. The information in this document is accurate as of system 67, but is not guaranteed to remain 100% accurate. This document explains how to use the Lisp Machine from a non-programmer's point of view. It explains the general characteristics of the user interface, particularly the window system and the program-control commands. This document is intended to tell you everything you need to know to sit down at a Lisp machine and run programs, but does not deal with the writing of programs. Many arcane commands and user-interface features are also documented herein, although the beginning user can safely ignore them. MIT Artificial Intelligence Laboratory

    Assigned numbers


    The Novice's Guide to the UNIX at the AI Laboratory Version 1.0

    This is a manual for complete beginners. It requires little knowledge of the MIT computer systems, and assumes no knowledge of the UNIX operating system. This guide will show you how to log onto the AI Lab's SUN system using a SUN III or similar workstation or a non-dedicated terminal. Many of the techniques described will be applicable to other computers running UNIX. You will learn how to use various operating system and network features, send and receive electronic mail, create and edit files using GNU EMACS, process text using YTEX, and print your files. MIT Artificial Intelligence Laboratory

    Mosh: An Interactive Remote Shell for Mobile Clients

    Mosh (mobile shell) is a remote terminal application that supports intermittent connectivity, allows roaming, and speculatively and safely echoes user keystrokes for better interactive response over high-latency paths. Mosh is built on the State Synchronization Protocol (SSP), a new UDP-based protocol that securely synchronizes client and server state, even across changes of the client's IP address. Mosh uses SSP to synchronize a character-cell terminal emulator, maintaining terminal state at both client and server to predictively echo keystrokes. Our evaluation analyzed keystroke traces from six different users covering a period of 40 hours of real-world usage. Mosh was able to immediately display the effects of 70% of the user keystrokes. Over a commercial EV-DO (3G) network, median keystroke response latency with Mosh was less than 5 ms, compared with 503 ms for SSH. Mosh is free software, available from http://mosh.mit.edu. It was downloaded more than 15,000 times in the first week of its release. National Science Foundation (U.S.) (NSF grant 1040072); National Science Foundation (U.S.) (NSF grant 0721702)
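    The speculative-echo idea above can be illustrated with a toy model. This is a sketch only, not Mosh's actual SSP implementation; the class and method names are invented for illustration. The client echoes printable keystrokes immediately, keeps each prediction tagged with a sequence number, and discards predictions once the server's authoritative state acknowledges them:

```python
class SpeculativeEcho:
    """Toy client-side prediction: echo printable keys at once, reconcile later."""

    def __init__(self):
        self.confirmed = ""   # last server-acknowledged screen state
        self.pending = []     # (seq, char) predictions not yet confirmed
        self.seq = 0          # sequence number of the latest keystroke sent

    def keystroke(self, ch):
        # Only printable characters are predicted; control keys (whose effect
        # the client cannot guess) are sent but not echoed speculatively.
        self.seq += 1
        if ch.isprintable():
            self.pending.append((self.seq, ch))
        return self.display()

    def display(self):
        # What the user sees: confirmed state plus optimistic predictions.
        return self.confirmed + "".join(c for _, c in self.pending)

    def server_update(self, acked_seq, state):
        # The server sends its authoritative state together with the highest
        # input sequence number it has processed; confirmed predictions are
        # dropped, and any mispredicted output is overwritten by `state`.
        self.confirmed = state
        self.pending = [(s, c) for s, c in self.pending if s > acked_seq]
        return self.display()


term = SpeculativeEcho()
term.keystroke("l")
shown = term.keystroke("s")          # both keys visible before any round trip
caught_up = term.server_update(2, "ls")
```

A real implementation must also handle mispredictions (e.g. a keystroke that triggers a shell completion), which is why Mosh displays predictions tentatively and reconciles against the server's terminal state rather than trusting the echo.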

    Copyright Termination And Technical Standards

    Technical standards, which enable products manufactured by different vendors to work together, form the basis of the modern technological infrastructure. Yet an obscure provision of the U.S. Copyright Act, enacted to allow authors and composers to profit from the later success of their works, now threatens to disrupt this critical technological ecosystem. Enacted in 1976, Section 203 of the Copyright Act permits the author of a copyrighted work to revoke any copyright license or assignment between thirty-five and forty years after the grant was made. For grants made in 1978, the first year to which Section 203 applies, terminations could first be made in 2013, and in the music and publishing industries such terminations, and the concomitant litigation, have already begun. Technical standards are also treated as copyrightable works, and arguably the provisions of Section 203 apply to them. Numerous standards published in 1978 are still in use, and each year the number of standards potentially subject to Section 203 termination will grow. But unlike the composers and authors whom Section 203 was intended to protect, contributors to technical standards are usually engineers employed by large corporations, research institutions, or government agencies who make such contributions without additional compensation. Standards are thus unburdened by the copyright royalty obligations that characterize musical compositions, books, and other works of authorship. The termination of customary royalty-free copyright licenses granted by contributors to standards organizations or their heirs could thus have a significant disruptive effect on the standardization process and impose a substantial new cost on industries that are standards-dependent (a cost most likely to be passed through to consumers). The application of Section 203 to technical standards, however, is not straightforward.
This article, for the first time, assesses the applicability of Section 203 to technical standards documents. In particular, it applies the Section 203 considerations of joint authorship, works made for hire, and derivative works to an area that was clearly not contemplated by Congress when it enacted the statute. We conclude that, although Section 203 is theoretically applicable to technical standards, several statutory obstacles would impede the wholesale termination of standards-related license grants. Nevertheless, in order to avoid costly and time-consuming litigation, we recommend that Congress or the courts explicitly acknowledge the inapplicability of Section 203 to technical standards.

    Comparing a Hybrid Multi-layered Machine Learning Intrusion Detection System to Single-layered and Deep Learning Models

    Advancements in computing technology have created additional network attack surface, allowed the development of new attack types, and increased the impact caused by an attack. Researchers agree that current intrusion detection systems (IDSs) are not able to adapt to detect these new attack forms, so alternative IDS methods have been proposed. Among these methods are machine learning-based intrusion detection systems. This research explores the current relevant studies related to intrusion detection systems and machine learning models and proposes a new hybrid machine learning IDS model consisting of the Principal Component Analysis (PCA) and Support Vector Machine (SVM) learning algorithms. The NSL-KDD dataset, a benchmark dataset for IDSs, is used for comparing the models' performance. The performance accuracy and false-positive rate of the hybrid model are compared to the results of the model's individual algorithmic components to determine which components most impact attack prediction performance. The performance metrics of the hybrid model are also compared to two deep learning autoencoder neural network models; the results show that the complexity of the model does not add to the performance accuracy. The research showed that pre-processing and feature selection impact predictive accuracy across models. Future research recommendations were to implement the proposed hybrid IDS model in a live network for testing and analysis, and to focus on the pre-processing algorithms that improve performance accuracy and lower the false-positive rate. This research indicated that pre-processing and feature selection/feature extraction can increase model performance accuracy and decrease the false-positive rate, helping businesses improve network security.
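    The PCA-then-SVM pipeline the abstract describes can be sketched in miniature. This is an illustration only: the study used the NSL-KDD dataset, whereas this version uses synthetic two-class data, a from-scratch PCA via the covariance eigendecomposition, and a linear SVM trained by subgradient descent on the hinge loss (standing in for whatever SVM implementation the study used):

```python
import numpy as np

def pca_fit(X, k):
    """Top-k principal directions from the covariance eigendecomposition."""
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    _, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    return mu, vecs[:, ::-1][:, :k]     # keep the k largest components

def pca_transform(X, mu, comps):
    return (X - mu) @ comps

def svm_fit(X, y, epochs=100, lam=0.01, lr=0.05):
    """Linear SVM via subgradient descent on the regularized hinge loss; y in {-1,+1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:   # margin violated: hinge-loss gradient step
                w = (1 - lr * lam) * w + lr * yi * xi
                b += lr * yi
            else:                       # only the L2 regularizer contributes
                w = (1 - lr * lam) * w
    return w, b

def svm_predict(X, w, b):
    return np.sign(X @ w + b)

# Synthetic stand-in for an intrusion dataset: two 10-dimensional Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (100, 10)), rng.normal(2, 1, (100, 10))])
y = np.repeat([-1.0, 1.0], 100)

mu, comps = pca_fit(X, 2)               # feature extraction: 10 -> 2 dimensions
Z = pca_transform(X, mu, comps)
w, b = svm_fit(Z, y)
acc = (svm_predict(Z, w, b) == y).mean()
```

The structure mirrors the hybrid model's point: PCA handles feature extraction before classification, so the SVM operates in a lower-dimensional space, which is also where the abstract locates most of the accuracy and false-positive-rate gains.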

    The New Idiot's Guide to OZ

    This is a manual for complete beginners. It assumes no knowledge of the MIT computer systems. This guide will teach you how to log onto the computer called OZ, a DEC PDP-20 computer running the TWENEX (TOPS-20) operating system. You will learn how to use various operating system features, send and receive electronic mail, create and edit files using EMACS, process text using YTEX, and print out your files. This manual has a companion on-line directory on OZ, called , which contains sample programs and examples to use in conjunction with this guide. MIT Artificial Intelligence Laboratory

    Standards as interdependent artifacts : the case of the Internet

    Thesis (Ph.D.), Massachusetts Institute of Technology, Engineering Systems Division, 2008. Includes bibliographical references. This thesis has explored a new idea: viewing standards as interdependent artifacts and studying them with network analysis tools. Using the set of Internet standards as an example, the research of this thesis includes the citation network, the author affiliation network, and the co-author network of the Internet standards over the period of 1989 to 2004. The major network analysis tools used include cohesive subgroup decomposition (the algorithm by Newman and Girvan is used), regular equivalence class decomposition (the REGE algorithm and a method developed in this thesis are used), nodal prestige and acquaintance (both calculated with Kleinberg's technique), and some social network analysis tools. Qualitative analyses of the historical and technical context of the standards as well as statistical analyses of various kinds are also used in this research. A major finding of this thesis is that for the understanding of the Internet, it is beneficial to consider its standards as interdependent artifacts. Because the basic mission of the Internet (i.e., to be an interoperable system that enables various services and applications) is enabled, not by one or a few, but by a great number of standards developed upon each other, studying the standards only as stand-alone specifications cannot really produce meaningful understandings of a workable system. Therefore, the general approaches and methodologies introduced in this thesis, which we label a systems approach, are a necessary addition to the existing approaches. A key finding of this thesis is that the citation network of the Internet standards can be decomposed into functionally coherent subgroups by using the Newman-Girvan algorithm.
This result shows that the (normative) citations among the standards can meaningfully be used to help us better manage and monitor the standards system. The results in this thesis indicate that organizing the development efforts of the Internet standards into (now) 121 Working Groups was done in a manner reasonably consistent with achieving a modular (and thus more evolvable) standards system. A second decomposition of the standards network was achieved by employing the REGE algorithm together with a new method developed in this thesis (see the Appendix) for identifying regular equivalence classes. Five meaningful subgroups of the Internet standards were identified, and each of them occupies a specific position and plays a specific role in the network. The five positions are reflected in the names we have assigned to them: the Foundations, the Established, the Transients, the Newcomers, and the Stand-alones. The life cycle among these positions was uncovered and is one of the insights that the systems approach to this standards system gives relative to the evolution of the overall standards system. Another insight concerning the evolution of the standards system is the development of a predictive model for promotion of standards to a new status (i.e., Proposed, Draft, and Internet Standard as the three ascending statuses). This model also has practical potential for managers of standards-setting organizations and for firms (and individuals) interested in efficiently participating in standards-setting processes. The model's prediction is based on assessing the implicit social influence of the standards (based upon the social network metric, betweenness centrality, of the standards' authors) and the apparent importance of the standard to the network (based upon calculating the standard's prestige from the citation network).
A deeper understanding of the factors that go into this model was also developed through the analysis of the factors that can predict increased prestige over time for a standard. The overall systems approach and the tools developed and demonstrated in this thesis for the study of the Internet standards can be applied to other standards systems. Application (and extension) to the World Wide Web, the electric power system, mobile communication, and others would, we believe, lead to important improvements in our practical and scholarly understanding of these systems. By Mo-Han Hsieh. Ph.D.
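    The Newman-Girvan decomposition used above works by repeatedly removing the edge with the highest shortest-path betweenness until the graph splits into separate components. A minimal from-scratch sketch on a toy graph (not the thesis's code; edge betweenness is computed with Brandes-style BFS accumulation, and the toy graph plays the role of a citation network):

```python
from collections import defaultdict, deque

def edge_betweenness(adj):
    """Shortest-path edge betweenness via a BFS from every node (Brandes-style)."""
    bc = defaultdict(float)
    for s in adj:
        dist, sigma = {s: 0}, defaultdict(int)
        sigma[s] = 1
        preds, order, q = defaultdict(list), [], deque([s])
        while q:
            v = q.popleft()
            order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:   # v lies on a shortest path to w
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = defaultdict(float)
        for w in reversed(order):            # back-propagate path dependencies
            for v in preds[w]:
                c = sigma[v] / sigma[w] * (1 + delta[w])
                bc[frozenset((v, w))] += c   # each edge is counted from both
                delta[v] += c                # endpoints; fine for the argmax
    return bc

def components(adj):
    """Connected components by BFS."""
    seen, comps = set(), []
    for s in adj:
        if s in seen:
            continue
        comp, q = set(), deque([s])
        while q:
            v = q.popleft()
            if v in comp:
                continue
            comp.add(v)
            q.extend(adj[v])
        seen |= comp
        comps.append(frozenset(comp))
    return comps

def girvan_newman_split(adj):
    """Remove highest-betweenness edges until the graph gains a component."""
    adj = {v: set(ns) for v, ns in adj.items()}
    start = len(components(adj))
    while len(components(adj)) == start:
        bc = edge_betweenness(adj)
        u, v = max(bc, key=bc.get)
        adj[u].discard(v)
        adj[v].discard(u)
    return components(adj)

# Two tightly knit triangles joined by a single bridge edge (2, 3): the bridge
# carries all cross-group shortest paths, so it is removed first.
G = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
groups = girvan_newman_split(G)
```

On the real standards citation network the same loop is iterated further to peel off successive subgroups, which is how the functionally coherent clusters reported in the thesis are obtained.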