83 research outputs found

    Being a Girl Gamer - A Correspondence with Caitlin Martin V1.0

    Full text link
    I promised Jon I’d write about this for the book on gender online, so here’s a first run at my thoughts. I thought it’d be good to put some ideas out there and see what they look like (and maybe even discuss them) before writing something more formal and thought out. Here goes.

    Escaping the Computer-Forensics Certification Maze: A Survey of Professional Certifications

    Get PDF
    With the proliferation of computer crime, the demand for computer-forensics experts continues to increase. Yet with so many computer-forensics certifications currently available, it is not an easy task for those outside the discipline to understand the differences among the various certifications. The objective of this paper is to provide a comprehensive analysis of the existing computer-forensics certifications for the benefit of non-computer-forensics practitioners. Twenty-six computer-forensics certifications offered by 17 different associations are described and compared based on criteria such as certification requirements and knowledge coverage. The paper is useful to three groups of readers: (1) individuals who want to join the computer-forensics profession; (2) academics who are responsible for curriculum development in computer forensics; and (3) top-level managers who want to recruit computer-forensics professionals.

    Internet... the final frontier: an ethnographic account: exploring the cultural space of the Net from the inside

    Get PDF
    The research project The Internet as a space for interaction, which concluded in Autumn 1998, studied the constitutive features of network culture and network organisation. Special emphasis was given to the dynamic interplay of technical and social conventions in both the Net’s organisation and its change. The ethnographic perspective chosen studied the Internet from the inside. Research concentrated on three fields of study: the hegemonic operating technology of net nodes (UNIX), the network’s basic transmission technology (the Internet Protocol, IP), and a popular communication service (Usenet). The project’s final report includes the results of the three branches explored. Drawing on the developments in these three fields, it is shown that changes that come about on the Net are neither anarchic nor arbitrary. Instead, the decentrally organised Internet is based on technically and organisationally distributed forms of coordination within which individual preferences collectively attain the power of developing into definitive standards.

    When Is a User Not a User - Finding the Proper Role for Republication Liability on the Internet

    Get PDF

    The Best Nix for a Combined Honeypot Sensor Server

    Get PDF
    The paper examines, through a case study, the usability of open-source operating systems for a combined Honeypot sensor server. The study scrutinises two Unix variants, Red Hat Linux and the Sun Solaris operating system, as candidates for deployment of a combined Honeypot sensor server. Unbiased metrics, such as extensibility, reliability, and ease of installation and use, are employed as the criteria for evaluating the operating systems in the role of hosting Honeypot sensor server software.
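
    The abstract names the evaluation criteria but not how they are combined into a verdict; the following Python sketch is only a hypothetical illustration of a simple weighted scoring of the two candidate operating systems, with made-up weights and ratings rather than the study's actual findings.

```python
# Hypothetical weighted scoring of candidate operating systems.
# Criteria names come from the abstract; weights and ratings are placeholders.

CRITERIA_WEIGHTS = {
    "extensibility": 0.3,
    "reliability": 0.3,
    "ease_of_install": 0.2,
    "ease_of_use": 0.2,
}

# Illustrative 1-5 ratings for each candidate (not the study's results).
candidates = {
    "Red Hat Linux": {"extensibility": 4, "reliability": 4, "ease_of_install": 4, "ease_of_use": 3},
    "Sun Solaris":   {"extensibility": 3, "reliability": 5, "ease_of_install": 3, "ease_of_use": 3},
}

def weighted_score(ratings: dict) -> float:
    """Combine per-criterion ratings into a single weighted score."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

for name, ratings in candidates.items():
    print(f"{name}: {weighted_score(ratings):.2f}")
```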

    A Survey and Comparative Study on Vulnerability Scanning Tools

    Get PDF
    Vulnerability scanners are a tool used by many organizations and developers as part of their vulnerability management. These scanners aid in the security of applications, databases, networks, and more. The many available vulnerability scanners vary in the analysis methods they employ and the targets they scan, among many other features. This thesis explores the different types of scanners available and aims to ease the burden of selecting the ideal vulnerability scanner for one’s needs by conducting a survey and comparative analysis of vulnerability scanners. Before examining the scanners themselves, background information is provided on the types of testing a vulnerability scanner may use and the categories of scanners available. The thesis highlights application scanners, database scanners, and network-based scanners, as those were the types primarily found in the survey. It also compares the accuracy of two network scanners, OpenVAS and Nessus, when scanning the same target, and discusses the results and their implications.
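
    The thesis abstract does not describe how the accuracy comparison was tabulated; as a rough illustration of one way two scanners' findings for the same target could be compared against a list of known vulnerabilities, here is a minimal Python sketch. The file names and CSV field names are assumptions for illustration; real OpenVAS and Nessus exports differ per tool and configuration.

```python
# Compare findings from two scanners against known vulnerabilities on one target.
# File names and the "cve" column are illustrative assumptions.

import csv

def load_cve_ids(path: str) -> set:
    """Read a findings CSV and return the set of reported CVE identifiers."""
    with open(path, newline="") as f:
        return {row["cve"] for row in csv.DictReader(f) if row.get("cve")}

known = load_cve_ids("known_vulnerabilities.csv")   # ground truth for the target
openvas = load_cve_ids("openvas_findings.csv")
nessus = load_cve_ids("nessus_findings.csv")

for name, found in [("OpenVAS", openvas), ("Nessus", nessus)]:
    true_pos = found & known
    false_pos = found - known
    missed = known - found
    print(f"{name}: {len(true_pos)} detected, {len(false_pos)} false positives, {len(missed)} missed")
```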

    RFID FOR FOOD INVENTORY

    Get PDF
    The aim of this project is to develop a food-inventory system that uses Radio Frequency Identification (RFID) for identification. The system is a wireless communication technology used to uniquely identify tagged foods in a home food inventory. It is an integrated system of RFID components: the tag, the interrogator, and the controller. The system uses passive tags operating in the Low Frequency (LF) range. The controller's interface will be designed using Microsoft Visual Studio 2005, and Microsoft SQL will store the database of stored food. The user only needs to key in the data for each food item once, and the system will remember it. The system identifies the status of the stored food based on the stored database and notifies the user each time the food is used. A warning window appears to notify the user if the status of a food item falls below the specified threshold.
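
    The project itself is built with Visual Studio 2005 and Microsoft SQL; purely as a language-neutral illustration of the threshold check the abstract describes, here is a minimal Python sketch. The tag identifiers, field names, and quantities are hypothetical.

```python
# Minimal sketch of the inventory check described above: when a tagged item is
# used, decrement its stock and warn if it drops below the configured minimum.
# In the described system the data lives in Microsoft SQL and the warning is a GUI window.

inventory = {
    # tag_id: food name, quantity on hand, minimum quantity before warning (all hypothetical)
    "0001A3": {"name": "milk", "quantity": 2, "minimum": 1},
    "0007F2": {"name": "rice", "quantity": 5, "minimum": 2},
}

def record_usage(tag_id: str) -> None:
    """Called when the interrogator reads a tag as an item is taken out."""
    item = inventory[tag_id]
    item["quantity"] -= 1
    if item["quantity"] < item["minimum"]:
        # The real system would open a warning window here.
        print(f"WARNING: {item['name']} is below the minimum stock level "
              f"({item['quantity']} left, minimum {item['minimum']}).")

record_usage("0001A3")
record_usage("0001A3")  # second use drops milk below its minimum and triggers the warning
```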

    Improving Salience Retention and Identification in the Automated Filtering of Event Log Messages

    No full text
    Event log messages are currently the only genuine interface through which computer systems administrators can effectively monitor their systems and assemble a mental perception of system state. The popularisation of the Internet and the accompanying meteoric growth of business-critical systems has resulted in an overwhelming volume of event log messages, channeled through mechanisms whose designers could not have envisaged the scale of the problem. Messages regarding intrusion detection, hardware status, operating system status changes, database tablespaces, and so on, are being produced at the rate of many gigabytes per day for a significant computing environment. Filtering technologies have not been able to keep up. Most messages go unnoticed; no filtering whatsoever is performed on them, at least in part due to the difficulty of implementing and maintaining an effective filtering solution. The most commonly-deployed filtering alternatives rely on regular expressions to match pre-defined strings with 100% accuracy, which can become ineffective as the code base for the software producing the messages 'drifts' away from those strings. The exactness requirement means all possible failure scenarios must be accurately anticipated and their events catered for with regular expressions in order to make full use of this technique. Alternatives to regular expressions remain largely academic. Data mining, automated corpus construction, and neural networks, to name the highest-profile ones, only produce probabilistic results and are either difficult or impossible to alter in any deterministic way. Policies are therefore not supported under these alternatives. This thesis explores a new architecture which utilises rich metadata in order to avoid the burden of message interpretation. The metadata itself is based on an intention to improve end-to-end communication and reduce ambiguity. A simple yet effective filtering scheme is also presented which filters log messages through a short and easily-customisable set of rules. With such an architecture, it is envisaged that systems administrators could significantly improve their awareness of their systems while avoiding many of the false-positives and -negatives which plague today's filtering solutions.
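
    The thesis's actual metadata schema and rule syntax are not given in the abstract; the Python sketch below only illustrates the general idea it describes: matching a short, customisable rule set against structured metadata fields attached to a message, rather than regex-matching the free-text body. All field names and rule contents are illustrative assumptions.

```python
# Illustration of metadata-driven filtering: rules constrain structured fields,
# not the free-text message body. Field names and rules are hypothetical.

messages = [
    {"severity": "critical", "subsystem": "raid", "host": "db01", "text": "disk 3 failed"},
    {"severity": "info", "subsystem": "cron", "host": "web02", "text": "job finished"},
]

# A short, easily customisable rule set: each rule is a set of field constraints
# plus the action to take when every constraint matches.
rules = [
    ({"severity": "critical"}, "page_oncall"),
    ({"subsystem": "raid"}, "email_storage_team"),
    ({}, "archive"),  # default rule: everything else is archived quietly
]

def dispatch(message: dict) -> str:
    """Return the action of the first rule whose constraints all match."""
    for constraints, action in rules:
        if all(message.get(field) == value for field, value in constraints.items()):
            return action
    return "drop"

for m in messages:
    print(m["text"], "->", dispatch(m))
```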

    Internet... the final frontier: an ethnographic account; exploring the cultural space of the net from the inside

    Full text link
    "The research project 'The Internet as a space for interaction', which completed its mission in Autumn 1998, studied the constitutive features of network culture and network organisation. Special emphasis was given to the dynamic interplay of technical and social conventions regarding both the net's organisation as well as its change. The ethnographic perspective chosen studied the Internet from the inside. Research concentrated upon three fields of study: the hegemonial operating technology of net nodes (UNIX) the network’s basic transmission technology (the Internet Protocol IP) and a popular communication service (Usenet). The project's final report includes the results of the three branches explored. Drawing upon the development in the three fields it is shown that changes that come about on the Net are neither anarchic nor arbitrary. Instead, the decentrally organised Internet is based upon technically and organisationally distributed forms of coordination within which individual preferences collectively attain the power of developing into definitive standards." (author's abstract)"Das im Herbst 1998 abgeschlossene Forschungsprojekt 'Interaktionsraum Internet' hat sich mit den konstitutiven Merkmalen der Netzkultur und Netzwerkorganisation beschäftigt. Im Vordergrund des Interesses stand das dynamische Zusammenspiel technischer und gesellschaftlicher Konventionen in der Organisation wie auch im Wandel des Netzes. Die ethnographisch angeleitete Binnenperspektive auf das Internet konzentrierte sich auf drei ausgewählte Bereiche, um Prozesse der Institutionenbildung und die Formen ihrer Transformation zu studieren: die hegemoniale Betriebstechnik der Netzknoten (UNIX), die grundlegende Übertragungstechnik im Netz (das Internet Protokoll IP) und einen populären Kommunikationsdienst (Usenet). Der Schlußbericht des Projekts enthält die Ergebnisse der drei Untersuchungsstränge. Gezeigt wird anhand der Entwicklung in den drei Feldern, daß sich der Wandel des Netzes weder beliebig noch anarchisch vollzieht. Das dezentral organisierte Internet beruht vielmehr auf technisch wie organisatorisch verteilten Formen der Koordination, in denen individuelle Handlungspräferenzen kollektiv definitionsmächtig werden." (Autorenreferat