4,902 research outputs found

    Government mandated blocking of foreign Web content

    Full text link
    Blocking of foreign Web content by Internet access providers has been a hot topic in Germany for the last 18 months. Since fall 2001, the state of North-Rhine-Westphalia has been very actively trying to mandate such blocking. This paper takes a technical view of the problems posed by the blocking orders, and by blocking content at the access- or network-provider level in general. It also gives some empirical data on the effects of the blocking orders to help in their legal assessment. Comment: Preprint, revised 30.6.200
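    One way such empirical data on provider-level blocking is commonly gathered (the paper's own methodology may differ) is to compare the DNS answer a provider's resolver returns for a blocked hostname against the answer from an independent resolver. A minimal sketch, in which the block-page IP is a made-up placeholder from a reserved test range:

```python
# Hypothetical IP of a provider's block page (TEST-NET-1 range, for illustration).
BLOCK_PAGE_IPS = {"192.0.2.1"}

def classify_answer(provider_ip: str, reference_ip: str) -> str:
    """Label a resolver's DNS answer as 'blocked', 'diverged', or 'consistent'."""
    if provider_ip in BLOCK_PAGE_IPS:
        return "blocked"      # resolver redirects to a known block page
    if provider_ip != reference_ip:
        return "diverged"     # answers differ; needs manual review
    return "consistent"       # no evidence of DNS manipulation

print(classify_answer("192.0.2.1", "203.0.113.7"))    # blocked
print(classify_answer("203.0.113.7", "203.0.113.7"))  # consistent
```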

    Information Outlook, October 2004

    Get PDF
    Volume 8, Issue 10

    Sending Jobs Overseas: The Cost to America’s Economy and Working Families

    Get PDF
    [Excerpt] This report was created by Working America and the AFL-CIO as a companion piece to Working America’s Job Tracker, a ZIP code–searchable database of jobs exported (as well as Occupational Safety and Health Act violations and other workplace issues). Users can search their area for companies that have sent jobs overseas. Though Job Tracker is one of the largest publicly available, fully searchable records of the extent and specifics of outsourcing, it only reveals the tip of the iceberg. This report and Job Tracker contextualize each other: Job Tracker by mapping specific job losses due to outsourcing, the report by taking a broad view of the national-level numbers that are available and offering case studies of key industries.

    Integrating Real Life Cases Into A Security System: Seven Checklists For Managers

    Get PDF
    This paper examines seven recent real-life cases of computer and network security breaches, vulnerabilities, and successful security enforcement, and then proposes seven checklists for managers to consider when designing a security system. The checklists cover (1) understanding the landscape of computer and network security, (2) putting together the basic safeguards, (3) identifying security threats, (4) identifying security measures and enforcement, (5) understanding the services of a computer emergency response team, (6) preparing a comprehensive security system, and (7) business continuity planning. If these checklists are followed, they should increase the chances of successfully designing and implementing a security system.

    MementoMap: A Web Archive Profiling Framework for Efficient Memento Routing

    Get PDF
    With the proliferation of public web archives, it is becoming more important to better profile their contents, both to understand their immense holdings as well as to support routing of requests in Memento aggregators. A memento is a past version of a web page and a Memento aggregator is a tool or service that aggregates mementos from many different web archives. To save resources, the Memento aggregator should only poll the archives that are likely to have a copy of the requested Uniform Resource Identifier (URI). Using the Crawler Index (CDX), we generate profiles of the archives that summarize their holdings and use them to inform routing of the Memento aggregator’s URI requests. Additionally, we use full text search (when available) or sample URI lookups to build an understanding of an archive’s holdings. Previous work in profiling ranged from using full URIs (no false positives, but with large profiles) to using only top-level domains (TLDs) (smaller profiles, but with many false positives). This work explores strategies in between these two extremes. For evaluation we used CDX files from Archive-It, UK Web Archive, Stanford Web Archive Portal, and Arquivo.pt. Moreover, we used web server access log files from the Internet Archive’s Wayback Machine, UK Web Archive, Arquivo.pt, LANL’s Memento Proxy, and ODU’s MemGator Server. In addition, we utilized a historical dataset of URIs from DMOZ. In early experiments with various URI-based static profiling policies we successfully identified about 78% of the URIs that were not present in the archive with less than 1% relative cost as compared to the complete knowledge profile, and 94% of the URIs with less than 10% relative cost, without any false negatives. In another experiment we found that we can correctly route 80% of the requests while maintaining about 0.9 recall by discovering only 10% of the archive holdings and generating a profile that costs less than 1% of the complete knowledge profile.
We created MementoMap, a framework that allows web archives and third parties to express holdings and/or voids of an archive of any size with varying levels of detail to fulfil various application needs. Our archive profiling framework enables tools and services to predict and rank archives where mementos of a requested URI are likely to be present. In static profiling policies, we predefined for each policy the maximum depth of host and path segments of URIs used as URI keys. This gave us a good baseline for evaluation, but was not suitable for merging profiles with different policies. Later, we introduced a more flexible means to represent URI keys that uses wildcard characters to indicate whether a URI key was truncated. Moreover, we developed an algorithm to roll up URI keys dynamically at arbitrary depths when sufficient archiving activity is detected under certain URI prefixes. In an experiment with dynamic profiling of archival holdings we found that a MementoMap of less than 1.5% relative cost can correctly identify the presence or absence of 60% of the lookup URIs in the corresponding archive without any false negatives (i.e., 100% recall). In addition, we separately evaluated archival voids based on the most frequently accessed resources in the access log and found that we could have avoided more than 8% of the false positives without introducing any false negatives. We defined a routing score that can be used for Memento routing. Using a cut-off threshold technique on our routing score we achieved over 96% accuracy if we accept about 89% recall, and for a recall of 99% we managed to get about 68% accuracy, which translates to about 72% saving in wasted lookup requests in our Memento aggregator.
Moreover, when using top-k archives based on our routing score for routing and choosing only the topmost archive, we missed only about 8% of the sample URIs that are present in at least one archive, but when we selected top-2 archives, we missed less than 2% of these URIs. We also evaluated a machine learning-based routing approach, which resulted in better overall accuracy, but poorer recall due to low prevalence of the sample lookup URI dataset in different web archives. We contributed various algorithms, such as a space- and time-efficient approach to ingest large lists of URIs to generate MementoMaps and a Random Searcher Model to discover samples of holdings of web archives. We contributed numerous tools to support various aspects of web archiving and replay, such as MemGator (a Memento aggregator), InterPlanetary Wayback (a novel archival replay system), Reconstructive (a client-side request rerouting ServiceWorker), and AccessLog Parser. Moreover, this work yielded a file format specification draft called Unified Key Value Store (UKVS) that we use for serialization and dissemination of MementoMaps. It is a flexible and extensible file format that allows easy interactions with Unix text processing tools. UKVS can be used in many applications beyond MementoMaps.
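    The static profiling policies described above can be sketched as a key-generation function: reverse the hostname (SURT-style), keep at most a fixed number of host and path segments, and mark truncation with a wildcard. This is an illustrative approximation, not MementoMap's exact key format; the depth parameters and wildcard convention here are assumptions.

```python
from urllib.parse import urlsplit

def uri_key(uri: str, host_depth: int = 2, path_depth: int = 1) -> str:
    """Sketch of a static profiling policy: build a truncated URI key."""
    parts = urlsplit(uri)
    host = parts.hostname.split(".")[::-1]            # example.com -> [com, example]
    path_segs = [s for s in parts.path.split("/") if s]
    key = ",".join(host[:host_depth])
    if path_segs[:path_depth]:
        key += "/" + "/".join(path_segs[:path_depth])
    if len(host) > host_depth or len(path_segs) > path_depth:
        key += "/*"                                   # wildcard marks truncation
    return key

print(uri_key("https://www.example.com/a/b/c"))  # com,example/a/*
print(uri_key("http://example.com/"))            # com,example
```

    Keys of this shape let an aggregator test a lookup URI against an archive's profile by prefix match instead of storing every full URI.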

    Cache Conscious Data Layouting for In-Memory Databases

    Get PDF
    Many applications with manually implemented data management exhibit a data storage pattern in which semantically related data items are stored closer together in memory than unrelated data items. The strong semantic relationship between these data items commonly induces accesses to them that are close together in time. This is called the principle of data locality and has been recognized by hardware vendors. It is commonly exploited to improve the performance of hardware. General Purpose Database Management Systems (DBMSs), whose main goal is to simplify optimal data storage and processing, generally fall short of this claim because the usage pattern of the stored data cannot be anticipated when designing the system. The current interest in column-oriented databases indicates that one strategy does not fit all applications. A DBMS that automatically adapts its storage strategy to the workload of the database promises a significant performance increase by maximizing the benefit of hardware optimizations that are based on the principle of data locality. This thesis gives an overview of optimizations that are based on the principle of data locality and the effect they have on the data access performance of applications. Based on the findings, a model is introduced that allows an estimation of the costs of data accesses based on the arrangement of the data in main memory. This model is evaluated through a series of experiments and incorporated into an automatic layouting component for a DBMS. This layouting component allows the calculation of an analytically optimal storage layout. The performance benefits brought by this component are evaluated in an application benchmark.
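    The intuition behind such a locality-based cost model can be illustrated with a toy calculation (not the thesis's actual model; the cache-line and field sizes are assumed typical values): estimate how many cache lines a full scan of one attribute touches under a columnar versus a row-wise layout.

```python
import math

CACHE_LINE = 64   # bytes per cache line (typical value, assumed)
FIELD_SIZE = 8    # bytes per attribute value (assumed)

def cache_lines_touched(n_rows: int, n_cols: int, columnar: bool) -> int:
    """Toy cost estimate: cache lines touched when scanning ONE attribute."""
    if columnar:
        # The scanned attribute's values are stored contiguously.
        return math.ceil(n_rows * FIELD_SIZE / CACHE_LINE)
    # Row store: consecutive values of the attribute lie one row apart.
    stride = n_cols * FIELD_SIZE
    if stride >= CACHE_LINE:
        return n_rows                 # every access hits a fresh cache line
    return math.ceil(n_rows * stride / CACHE_LINE)

print(cache_lines_touched(10_000, 16, columnar=True))   # 1250
print(cache_lines_touched(10_000, 16, columnar=False))  # 10000
```

    For a wide-row scan of a single attribute the columnar layout touches an eighth of the memory in this toy setup, which is the kind of asymmetry an automatic layouting component can exploit when the workload is known.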

    Design of an Intrusion Detection System (IDS) and an Intrusion Prevention System (IPS) for the EIU Cybersecurity Laboratory

    Get PDF
    Cyber Security will remain a subject of discussion for a long time to come. Research has shown massive growth in cyber-crime, while the number of Cyber Security experts currently available to counter it is limited. Although there are multiple resources discussing Cyber Security, access to training in practical applications is limited. As an institution, Eastern Illinois University (EIU) is set to start a Master of Science in Cyber Security in Fall 2017. The challenge, then, is how EIU will expose students to the practical reality of Cyber Security, where they can learn different detection, prevention and incident analysis techniques for cyber-attacks. In addition, students should have the opportunity to study cyber-attacks legally. This research proposes a solution for these needs by focusing on the design of a firewall architecture with an Intrusion Detection System (IDS) and an Intrusion Prevention System (IPS) for the EIU Cyber Security Laboratory. This thesis explores different up-to-date techniques and methods for the detection and prevention of cyber-attacks. The overall outcome of this research is the design of a public testing site that invites hackers to attack it for the purposes of detection, prevention and security incident analysis. This public firewall might empower students and instructors with practical cyber-attacks, detection techniques, prevention techniques, and forensic analysis tools. It may also provide the knowledge required for further research in the field of Cyber Security.
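    At its simplest, the detection side of such a laboratory setup is signature matching: an IDS compares traffic payloads against known attack patterns and raises an alert on a hit. A minimal sketch of that idea (the signatures and names below are hypothetical; real engines such as Snort or Suricata are far richer):

```python
# Hypothetical signature table: name -> byte pattern to look for in payloads.
SIGNATURES = {
    "sql_injection": b"' OR '1'='1",
    "path_traversal": b"../../",
}

def inspect(payload: bytes) -> list:
    """Return the names of all signatures matched by a packet payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern in payload]

alerts = inspect(b"GET /index.php?id=../../etc/passwd HTTP/1.1")
print(alerts)  # ['path_traversal']
```

    An IPS differs mainly in placement and action: it sits inline and can drop the matching packet instead of only alerting.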

    Productivity improvements at small manufacturing company’s shop-floor

    Get PDF
    An effort to increase and improve productivity had been started at the case company where this thesis was conducted. Various productivity measures and improvements are currently being studied to achieve leaner production and an easier material flow. In productivity improvement, the ultimate goal is to seek out new manufacturing processes that can increase the company’s overall efficiency in order to stay competitive in the market. This can be accomplished by reducing or eliminating losses that occur mechanically or through a bad operational process. The aim of this thesis is to develop a simple productivity measurement for the manufacturing machines and to make a bidding calculator, or sales tool, to speed the rate at which bids are set. The tasks to be performed were set using literature from previous studies and unstructured interviews with managers, supervisors and employees. A recording sheet was created to record the rate at which the machines on the shop-floor were being utilized; we named this the machine-hour study. This study helps us understand the current state of utilization so as to identify measures to increase productivity. The second task was to modify the bidding calculator, or sales tool, to be more user-friendly. Lastly, the author was to recommend other projects that could help improve productivity on the shop-floor. The significant result is that the machines used in manufacturing were being underutilized, which was confirmed by the supervising managers on the shop-floor. Some of the reasons for this result are too many setups in a working day, long searches for tools, not enough work, etc.
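    The machine-hour study reduces to a simple ratio: productive hours recorded on the sheet divided by the hours the machine was available. The formula and the example figures below are illustrative; the thesis's actual recording sheet may compute utilization differently.

```python
def utilization(productive_hours: float, available_hours: float) -> float:
    """Machine-hour utilization: share of available time spent producing."""
    if available_hours <= 0:
        raise ValueError("available_hours must be positive")
    return productive_hours / available_hours

# e.g. a machine that ran 26 of 40 scheduled hours in a week
print(f"{utilization(26, 40):.0%}")  # 65%
```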