
    NSA Suite-B Cryptography algorithms

    This bachelor's thesis, entitled NSA Suite B Cryptography Algorithms, deals with the algorithms contained in the Suite B set, which serve to increase data protection and security during transmission over the unsecured environment of the Internet. It analyzes these algorithms, covering encryption, decryption, hash creation, key exchange, and the creation and verification of digital signatures. The algorithms are compared with others, both obsolete and still in use, and several protocols that rely on these algorithms are then described. The output of the practical part is a web presentation that can also serve as teaching material.
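
    As a hedged illustration of the primitives this thesis surveys, the sketch below exercises the Suite B building blocks at the 128-bit level: ECDH key exchange over P-256, AES-128 in GCM mode, SHA-256 hashing, and ECDSA signing and verification. It uses the Python cryptography package; the HKDF step, the sample message, and all parameter choices are this sketch's assumptions, not material from the thesis.

    ```python
    # A minimal sketch of the Suite B primitives, assuming the Python
    # "cryptography" package. Parameter choices are illustrative.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    message = b"example message"  # placeholder plaintext

    # Key exchange: ECDH over P-256 (Suite B at the 128-bit level).
    alice = ec.generate_private_key(ec.SECP256R1())
    bob = ec.generate_private_key(ec.SECP256R1())
    shared = alice.exchange(ec.ECDH(), bob.public_key())

    # Derive a 128-bit AES key from the shared secret (HKDF is one common
    # choice; Suite B protocol profiles specify their own KDFs).
    key = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
               info=b"demo").derive(shared)

    # Encryption and decryption: AES-128 in GCM mode.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, message, None)
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == message

    # Hash: SHA-256.
    digest = hashes.Hash(hashes.SHA256())
    digest.update(message)
    h = digest.finalize()

    # Signature: ECDSA with SHA-256 over P-256, then verification
    # (verify() raises InvalidSignature on failure).
    signature = alice.sign(message, ec.ECDSA(hashes.SHA256()))
    alice.public_key().verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("hash:", h.hex()[:16], "...; signature verified")
    ```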

    Investigation into the impacts of migration to emergent NSA Suite B encryption standards

    As information sharing becomes increasingly necessary for mission accomplishment within the Department of Defense, the rules for protecting information have tightened. The sustained and rapid advancement of information technology in the 21st century dictates the adoption of a flexible and adaptable cryptographic strategy for protecting national security information. RSA techniques, while formidable, have begun to present vulnerabilities to the raw computing power that is commercially available today. This thesis is a comprehensive characterization of the current state of the art in DoD encryption standards. It emphasizes the mathematical algorithms that underlie legacy encryption and its proposed NSA Suite B replacements, and examines how the new technology addresses threats and vulnerabilities that legacy methods do not fully mitigate. It then weighs the security capabilities of the NSA Suite B standards against the manpower and monetary costs of implementing them, and suggests how best to utilize NSA Suite B technology to provide confidentiality, integrity, and availability in an environment with real-world threats.
    http://archive.org/details/investigationint109454675
    Department of Defense author (civilian). Approved for public release; distribution is unlimited.
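
    One concrete way to see the migration trade-off this thesis weighs: NIST SP 800-57 pairs RSA-3072 with 256-bit elliptic curves at the 128-bit security level, and the two differ sharply in key-generation cost. The sketch below times both with the Python cryptography package; the run count is arbitrary and any timings it prints are illustrative, not measurements from the thesis.

    ```python
    # A rough sketch comparing key-generation cost for the legacy and
    # Suite B options at the 128-bit equivalent strength. Absolute
    # timings depend on the machine.
    import time
    from cryptography.hazmat.primitives.asymmetric import ec, rsa

    def time_keygen(label, fn, runs=5):
        start = time.perf_counter()
        for _ in range(runs):
            fn()
        avg_ms = (time.perf_counter() - start) / runs * 1000
        print(f"{label}: {avg_ms:.1f} ms per key (avg of {runs})")

    # Legacy: RSA at the 128-bit equivalent strength (NIST SP 800-57).
    time_keygen("RSA-3072", lambda: rsa.generate_private_key(
        public_exponent=65537, key_size=3072))

    # Suite B replacement: an ECDH/ECDSA key on P-256.
    time_keygen("EC P-256", lambda: ec.generate_private_key(ec.SECP256R1()))
    ```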

    Scaling gridded river networks for macroscale hydrology: Development, analysis, and control of error

    A simple and robust river network scaling algorithm (NSA) is presented to rescale fine-resolution networks to any coarser resolution. The algorithm was tested over the Danube River basin and the European continent. Coarse-resolution networks, at 2.5, 5, 10, and 30 min resolutions, were derived from higher-resolution gridded networks using NSA and geomorphometric attributes, such as river order, shape index, and width function. These parameters were calculated and compared at each resolution. Simple scaling relationships were found to predict decreasing river lengths with coarser-resolution data; this relationship can be used to correct river length as a function of grid resolution. The length-corrected width functions of the major river basins in Europe were compared at different resolutions to assess river network performance. The discretization errors in representing basin area and river lengths at coarser resolutions were analyzed, and simple relationships were found to calculate the minimum number of grid cells needed to maintain the catchment area and length within a desired level of accuracy. The relationship among geomorphological characteristics, such as shape index and width function (derived from gridded networks at different resolutions), suggests that a minimum of 200–300 grid cells is necessary to maintain the geomorphological characteristics of the river networks with sufficient accuracy.
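
    To make the length-correction idea concrete, the sketch below applies a power-law rescaling of gridded river length as a function of resolution. Both the functional form and the exponent are hypothetical placeholders for the relationship the paper actually fits; they only illustrate how such a correction would be applied.

    ```python
    # A hedged sketch of a resolution-dependent river-length correction.
    # The power-law form and the exponent ALPHA are illustrative
    # assumptions, not the paper's fitted relationship.
    ALPHA = 0.05  # hypothetical scaling exponent

    def corrected_length(grid_length_km, resolution_min, reference_min=2.5):
        """Scale a length measured on a coarse grid back toward its
        estimate at a finer reference resolution."""
        return grid_length_km * (resolution_min / reference_min) ** ALPHA

    for res in (2.5, 5, 10, 30):
        print(f"{res:>4} min: 1000 km measured -> "
              f"{corrected_length(1000.0, res):.0f} km corrected")
    ```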

    On the Design of LIL Tests for (Pseudo) Random Generators and Some Experimental Results

    NIST SP800-22 (2010) proposes the state-of-the-art testing suite for (pseudo) random generators to detect deviations of a binary sequence from randomness. On the one hand, as a counterexample to the NIST SP800-22 test suite, it is easy to construct functions that the suite considers GOOD pseudorandom generators even though their output is easily distinguishable from the uniform distribution; these functions are therefore not pseudorandom generators by definition. On the other hand, NIST SP800-22 does not cover some of the important laws of randomness. Two fundamental limit theorems about random binary strings are the central limit theorem and the law of the iterated logarithm (LIL). Several frequency-related tests in NIST SP800-22 cover the central limit theorem, while no NIST SP800-22 test covers the LIL. This paper proposes techniques to address these challenges. First, we propose statistical-distance-based testing techniques for (pseudo) random generators to reduce the above-mentioned Type II errors in the NIST SP800-22 test suite. Second, we propose LIL-based statistical testing techniques, calculate the relevant probabilities, and carry out experimental tests on widely used pseudorandom generators by generating around 30 TB of pseudorandom sequences. The experimental results show that for a sample size of 1,000 sequences (2 TB), the statistical distance between the generated sequences and the uniform distribution is around 0.07 (where 0 means statistically indistinguishable and 1 means completely distinguishable) and the root-mean-square deviation is around 0.005.
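
    The LIL statistic itself is straightforward to compute, which is what makes it attractive as a test: map bits to ±1 steps, take the partial sum S_n, and normalize by sqrt(2 n ln ln n); for a truly random sequence the normalized value should asymptotically stay within [-1, 1]. The sketch below computes it for a stand-in PRNG; the generator and the sequence length are illustrative choices, not the paper's experimental setup.

    ```python
    # A minimal sketch of the LIL statistic: S_n / sqrt(2 n ln ln n),
    # where S_n is the ±1 partial sum of the bit sequence.
    import math
    import random

    def lil_statistic(bits):
        n = len(bits)
        s = sum(2 * b - 1 for b in bits)  # partial sum of ±1 steps
        return s / math.sqrt(2 * n * math.log(math.log(n)))

    rng = random.Random(42)  # stand-in PRNG under test
    bits = [rng.getrandbits(1) for _ in range(1 << 20)]
    print(f"S_lil = {lil_statistic(bits):+.4f}")  # expect a value in [-1, 1]
    ```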

    RAPTOR: Routing Attacks on Privacy in Tor

    The Tor network is a widely used system for anonymous communication. However, Tor is known to be vulnerable to attackers who can observe traffic at both ends of the communication path. In this paper, we show that prior attacks are just the tip of the iceberg. We present a suite of new attacks, called Raptor, that can be launched by Autonomous Systems (ASes) to compromise user anonymity. First, AS-level adversaries can exploit the asymmetric nature of Internet routing to increase the chance of observing at least one direction of user traffic at both ends of the communication. Second, AS-level adversaries can exploit natural churn in Internet routing to lie on the BGP paths for more users over time. Third, strategic adversaries can manipulate Internet routing via BGP hijacks (to discover the users using specific Tor guard nodes) and interceptions (to perform traffic analysis). We demonstrate the feasibility of Raptor attacks by analyzing historical BGP data and Traceroute data, as well as by performing real-world attacks on the live Tor network while ensuring that we do not harm real users. In addition, we outline the design of two monitoring frameworks to counter these attacks: BGP monitoring to detect control-plane attacks, and Traceroute monitoring to detect data-plane anomalies. Overall, our work motivates the design of anonymity systems that are aware of the dynamics of Internet routing.
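
    As a hedged sketch of the BGP-monitoring defense outlined above, the snippet below flags announcements whose origin AS differs from a recorded baseline for any prefix covering a Tor guard relay, including more-specific hijacks. The guard table, feed format, and AS numbers are hypothetical placeholders, not the authors' monitoring framework.

    ```python
    # A toy control-plane check: alert when a prefix covering a guard
    # relay is announced from an unexpected origin AS. All addresses and
    # AS numbers below are hypothetical.
    import ipaddress

    GUARD_PREFIXES = {  # guard-covering prefix -> baseline origin AS
        ipaddress.ip_network("198.51.100.0/24"): 64500,
    }

    def check_announcement(prefix_str, origin_as):
        announced = ipaddress.ip_network(prefix_str)
        for guard_prefix, expected_as in GUARD_PREFIXES.items():
            if announced.overlaps(guard_prefix) and origin_as != expected_as:
                print(f"ALERT: {prefix_str} originated by AS{origin_as} "
                      f"(baseline AS{expected_as} for {guard_prefix})")

    check_announcement("198.51.100.0/25", 64666)  # more-specific hijack
    check_announcement("198.51.100.0/24", 64500)  # matches baseline: quiet
    ```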

    Comparison of boreal ecosystem model sensitivity to variability in climate and forest site parameters

    Ecosystem models are useful tools for evaluating environmental controls on carbon and water cycles under past or future conditions. In this paper we compare annual carbon and water fluxes from nine boreal spruce forest ecosystem models in a series of sensitivity simulations. For each comparison, a single climate driver or forest site parameter was altered in a separate sensitivity run. Driver and parameter changes were prescribed principally to be large enough to identify and isolate any major differences in model responses, while also remaining within the range of variability that the boreal forest biome may be exposed to over a time period of several decades. The models simulated plant production, autotrophic and heterotrophic respiration, and evapotranspiration (ET) for a black spruce site in the boreal forest of central Canada (56°N). Results revealed that there were common model responses in gross primary production, plant respiration, and ET fluxes to prescribed changes in air temperature or surface irradiance and to decreased precipitation amounts. The models were also similar in their responses to variations in canopy leaf area, leaf nitrogen content, and surface organic layer thickness. The models had different sensitivities to certain parameters, namely the net primary production response to increased CO2 levels, and the response of soil microbial respiration to precipitation inputs and soil wetness. These differences can be explained by the type (or absence) of photosynthesis-CO2 response curves in the models and by response algorithms of litter and humus decomposition to drying effects in organic soils of the boreal spruce ecosystem. Differences in the couplings of photosynthesis and soil respiration to nitrogen availability may also explain divergent model responses. Sensitivity comparisons imply that past conditions of the ecosystem represented in the models' initial standing wood and soil carbon pools, including historical climate patterns and the time since the last major disturbance, can be as important as potential climatic changes to prediction of the annual ecosystem carbon balance in this boreal spruce forest.
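
    The divergence attributed to photosynthesis-CO2 response curves can be illustrated with a toy comparison: a model with a saturating CO2 response predicts a modest GPP gain under doubled CO2, while a model with no CO2 coupling predicts none. The rectangular-hyperbola form and the parameter values below are illustrative assumptions, not taken from any of the nine models compared.

    ```python
    # A toy sensitivity comparison for two hypothetical CO2 couplings.
    GPP_MAX = 800.0  # hypothetical saturated GPP, g C m^-2 yr^-1
    K_HALF = 250.0   # hypothetical half-saturation CO2, ppm

    def gpp_saturating(co2_ppm):
        """Rectangular-hyperbola (Michaelis-Menten style) CO2 response."""
        return GPP_MAX * co2_ppm / (co2_ppm + K_HALF)

    def gpp_uncoupled(co2_ppm, baseline_ppm=360.0):
        """A model with no direct CO2 response: GPP stays at baseline."""
        return gpp_saturating(baseline_ppm)

    for co2 in (360.0, 720.0):  # ambient vs. doubled CO2
        print(f"{co2:5.0f} ppm: saturating {gpp_saturating(co2):6.1f}, "
              f"uncoupled {gpp_uncoupled(co2):6.1f}")
    ```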

    Services to Industry by Libraries of Federal Government Agencies

    Published or submitted for publication.

    Cross-enterprise access control security for electronic health records: Technical, practical and legislation impact

    In this thesis we investigate the relationships among security, privacy, legislation, and computational power in the context of Cross-Enterprise User Assertions (XUA), which allows us to develop recommendations for the appropriate architecture, functionality, cryptographic algorithms, and key lengths. The evolution of health records from paper to electronic media promises to be an important part of improving the quality of health care. The diversity of organizations, systems, geography, laws, and regulations creates a significant challenge for ensuring the privacy of Electronic Health Records (EHRs) while maintaining availability. XUA is a technology that attempts to address the problem of sharing EHRs across enterprise boundaries. We rely on NSA Suite B cryptography to provide the fundamental framework of minimum security requirements at the 128-bit security level. We also recommend the use of the National Institute of Standards and Technology's (NIST) FIPS 140-2 specification to establish confidence in the software's security features.
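
    A minimal sketch of the 128-bit floor this thesis recommends, again with the Python cryptography package: AES-128-GCM for confidentiality of the record and an ECDSA P-256 signature binding it to its issuer. The field names and payload are hypothetical, and a real deployment would also require a FIPS 140-2 validated module, which this sketch does not provide.

    ```python
    # A hedged sketch of a 128-bit Suite B profile for an EHR payload.
    # Payload contents and associated data are hypothetical placeholders.
    import json
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    record = json.dumps({"patient_id": "hypothetical-123",
                         "note": "example EHR payload"}).encode()

    # Confidentiality: AES-128-GCM (16-byte key = 128-bit level).
    key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, record, b"xua-assertion")

    # Issuer authentication: ECDSA over P-256 with SHA-256 on the
    # ciphertext; verify() raises InvalidSignature on failure.
    issuer_key = ec.generate_private_key(ec.SECP256R1())
    signature = issuer_key.sign(ciphertext, ec.ECDSA(hashes.SHA256()))
    issuer_key.public_key().verify(signature, ciphertext,
                                   ec.ECDSA(hashes.SHA256()))
    print("record encrypted; issuer signature verified")
    ```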