
    Computing and evolving variants of computational depth

    The structure and organization of information in binary strings and (infinite) binary sequences are investigated using two computable measures of complexity related to computational depth. First, fundamental properties of recursive computational depth, a refinement of Bennett's original notion of computational depth, are developed, and it is shown that the recursively weakly (respectively, strongly) deep sequences form a proper subclass of the class of weakly (respectively, strongly) deep sequences. It is then shown that every weakly useful sequence is recursively strongly deep, strengthening a theorem of Juedes, Lathrop, and Lutz. It follows from these results that not every strongly deep sequence is weakly useful, thereby answering an open question posed by Juedes.

    Second, compression depth, a feasibly computable depth measurement, is developed based on the Lempel-Ziv compression algorithm. LZ compression depth is further formalized by introducing strongly (compression) deep sequences and showing that analogues of the main properties of computational depth hold for compression depth. Critical to these results, it is shown that a sequence that is not normal must be compressible by the Lempel-Ziv algorithm. This yields a new, simpler proof that the Champernowne sequence is normal.

    Compression depth is also used to measure the organization of genes in genetic algorithms. Using finite-state machines to control the actions of automata playing prisoner's dilemma, a genetic algorithm evolves a population of finite-state machines (players) that play prisoner's dilemma against one another. Since the fitness function is based solely on how well a player performs against all other players in the population, any accumulation of compression depth (organization) in the genetic structure of a player can only be attributed to more fit players having a more highly organized genetic structure. It is shown experimentally that this is the case.
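    As an illustrative sketch (not the dissertation's actual formalization of compression depth), the Lempel-Ziv idea underlying such compressibility measures can be seen in LZ78-style distinct-phrase parsing: a string is scanned left to right and cut into the shortest phrases not seen before, and fewer phrases means a more compressible string. The function name and details below are my own.

```python
def lz78_phrase_count(s: str) -> int:
    """Count the phrases in an LZ78-style parsing of s.

    Each phrase is the shortest prefix of the remaining input that has
    not yet occurred as a phrase; highly regular strings yield far
    fewer phrases than "random-looking" strings of the same length.
    """
    phrases = set()
    count = 0
    i = 0
    n = len(s)
    while i < n:
        j = i + 1
        # Extend the candidate phrase until it is new (or the input ends).
        while j <= n and s[i:j] in phrases:
            j += 1
        phrases.add(s[i:j])
        count += 1
        i = j
    return count

# A repetitive 64-character string parses into far fewer phrases than
# a 64-character string over a larger alphabet.
repetitive = "ab" * 32
varied = "abcdefghijklmnop" * 4
```

    In this toy measure, a sequence that is "deep" in the compression sense would show phrase counts (organization) intermediate between trivially repetitive and incompressible strings.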

    Clustering by means of a Boltzmann machine with partial constraint satisfaction

    The clustering problem refers to the partitioning of target sightings into sets. Two sightings are in the same set if and only if they are generated by sensor detections of the same target and are in the same great circle arc (GARC) trajectory of that target. A Boltzmann machine is developed whose sparse architecture provides for only partial constraint satisfaction of the associated cost function. This, together with a special graphics interface, serves as an aid in determining GARCs. Our approach differs from others in that the neural net is built to operate in conjunction with a non-neural tracker. This further restricts the architectural complexity of the network and facilitates future experimentation regarding decomposition of the neural net across several Von Neumann processors. The Boltzmann machine architecture also eases the effort of finding optimal or near-optimal solutions. Results are presented. The demonstrated feasibility of neural GARC determination encourages investigation into extending its role in the track formation process, utilizing an environment that includes supercomputers, neurocomputers, or optical hardware. The network architecture is capable of identifying a host of geometric forms other than GARCs and can thus be used in several domains, including space, land, and ocean.
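    The Boltzmann machine dynamics described above can be sketched in miniature. The network below is a generic binary-unit Boltzmann machine with annealed stochastic updates, not the paper's GARC network; the weights, temperature schedule, and function names are illustrative assumptions.

```python
import math
import random

def boltzmann_step(state, weights, biases, T):
    """One asynchronous stochastic update of a binary (0/1) Boltzmann machine.

    A randomly chosen unit turns on with probability sigmoid(net / T);
    lowering the temperature T biases the network toward low-energy
    states, i.e. states satisfying more of the cost function's constraints.
    """
    i = random.randrange(len(state))
    net = biases[i] + sum(weights[i][j] * state[j]
                          for j in range(len(state)) if j != i)
    p_on = 1.0 / (1.0 + math.exp(-net / T))
    state[i] = 1 if random.random() < p_on else 0

def anneal(state, weights, biases, temps, steps_per_temp=50):
    """Run stochastic updates over a decreasing temperature schedule."""
    for T in temps:
        for _ in range(steps_per_temp):
            boltzmann_step(state, weights, biases, T)
    return state

# Toy constraint: mutual inhibition (negative weights) plus a positive
# bias pushes the network toward turning on as few units as possible
# while not staying all-off ("winner take all").
random.seed(0)
n = 4
weights = [[0.0 if i == j else -2.0 for j in range(n)] for i in range(n)]
biases = [1.0] * n
state = [random.randrange(2) for _ in range(n)]
final = anneal(state, weights, biases, temps=[4.0, 2.0, 1.0, 0.5, 0.1])
```

    A sparse architecture in the paper's sense would zero out most off-diagonal weights, so each update touches only a unit's few neighbors; that is the property that keeps only partial constraint satisfaction in play.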

    Extending the Cyber Capabilities of Small to Midsize Businesses

    This project explores disparities in the cybersecurity practices of small to midsize businesses (SMBs) in comparison to larger organizations with more resources to allocate to cybersecurity. While the adoption of technical solutions offers many advantages, SMBs are struggling to maintain good cybersecurity practices in this era of digital transformation. Considering the overall security climate, it is clear that SMBs are vulnerable to cyber threats, are being attacked more often, and lack the resources or knowledge to effectively address threats. This paper proposes a model for SMBs to enhance their cyber capabilities through cybersecurity assessments and regular training provided by the National Guard's Defensive Cyber Operations Element (DCO-E). Leveraging the capabilities of the DCO-E, in effect a "national cybersecurity squad," to support a national cyber readiness and education campaign could be an effective method to enhance the cybersecurity of SMBs. The proposed model is supported by initial survey results showing promising willingness and support from SMBs.

    Solar Decathlon Controls


    Computational Depth and Reducibility

    This paper investigates Bennett's notions of strong and weak computational depth (also called logical depth) for infinite binary sequences. Roughly, an infinite binary sequence x is defined to be weakly useful if every element of a non-negligible set of decidable sequences is reducible to x in recursively bounded time. It is shown that every weakly useful sequence is strongly deep. This result (which generalizes Bennett's observation that the halting problem is strongly deep) implies that every high Turing degree contains strongly deep sequences. It is also shown that, in the sense of Baire category, almost every infinite binary sequence is weakly deep, but not strongly deep.