
    ANN for Tic-Tac-Toe Learning

    This research addresses training an Artificial Neural Network (ANN) to play the tic-tac-toe board game. The ANN is trained on the game's logic using the set of mathematical combinations of move sequences the system could play, applying the Gradient Descent Algorithm explicitly and the rules of elimination theory implicitly. The trained system should then produce appropriate combinations of moves for every state in the course of a game, improving its results toward a win or a draw.
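The training setup the abstract describes can be illustrated with a minimal sketch. This is not the paper's actual network: the single-layer model, the random-board sampling, and the win-detection target below are illustrative assumptions, showing only how gradient descent adjusts weights from tic-tac-toe positions.

```python
import math
import random

# Hypothetical minimal sketch (not the paper's actual network): a
# single-layer model trained by gradient descent to score tic-tac-toe
# boards, predicting whether X has completed a line. Boards are
# 9-vectors with 1 = X, -1 = O, 0 = empty.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def x_wins(board):
    """Target label: does X occupy any full row, column, or diagonal?"""
    return any(all(board[i] == 1 for i in line) for line in LINES)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(n_steps=20000, lr=0.1, seed=0):
    """Stochastic gradient descent on log-loss over random boards."""
    rng = random.Random(seed)
    w, b = [0.0] * 9, 0.0
    for _ in range(n_steps):
        board = [rng.choice((1, -1, 0)) for _ in range(9)]
        y = 1.0 if x_wins(board) else 0.0
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, board)) + b)
        g = p - y                      # d(log-loss)/d(logit)
        w = [wi - lr * g * xi for wi, xi in zip(w, board)]
        b -= lr * g
    return w, b
```

After training, the weights on X-occupied cells become positive, so boards closer to a completed X line score higher; the paper's full system would combine such scores with game-state enumeration to choose moves.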


    Performance Debugging and Tuning using an Instruction-Set Simulator

    Instruction-set simulators allow programmers a detailed level of insight into, and control over, the execution of a program, including parallel programs and operating systems. In principle, instruction-set simulation can model any target computer and gather any statistic. Furthermore, such simulators are usually portable, independent of compiler tools, and deterministic, allowing bugs to be recreated and measurements repeated. Though often viewed as too slow for use as a general programming tool, their performance has improved considerably in the last several years. We describe SimICS, an instruction-set simulator of SPARC-based multiprocessors developed at SICS, in its role as a general programming tool. We discuss some of the benefits of using a tool such as SimICS to support various tasks in software engineering, including debugging, testing, analysis, and performance tuning. We present in some detail two test cases in which we used SimICS to support analysis and performance tuning of two applications, Penny and EQNTOTT. This work resulted in improved parallelism in, and understanding of, Penny, as well as a performance improvement for EQNTOTT of over an order of magnitude. We also present some early work on analyzing SPARC/Linux, demonstrating the ability of tools like SimICS to analyze operating systems.
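The mechanism an instruction-set simulator is built around, and the "gather any statistic" property the abstract highlights, can be sketched as a fetch-decode-execute loop. The three-instruction toy ISA below is an illustrative assumption, not SimICS's actual design (SimICS models SPARC).

```python
# Hypothetical sketch of an instruction-set simulator's core: a
# deterministic fetch-decode-execute loop over a toy ISA, counting
# executed instructions per opcode as an example statistic.
from collections import Counter

def simulate(program, max_steps=1000):
    regs = {"r0": 0, "r1": 0}
    pc = 0
    stats = Counter()
    while pc < len(program) and max_steps:
        op, *args = program[pc]        # fetch + decode
        stats[op] += 1                 # gather statistics, deterministically
        if op == "addi":               # execute: add immediate to register
            regs[args[0]] += args[1]
            pc += 1
        elif op == "bne":              # branch to target if reg != immediate
            pc = args[2] if regs[args[0]] != args[1] else pc + 1
        elif op == "halt":
            break
        max_steps -= 1
    return regs, stats

# Count to 5: the loop body executes five times before halting.
regs, stats = simulate([
    ("addi", "r0", 1),
    ("bne", "r0", 5, 0),
    ("halt",),
])
```

Because the loop is deterministic, re-running the same program reproduces both the final state and the statistics exactly, which is what makes bugs recreatable and measurements repeatable.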

    A survey of an introduction to fault diagnosis algorithms

    This report surveys the field of diagnosis and introduces some of the key algorithms and heuristics currently in use. Fault diagnosis is an important and rapidly growing discipline. It matters in the design of self-repairable computers because the diagnosis resolution of present fault-tolerant computers is limited to a functional unit or processor; better resolution is necessary before failed units can become partially reusable. The approach that holds the greatest promise is resident microdiagnostics; however, it presupposes a microprogrammable architecture for the computer being self-diagnosed. The presentation is tutorial and contains examples. An extensive bibliography of some 220 entries is included.

    Dependability as a Fundamental Generalizing and Integrating Approach

    The main principles and conditions of the formation, and the stages of development, of the fundamental theory and practice of the generalizing and integrating concepts of dependability are presented, beginning with the first joint conference of the two scientific schools of J. von Neumann and N. Wiener. The basic aspects of the development of theoretical foundations, and the results of applied research on the integration processes of reliability (fail-safety), fault tolerance, and dependability of information and control computer systems (ICCS), are given.

    The microeconometrics of household behaviour: building the foundations, 1920-1960

    This thesis explores the early history of the microeconometrics of household behaviour from the interwar period to the 1960s. The analytical framework relies on a model of empirical knowledge production that captures scientific progress in terms of its material supplies and intellectual demands. Under this framework, the thesis traces how microdata at the household level first appeared in the econometrician's research agenda and how the discipline was consolidated by communities of econometricians. This study contains four substantive chapters. The first three chapters are selective case studies charting three important approaches in the development of microeconometric practices. The first chapter reviews the interwar literature. Among those decentralised practices, Arthur Bowley's analysis of family expenditure stands out as one of the earliest exemplars. The second and third chapters explore the formation of two communities in the post-war period: Richard Stone's Department of Applied Economics (DAE) at Cambridge, and Guy Orcutt's Social Systems Research Institute (SSRI) at Wisconsin. With the benefit of the new microdata and the introduction of computer-based calculation, Stone and his crew created a cooperative group that produced the first series of microeconometric publications driven by intellectual problems and economic questions. By contrast, Orcutt came to the analysis of microdata driven by his dream of microsimulation, a bottom-up method of microeconometric modelling, more heavily dependent on computing power and designed for revising public policies. After frustration at the SSRI, he finally finished a household simulation model at the Urban Institute. Taking the DAE and SSRI as examples, the fourth chapter assesses both the internal academic relationships of these groups and the consolidation of both literatures, using bibliometric data and network analysis. The results demonstrate the ways in which the DAE was a more interconnected network than the SSRI. The citation analysis offers an alternative way of understanding the formation of econometric knowledge, based on community relations rather than the supply of materials or intellectual demands.

    Biweekly Report, June 1, 1953

    Bi-weekly progress report of the Scientific and Engineering Computation Group

    Putting the Semantics into Semantic Versioning

    The long-standing aspiration for software reuse has made astonishing strides in the past few years. Many modern software development ecosystems now come with rich sets of publicly available components contributed by the community. Downstream developers can leverage these upstream components, boosting their productivity. However, components evolve at their own pace. This imposes obligations on, and yields benefits for, downstream developers, especially since changes can be breaking, requiring additional downstream work to adapt. Upgrading too late leaves downstream developers vulnerable to security issues and missing out on useful improvements; upgrading too early results in excess work. Semantic versioning has been proposed as an elegant mechanism to communicate levels of compatibility, enabling downstream developers to automate dependency upgrades. While it is questionable whether a version number can adequately characterize version compatibility in general, we argue that developers would greatly benefit from tools such as semantic version calculators to help them upgrade safely. The time is now for the research community to develop such tools: large component ecosystems exist and are accessible, component interactions have become observable through automated builds, and recent advances in program analysis make the development of relevant tools feasible. In particular, contracts (both traditional and lightweight) are a promising input to semantic versioning calculators, which can suggest whether an upgrade is likely to be safe.
    Comment: to be published as Onward! Essays 202
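The compatibility levels that semantic versioning promises to communicate can be sketched as a simple classifier over version numbers. The function names below are hypothetical, and the sketch omits pre-release and build-metadata identifiers from the full semver specification; a semantic versioning calculator of the kind the essay advocates would check this promise against the code rather than trust the number.

```python
# Hypothetical sketch: classify an upgrade between two semver versions
# as "patch", "minor" (backwards compatible), or "major" (potentially
# breaking). Pre-release and build metadata are omitted for brevity.
def parse(version):
    major, minor, patch = (int(p) for p in version.split("."))
    return major, minor, patch

def upgrade_kind(current, candidate):
    c, n = parse(current), parse(candidate)
    if n <= c:
        raise ValueError("not an upgrade")
    if n[0] != c[0]:
        return "major"    # compatibility not promised
    if n[1] != c[1]:
        return "minor"    # new features, backwards compatible
    return "patch"        # bug fixes only
```

The gap the essay points at is precisely that `upgrade_kind` reads only the declared numbers; program-analysis tools and contracts would verify that a "minor" release really does preserve the upstream API's observable behaviour.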