
    Computable Rationality, NUTS, and the Nuclear Leviathan

    This paper explores how the Leviathan that projects power through nuclear arms exercises a unique nuclearized sovereignty. In the case of nuclear superpowers, this sovereignty extends to wielding the power to destroy human civilization as we know it across the globe. Nuclearized sovereignty depends on a hybrid form of power encompassing human decision-makers in a hierarchical chain of command, and all of the technical and computerized functions necessary to maintain command and control at every moment of the sovereign's existence: this sovereign power cannot sleep. This article analyzes how the form of rationality that informs this hybrid exercise of power historically developed to be computable. By definition, computable rationality must be able to function without any intelligible grasp of the context or the comprehensive significance of decision-making outcomes. Thus, the apparatus that maintains nuclearized sovereignty must be able to execute momentous life-and-death decisions without the type of sentience we usually associate with ethical individual and collective decisions.

    Coalition Battle Management Language (C-BML) Study Group Final Report

    Interoperability across Modeling and Simulation (M&S) and Command and Control (C2) systems continues to be a significant problem for today's warfighters. M&S is well established in military training, but it could also be a valuable asset for planning and mission rehearsal if M&S and C2 systems were able to exchange information, plans, and orders more effectively. To better support the warfighter with M&S-based capabilities, an open, standards-based framework is needed that establishes operational and technical coherence between C2 and M&S systems.
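    The kind of structured order exchange such a framework would standardize can be pictured with a minimal sketch. The element names below echo the who/what/where pattern common to battle management languages, but they are illustrative only, not the actual C-BML schema:

```python
import xml.etree.ElementTree as ET

def build_order(unit_id: str, task: str, location: str) -> str:
    """Serialize a simple tasking order as XML (illustrative schema only)."""
    order = ET.Element("Order")
    ET.SubElement(order, "TaskeeWho").text = unit_id   # who executes the task
    ET.SubElement(order, "What").text = task           # the tasked action
    ET.SubElement(order, "Where").text = location      # objective or location
    return ET.tostring(order, encoding="unicode")

# A C2 system and a simulation could each parse the same message,
# which is the coherence the abstract argues for:
msg = build_order("1-BDE", "advance", "OBJ-ALPHA")
```

    Because both sides agree on one unambiguous message format, neither needs to know how the other stores plans internally.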

    What is a Good Plan? Cultural Variations in Expert Planners’ Concepts of Plan Quality

    This article presents the results of a field research study examining commonalities and differences between American and British operational planners' mental models of planning. We conducted Cultural Network Analysis (CNA) interviews with 14 experienced operational planners in the US and UK. Our results demonstrate fundamental differences in how American and British expert planners conceive of a high-quality plan. The American planners' model focused on the specification of action to achieve synchronization, provided little autonomy at the level of execution, and included the belief that increasing contingencies reduces risk. The British planners' model stressed the internal coherence of the plan, to support shared situational awareness and thereby flexibility at the level of execution. The British model also emphasized the belief that reducing the number of assumptions decreases risk. Overall, the American ideal plan serves a controlling function, whereas the British ideal plan supports an enabling function. Interestingly, both US and UK planners would view the other's ideal plan as riskier than their own. The implications of cultural models of plans and planning are described for establishing performance measures and designing systems to support multinational planning teams.

    US/UK Mental Models of Planning: The Relationship Between Plan Detail and Plan Quality

    This paper presents the results of a research study applying a new cultural analysis method to capture commonalities and differences between US and UK mental models of operational planning. The results demonstrate fundamental differences in how US and UK planners think about what it means to have a high-quality plan. Specifically, the present study captures differences in how US and UK planners conceptualize plan quality. Explicit models of cultural differences in conceptions of plan quality are useful for establishing performance metrics for multinational planning teams. This paper discusses the prospects of enabling automatic evaluation of multinational team performance by combining recent advances in cultural modelling with enhanced ontology languages.

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry, and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenges have not yet been fully recognized, and a methodology of its own has not been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing, and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing, and management? What is the relationship between big data and the scientific paradigm? What is the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing.

    The RQ-Tech Methodology: A New Paradigm for Conceptualizing Strategic Enterprise Architectures

    The purpose of this research is to develop and apply a system-theoretic methodology and corresponding model for Enterprise Architecture development. Enterprise Architectures can assist managers by illustrating the systemic relationships within an organization and the impact changes to the organization could make. Unfortunately, today's modeling practices are proprietary, time-consuming, and generally ineffective as tools for understanding the consequences of strategic-level planning decisions across all levels of the enterprise. This research supports the conclusion that system-specific solutions produce islands of technology, which can be prevented by employing better enterprise change planning. This research combined the practice of Enterprise Architectures with a modern perspective grounded in Systems Theory and the theory behind the computer science-oriented Semantic Web to present a distinctive methodology for developing models. A review of literature in all three areas illustrated the overlap common to all three domains and supported critical thinking about how to enrich Enterprise Architecture practice. This research was conducted to answer two primary questions. The first research question investigated the most significant factors to consider when translating authoritative text and rich pictures into semantic models. The second research question qualitatively measured the extent to which models aligned to important organizational guidance are useful for representing the organization as a whole. Reusable Quality Technical Architectures (RQ-Tech) is the methodology developed from this research. It demonstrates that a complex system-of-systems organization that must creatively respond to a variety of events can be holistically represented using a dynamic model.
    RQ-Tech techniques provide ways to map and link the multitudes of scope-level authoritative business documents so that together they can effectively represent the nature and essence of the organization as one organic structure. The marriage of authorized enterprise documentation and the Semantic Web produces a model of the holistic enterprise. This model had previously only been experienced at a tacit level by those exceptionally well trained in the depth and breadth of organizational culture, supporting laws, policies, and related publications. This research effort provides the vision that encourages a paradigm shift away from a mechanistic approach to organizational change toward the analogy of a socially connected, interdependent enterprise. New horizons for using the common language of the Semantic Web to capture an understanding of the many interactive systems of the enterprise are substantiated. The research concludes with identification of future research themes prompted by this investigation.
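    The document-to-organization mapping described above can be pictured as Semantic Web-style subject-predicate-object triples. The document and element names in this sketch are invented for illustration; they do not come from the RQ-Tech study itself:

```python
# RDF-style triples (subject, predicate, object) linking authoritative
# documents to the organizational elements they govern.
triples = {
    ("PolicyDoc-7", "governs", "LogisticsDivision"),
    ("PolicyDoc-7", "citedBy", "StrategicPlan-2024"),
    ("StrategicPlan-2024", "governs", "Enterprise"),
}

def governed_by(element: str) -> set:
    """All documents that directly govern a given organizational element."""
    return {s for (s, p, o) in triples if p == "governs" and o == element}
```

    Because the links are explicit triples rather than prose, questions like "which authoritative documents shape this division?" become simple queries over one shared graph.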

    Proceedings of the 2004 ONR Decision-Support Workshop Series: Interoperability

    In August of 1998 the Collaborative Agent Design Research Center (CADRC) of the California Polytechnic State University in San Luis Obispo (Cal Poly) approached Dr. Phillip Abraham of the Office of Naval Research (ONR) with the proposal for an annual workshop focusing on emerging concepts in decision-support systems for military applications. The proposal was considered timely by the ONR Logistics Program Office for at least two reasons. First, rapid advances in information systems technology over the past decade had produced distributed collaborative computer-assistance capabilities with profound potential for providing meaningful support to military decision makers. Indeed, some systems based on these new capabilities, such as the Integrated Marine Multi-Agent Command and Control System (IMMACCS) and the Integrated Computerized Deployment System (ICODES), had already reached the field-testing and final product stages, respectively. Second, over the past two decades the US Navy and Marine Corps had been increasingly challenged by missions demanding the rapid deployment of forces into hostile or devastated territories with minimum or non-existent indigenous support capabilities. Under these conditions Marine Corps forces had to rely mostly, if not entirely, on sea-based support and sustainment operations. Particularly today, operational strategies such as Operational Maneuver From The Sea (OMFTS) and Sea To Objective Maneuver (STOM) are very much in need of intelligent, near real-time and adaptive decision-support tools to assist military commanders and their staff under conditions of rapid change and overwhelming data loads. In the light of these developments the Logistics Program Office of ONR considered it timely to provide an annual forum for the interchange of ideas, needs and concepts that would address the decision-support requirements and opportunities in combined Navy and Marine Corps sea-based warfare and humanitarian relief operations.
    The first ONR Workshop was held April 20-22, 1999 at the Embassy Suites Hotel in San Luis Obispo, California. It focused on advances in technology, with particular emphasis on an emerging family of powerful computer-based tools, and concluded that the most capable members of this family of tools appear to be computer-based agents that can communicate within a virtual environment of the real world. From 2001 onward the venue of the Workshop moved from the West Coast to Washington, and in 2003 the sponsorship was taken over by ONR's Littoral Combat/Power Projection (FNC) Program Office (Program Manager: Mr. Barry Blumenthal). Themes and keynote speakers of past Workshops have included:
    1999: ‘Collaborative Decision Making Tools’ - Vadm Jerry Tuttle (USN Ret.); LtGen Paul Van Riper (USMC Ret.); Radm Leland Kollmorgen (USN Ret.); and Dr. Gary Klein (Klein Associates)
    2000: ‘The Human-Computer Partnership in Decision-Support’ - Dr. Ronald DeMarco (Associate Technical Director, ONR); Radm Charles Munns; Col Robert Schmidle; and Col Ray Cole (USMC Ret.)
    2001: ‘Continuing the Revolution in Military Affairs’ - Mr. Andrew Marshall (Director, Office of Net Assessment, OSD); and Radm Jay M. Cohen (Chief of Naval Research, ONR)
    2002: ‘Transformation ...’ - Vadm Jerry Tuttle (USN Ret.); and Steve Cooper (CIO, Office of Homeland Security)
    2003: ‘Developing the New Infostructure’ - Richard P. Lee (Assistant Deputy Under Secretary, OSD); and Michael O’Neil (Boeing)
    2004: ‘Interoperability’ - MajGen Bradley M. Lott (USMC), Deputy Commanding General, Marine Corps Combat Development Command; and Donald Diggs, Director, C2 Policy, OASD (NII)

    Agents for educational games and simulations

    This book consists mainly of revised papers that were presented at the Agents for Educational Games and Simulation (AEGS) workshop held on May 2, 2011, as part of the Autonomous Agents and Multiagent Systems (AAMAS) conference in Taipei, Taiwan. The 12 full papers presented were carefully reviewed and selected from various submissions. The papers are organized in topical sections on middleware applications, dialogues and learning, adaptation and convergence, and agent applications.

    Image annotation with Photocopain

    Photo annotation is a resource-intensive task, yet it is increasingly essential as image archives and personal photo collections grow in size. There is an inherent conflict in the process of describing and archiving personal experiences, because casual users are generally unwilling to expend large amounts of effort on creating the annotations required to organise their collections so that they can make best use of them. This paper describes the Photocopain system, a semi-automatic image annotation system which combines information about the context in which a photograph was captured with information from other readily available sources in order to generate outline annotations for that photograph that the user may further extend or amend.
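    The general idea of combining capture context with other readily available sources can be sketched as follows; the dictionary keys and the calendar lookup are illustrative assumptions, not Photocopain's actual API:

```python
from datetime import datetime

def outline_annotations(exif: dict, calendar: dict) -> list:
    """Suggest outline tags from capture context plus a secondary source.
    `exif` holds camera metadata; `calendar` maps ISO dates to event names."""
    tags = []
    ts = datetime.fromisoformat(exif["timestamp"])
    # Context from the camera itself: time of capture.
    tags.append("daytime" if 6 <= ts.hour < 18 else "night")
    # Context from another available source: the user's calendar.
    event = calendar.get(ts.date().isoformat())
    if event:
        tags.append(event)
    return tags

tags = outline_annotations(
    {"timestamp": "2006-05-14T10:30:00"},
    {"2006-05-14": "conference"},
)
```

    The output is deliberately an outline, a starting set of candidate tags the user can extend or amend, rather than a finished annotation.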