    The Loss in Efficiency from Using Grouped Data

    We derive the efficiency loss from using grouped data to estimate coefficients of variables that vary across groups but not across individuals within a group (e.g., state unemployment rates) when micro data are unavailable on the dependent variable. We present an empirical example of our theoretical results, and show that the efficiency loss in this application is small.
    Keywords: grouped data, relative efficiency
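
    As a rough illustration of the comparison involved (a sketch, not the paper's derivation), the simulation below contrasts OLS on simulated micro data with an unweighted regression on group means when the regressor varies only across groups; all parameter values are invented, and the efficiency gap here comes entirely from the assumed unequal group sizes.

    ```python
    # Monte Carlo sketch: micro OLS vs. unweighted group-mean OLS for a
    # regressor that varies only across groups. Illustrative assumptions:
    # homoskedastic errors, no intercept, and deliberately unequal group
    # sizes (which is what costs the unweighted grouped estimator efficiency).
    import numpy as np

    rng = np.random.default_rng(0)
    G, beta, reps = 50, 1.0, 2000
    n_g = rng.integers(5, 100, size=G)   # unequal group sizes
    x_g = rng.normal(size=G)             # group-level regressor (e.g., a state rate)

    micro, grouped = [], []
    for _ in range(reps):
        y_bar = np.empty(G)
        num = den = 0.0
        for g in range(G):
            y = beta * x_g[g] + rng.normal(size=n_g[g])   # micro outcomes
            y_bar[g] = y.mean()
            num += n_g[g] * x_g[g] * y_bar[g]   # micro OLS sums (x constant within g)
            den += n_g[g] * x_g[g] ** 2
        micro.append(num / den)                      # OLS on all individuals
        grouped.append((x_g @ y_bar) / (x_g @ x_g))  # unweighted OLS on group means

    print("relative efficiency (grouped vs. micro):",
          np.var(micro) / np.var(grouped))
    ```
    In this setup, weighting the group means by group size reproduces the micro estimate exactly, which is one reason the loss from grouping can be small in practice.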

    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities for overcoming these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, on supercomputers and Linux clusters, has proven effective by giving users an advantage in computing power that has been characterized as a ten-year lead over single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to embrace these concepts effectively. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. This paper advances documented examples and potential solutions, and the authors discuss paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities, helping to avoid needless mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and discuss the beneficial impact of embracing these technologies as well as the risk mitigation required to ensure success.
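
    To make the data-analysis side concrete, the sketch below shows the shape of a parallel Monte Carlo sensitivity analysis of the kind the abstract names, using Python's multiprocessing as a stand-in for an HPC cluster; the model function and its parameters are hypothetical.

    ```python
    # Parallel Monte Carlo sensitivity analysis: sweep two uncertain inputs,
    # run many stochastic replicates per combination in parallel, and report
    # how the mean outcome responds. `toy_model` is a made-up stand-in for a
    # real simulation; multiprocessing substitutes for an HPC cluster here.
    import numpy as np
    from multiprocessing import Pool

    def toy_model(args):
        """One stochastic replicate: output depends nonlinearly on two inputs."""
        seed, density, detect_prob = args
        rng = np.random.default_rng(seed)
        return density * (1.0 - detect_prob) ** 2 + rng.normal(scale=0.1)

    def sensitivity_sweep(densities, detect_probs, draws=200, workers=4):
        """Mean outcome for every (density, detect_prob) combination."""
        jobs = [(seed, d, p)
                for d in densities for p in detect_probs for seed in range(draws)]
        with Pool(workers) as pool:
            results = pool.map(toy_model, jobs)
        grid = np.array(results).reshape(len(densities), len(detect_probs), draws)
        return grid.mean(axis=-1)

    if __name__ == "__main__":
        surface = sensitivity_sweep([1.0, 2.0, 4.0], [0.1, 0.5, 0.9])
        print(surface)  # rows: density values; columns: detection probabilities
    ```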

    Belgium

    Beyond Control-Flow: Extending Business Process Configuration to Roles and Objects

    A configurable process model is an integrated representation of multiple variants of a business process. It is designed to be individualized to meet a particular set of requirements. As such, configurable process models promote systematic reuse of proven or common practices. Existing notations for configurable process modeling focus on capturing tasks and control-flow dependencies, neglecting equally important aspects of business processes such as data flow, material flow, and resource management. This paper fills this gap by proposing an integrated meta-model for configurable processes with advanced features for capturing the resources involved in the performance of tasks (through task-role associations) as well as the flow of data and physical artifacts (through task-object associations). Although embodied as an extension of a popular process modeling notation, namely EPC (Event-driven Process Chain), the meta-model is defined in an abstract and formal manner to make it applicable to other notations.
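
    As a rough data-structure sketch of this idea (names invented; the paper itself defines the meta-model formally over EPCs), tasks can carry role and object associations alongside their control-flow successors, and individualizing a configuration drops de-selected tasks while rerouting control flow around them:

    ```python
    # Illustrative sketch, not the paper's meta-model: tasks carry control-flow
    # successors plus task-role and task-object associations. Individualization
    # keeps ON tasks and reroutes flow past OFF tasks (assuming no cycle
    # consists entirely of OFF tasks).
    from dataclasses import dataclass, field
    from enum import Enum

    class Setting(Enum):
        ON = "on"    # task kept in the individualized variant
        OFF = "off"  # task dropped, along with its associations

    @dataclass
    class Task:
        name: str
        successors: list = field(default_factory=list)  # control-flow edges
        roles: list = field(default_factory=list)       # task-role associations
        objects: list = field(default_factory=list)     # task-object associations

    @dataclass
    class ConfigurableProcess:
        tasks: dict  # task name -> Task

        def _next_on(self, name, settings):
            """Follow control flow through OFF tasks to the next ON task(s)."""
            if settings.get(name, Setting.ON) is Setting.ON:
                return [name]
            nxt = []
            for s in self.tasks[name].successors:
                nxt.extend(self._next_on(s, settings))
            return nxt

        def individualize(self, settings):
            """Apply a configuration, returning the individualized variant."""
            kept = {}
            for name, t in self.tasks.items():
                if settings.get(name, Setting.ON) is Setting.OFF:
                    continue
                succ = []
                for s in t.successors:
                    succ.extend(self._next_on(s, settings))
                kept[name] = Task(name, succ, list(t.roles), list(t.objects))
            return ConfigurableProcess(kept)
    ```
    Switching a task OFF here removes its roles and objects from the variant and links its predecessors directly to its next enabled successors.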