
    Design of a simulation environment for laboratory management by robot organizations

    This paper describes the basic concepts needed for a simulation environment capable of supporting the design of robot organizations for managing chemical, or similar, laboratories on the planned U.S. Space Station. The environment should facilitate a thorough study of the problems to be encountered in assigning the responsibility for managing a non-life-critical, but mission-valuable, process to an organized group of robots. In the first phase of the work, we seek to employ the simulation environment to develop robot cognitive systems and strategies for effective multi-robot management of chemical experiments. Later phases will explore human-robot interaction and the development of robot autonomy.

    A methodical approach to performance measurement experiments: measure and measurement specification

    This report describes a methodical approach to performance measurement experiments. This approach gives a blueprint for the whole trajectory, from the notion of performance measures and how to define them, via planning, instrumentation, and execution of the experiments, to interpretation of the results. The first stage of the approach, Measurement Initialisation, has been worked out completely. It is shown that a well-defined system description allows a procedural approach to defining performance measures and to identifying the parameters that might affect them. For the second stage of the approach, Measurement Planning, concepts are defined that enable a clear experiment description or specification. It is made explicit what is actually being measured when executing an experiment. A brief example that illustrates the value of the method and a comparison with an existing method, that of Jain, complete this report.
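    The abstract's notion of a well-specified measurement experiment (define the measure, plan the trials, then interpret the results) can be sketched with a minimal timing harness. The workload, trial count, and reported statistics below are illustrative assumptions, not the report's actual method:

    ```python
    import statistics
    import time

    def measure(workload, trials=10):
        """Run `workload` repeatedly and report a performance measure
        (mean latency) together with its spread, so the experiment
        specification -- what is measured and how often -- is explicit."""
        samples = []
        for _ in range(trials):
            start = time.perf_counter()
            workload()
            samples.append(time.perf_counter() - start)
        return {
            "mean_s": statistics.mean(samples),
            "stdev_s": statistics.stdev(samples),
            "trials": trials,
        }

    result = measure(lambda: sum(range(100_000)))
    print(result["trials"])  # → 10
    ```

    Separating the measure definition (mean latency) from the experiment plan (number of trials) mirrors the report's split between measure specification and measurement planning.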

    Auditing in common computer environments: auditing procedure study


    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in computing power; this has been characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to embrace these concepts effectively. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, documented examples and potential solutions are advanced, and the authors discuss paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities, helping to avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and discuss the beneficial impact of embracing these technologies as well as the risk mitigation required to ensure success.
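    Of the technologies the abstract lists, a Monte Carlo sensitivity analysis is easy to illustrate in miniature. The toy "simulation" below and its parameter names are invented for this sketch and stand in for a real combat or infrastructure model:

    ```python
    import random

    def simulate(attrition_rate, supply_level):
        # Toy stand-in for one simulation run: output is a noisy linear
        # function of the inputs (true sensitivity to attrition_rate is -2.0).
        noise = random.gauss(0, 0.01)
        return 1.0 - 2.0 * attrition_rate + 0.5 * supply_level + noise

    def monte_carlo_sensitivity(n=5_000, seed=42):
        # Estimate d(output)/d(attrition_rate) by running the model many
        # times at two settings while randomizing the other input.
        random.seed(seed)
        low_sum = high_sum = 0.0
        for _ in range(n):
            supply = random.uniform(0.0, 1.0)
            low_sum += simulate(0.1, supply)
            high_sum += simulate(0.3, supply)
        return (high_sum - low_sum) / n / (0.3 - 0.1)

    print(round(monte_carlo_sensitivity(), 1))  # → -2.0
    ```

    Pairing the two settings on the same random draw of the other input is a standard variance-reduction choice; with enough runs the estimate converges on the model's true coefficient.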

    Data warehouse automation: trick or treat?

    Data warehousing systems have been around for 25 years, playing a crucial role in collecting data and transforming it into value, allowing users to make decisions based on informed business facts. It is widely accepted that a data warehouse is a critical component of a data-driven enterprise; it becomes part of the organisation's information systems strategy, with a significant impact on the business. However, after 25 years, building a data warehouse is still painful: such systems are too time-consuming, too expensive, and too difficult to change after deployment. Data warehouse automation appears with the promise of addressing the limitations of traditional approaches, turning data warehouse development from a prolonged effort into an agile one, with gains in efficiency and effectiveness in data warehousing processes. So, is data warehouse automation a trick or a treat? To answer this question, a case study of a data warehousing architecture using a data warehouse automation tool, WhereScape, was developed. A survey was also conducted among organisations that use data warehouse automation tools, in order to understand their motivation for adopting such tools in their data warehousing systems. Based on the results of the survey and on the case study, automation of the data warehouse building process is necessary to deliver data warehouse systems faster, and is a solution to consider when modernising data warehouse architectures as a way to achieve results faster while keeping costs controlled and reducing risk. Data warehouse automation may definitely be a treat.

    The Integration of Technology Theory and Business Analysis: A Pedagogical Framework for the Undergraduate MIS Course in Data Communications and Networking

    One of the fundamental challenges of information systems education within a school of business is to integrate technology theory and business analysis. Information systems as an academic discipline must contain a theoretical component, for theory development is, after all, the essence of academia. However, model curricula for IS education (e.g., IS'2002.6) have been incorporating a growing number of applied, hands-on topics. This is especially true of the undergraduate course in data communications and networking (DCN). While we do not negate the value of a lab experience in network configuration, we posit that applied DCN topics can be effectively taught via the business case method as well. Toward this end, our article proposes a pedagogical framework. This type of framework can also be used in other information systems courses to maintain an appropriate balance between technology theory and business analysis.

    A modular, multipurpose, parameter-centered electronic health record architecture

    Health information technology is playing a key role in healthcare. Specifically, the use of electronic health records has been found to bring about significant improvements in healthcare quality, particularly in patient management, healthcare delivery, and research support. The adoption of health record systems has been promoted in many countries to support efficient, high-quality integrated healthcare. The objective of this work is the implementation of an electronic health record system based on a relational database. The system architecture is modular and based on concentrating pathology-specific parameters in a single module, so the system can easily be applied to different pathologies. Several examples of its application are described. It is intended to extend the system by integrating genomic data.
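    The parameter-centered, modular idea described above (generic patient data plus one module of pathology-specific parameters) can be sketched as a relational schema. The table and column names below are hypothetical illustrations, not taken from the paper:

    ```python
    import sqlite3

    # Minimal sketch: patient data is generic, while pathology-specific
    # parameters live in their own table, so supporting a new pathology
    # only requires new parameter rows, not new tables.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE patient (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE parameter (          -- one row per pathology-specific parameter
        id INTEGER PRIMARY KEY,
        pathology TEXT NOT NULL,      -- e.g. 'diabetes', 'hypertension'
        name TEXT NOT NULL,
        unit TEXT
    );
    CREATE TABLE observation (        -- measured values link patients to parameters
        patient_id INTEGER REFERENCES patient(id),
        parameter_id INTEGER REFERENCES parameter(id),
        value REAL,
        taken_at TEXT
    );
    """)
    conn.execute("INSERT INTO patient VALUES (1, 'Jane Doe')")
    conn.execute("INSERT INTO parameter VALUES (1, 'diabetes', 'HbA1c', '%')")
    conn.execute("INSERT INTO observation VALUES (1, 1, 6.8, '2024-01-15')")
    row = conn.execute("""
        SELECT p.name, par.name, o.value, par.unit
        FROM observation o
        JOIN patient p ON p.id = o.patient_id
        JOIN parameter par ON par.id = o.parameter_id
    """).fetchone()
    print(row)  # → ('Jane Doe', 'HbA1c', 6.8, '%')
    ```

    Because observations reference parameters rather than hard-coded columns, the same three tables serve any pathology, which is the essence of the modular architecture the abstract describes.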

    Smart Geographic Object: Toward a New Understanding of GIS Technology in Ubiquitous Computing

    One of the fundamental aspects of ubiquitous computing is the instrumentation of the real world by smart devices. This instrumentation constitutes an opportunity to rethink the interactions between human beings and their environment on the one hand, and between the components of this environment on the other. In this paper we discuss what this understanding of ubiquitous computing can bring to geographic science, and particularly to GIS technology. Our main idea is to instrument the geographic environment by instrumenting the geographic objects that compose it, and then to investigate how this instrumentation can address the current limitations of GIS technology and offer a new stage of rapprochement between the earth and its abstraction. As a result, this work proposes a new concept we name the Smart Geographic Object (SGO): a convergence point between smart objects and geographic objects, two concepts appertaining respectively to ubiquitous computing and geographic information science.

    LAN Configuration and Analysis: Projects for the Data Communications and Networking Course

    We implemented two local area network (LAN) projects in our introductory data communications and networking course. The first project required students to develop a LAN from scratch for a small imaginary organization. The second required student groups to analyze a LAN for a real-world small organization. By allowing students to apply what they learn in class to real-world situations, the projects bridge the gap between technical concepts and business applications.