
    MICSIM : Concept, Developments and Applications of a PC-Microsimulation Model for Research and Teaching

    Growing societal interest in the individual and his or her behaviour in modern societies calls for microanalyses of individual situations. To enable such microanalyses on a quantitative, empirically based level, microsimulation models were developed and are increasingly used for economic and social policy impact analyses. Although microsimulation is known and applied (mainly by experts), an easy-to-use yet powerful PC microsimulation model is hard to find. The overall aim of this study, and of MICSIM - A PC Microsimulation Model, is to describe and offer such a user-friendly and powerful general microsimulation model for (almost) any PC, supporting impact microanalyses in both applied research and teaching. Above all, MICSIM is a general microdata handler for a wide range of typical microanalysis requirements. This paper presents the concept, developments and applications of MICSIM. After some brief remarks on microsimulation characteristics in general, the concept and substantive domains of MICSIM (the simulation, the adjustment and aging, and the evaluation of microdata) are described by their mode of operation in principle. The realisations and developments of MICSIM are then portrayed through the different versions of the computer program. Some MICSIM applications and experiences in research and teaching follow, with concluding remarks.
    Keywords: Economic and Social Policy Analyses, Microsimulation (dynamic and static), Simulation, Adjustment and Evaluation of Microdata, PC Computer Program for Microanalyses in General
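The abstract describes static microsimulation: applying a candidate policy rule to individual microdata records and aggregating the weighted outcomes. A minimal sketch of that idea, with entirely hypothetical records and tax parameters (none of these numbers come from MICSIM itself):

```python
# Hypothetical microdata: each record is one individual with a survey weight.
RECORDS = [
    {"id": 1, "income": 30000, "weight": 1.2},
    {"id": 2, "income": 55000, "weight": 0.9},
    {"id": 3, "income": 80000, "weight": 1.0},
]

def tax(income, threshold, rate):
    """Flat tax above a threshold -- a stand-in for a real policy rule."""
    return max(0.0, income - threshold) * rate

def simulate(records, threshold, rate):
    """Apply the rule to every record and aggregate with survey weights."""
    total = sum(tax(r["income"], threshold, rate) * r["weight"] for r in records)
    return round(total, 2)

# Compare a baseline rule against a hypothetical reform.
baseline = simulate(RECORDS, threshold=20000, rate=0.25)
reform = simulate(RECORDS, threshold=25000, rate=0.30)
print(baseline, reform, reform - baseline)
```

A full model like MICSIM adds the adjustment/aging and evaluation steps around this core loop; the sketch shows only the simulation step.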

    Optical fibre local area networks


    Applied business analytics approach to IT projects – Methodological framework

    The design and implementation of a big data project differs from that of a typical business intelligence project that might run concurrently within the same organization. A big data initiative typically triggers a large-scale IT project that is expected to deliver the desired outcomes. The industry has identified two major methodologies for running a data-centric project, in particular SEMMA (Sample, Explore, Modify, Model and Assess) and CRISP-DM (Cross Industry Standard Process for Data Mining). More generally, the professional organizations PMI (Project Management Institute) and IIBA (International Institute of Business Analysis) have defined their methods for project management and business analysis based on current industry best practices. However, big data projects pose new challenges that are not addressed by the existing methodologies. Building an end-to-end big data analytical solution for optimization of the supply chain, pricing and promotion, product launch, shop potential and customer value faces both business and technical challenges. The most common business challenges are unclear and/or poorly defined business cases; irrelevant data; poor data quality; overlooked data granularity; improper contextualization of data; poorly prepared data; non-meaningful results; and a lack of the required skill set. Some of the technical challenges relate to a lack of resources and technology limitations; availability of data sources; storage difficulties; security issues; performance problems; little flexibility; and ineffective DevOps. This paper discusses an applied business analytics approach to IT projects and addresses the aspects described above. The authors present their work on the research and development of a new methodological framework and analytical instruments applicable to both business endeavors and educational initiatives targeting big data. The proposed framework is based on a proprietary methodology and advanced analytics tools. It is focused on the development and implementation of practical solutions for project managers, business analysts, IT practitioners and Business/Data Analytics students. Also under discussion are the necessary skills and knowledge for a successful big data business analyst, and some of the main organizational and operational aspects of big data projects, including continuous model deployment.

    Performance Results and Characteristics of Adopters of Genetically Engineered Soybeans in Delaware

    Genetically engineered (GE) soybeans first became available to farmers in 1996. Despite the common questions regarding any new crop technology, the new seeds were rapidly adopted. This study examines the characteristics of adopters, as well as yield and weed control cost changes, using survey results from Delaware farmers at the start of the 2000 season. Duration analysis reveals that earlier-adopting farmers had larger farms and tended to use computers for financial management, while regression analysis shows significantly lower weed control costs and, to a lesser extent, higher yields for GE soybeans.
    Keywords: Crop Production/Industries

    Server Structure Proposal and Automatic Verification Technology on IaaS Cloud of Plural Type Servers

    In this paper, we propose a server structure proposal and automatic performance verification technology that proposes and verifies an appropriate server structure on an Infrastructure as a Service (IaaS) cloud composed of baremetal servers, container-based virtual servers and virtual machines. Recently, cloud services have progressed, and providers offer not only virtual machines but also baremetal servers and container-based virtual servers. However, users need to design an appropriate server structure for their requirements based on the quantitative performance of these three server types, which demands considerable technical knowledge to optimize system performance. Therefore, we study a technology that satisfies users' performance requirements on such a three-type IaaS cloud. Firstly, we measure the performance of a baremetal server, Docker containers and KVM (Kernel-based Virtual Machine) virtual machines on OpenStack while varying the number of virtual servers. Secondly, we propose a server structure proposal technology based on the measured quantitative data: it receives an abstract OpenStack Heat template together with function/performance requirements, and creates a concrete template with server specification information. Thirdly, we propose an automatic performance verification technology that executes the necessary performance tests automatically on provisioned user environments according to the template.
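The proposal step the abstract describes, selecting a server type from measured data and filling it into an abstract template, can be sketched as follows. The performance numbers, preference order and template fields here are illustrative assumptions, not the paper's measurements or the actual Heat template schema:

```python
# Illustrative measured data: server type -> quantitative characteristics.
# Values are hypothetical placeholders, not the paper's benchmark results.
MEASURED = {
    "baremetal": {"throughput": 1.00, "provisioning_min": 30},
    "container": {"throughput": 0.95, "provisioning_min": 1},
    "vm":        {"throughput": 0.75, "provisioning_min": 5},
}

def propose(min_throughput, max_provisioning_min):
    """Return the first server type (in an assumed preference order)
    meeting both the throughput and provisioning-time requirements."""
    for name in ("container", "vm", "baremetal"):
        m = MEASURED[name]
        if m["throughput"] >= min_throughput and m["provisioning_min"] <= max_provisioning_min:
            return name
    return None  # no type satisfies the requirements

def concretize(abstract_template, server_type):
    """Fill the chosen server type into an abstract, Heat-like template."""
    concrete = dict(abstract_template)
    concrete["server_type"] = server_type
    return concrete

abstract = {"name": "web-tier", "server_type": None}
choice = propose(min_throughput=0.9, max_provisioning_min=10)
print(concretize(abstract, choice))
```

The real technology works over full OpenStack Heat templates and richer function/performance requirements; the sketch only shows the requirement-matching and template-concretization shape of the idea.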