4,391 research outputs found

    CPAS/CCM experiences: Perspectives for AI/ES research in accounting


    On the calculation of time alignment errors in data management platforms for distribution grid data

    The operation and planning of distribution grids require the joint processing of measurements from different grid locations. Since measurement devices in low- and medium-voltage grids lack precise clock synchronization, data management platforms of distribution system operators must be able to account for the impact of nonideal clocks on measurement data. This paper formally introduces a metric, termed the Additive Alignment Error, to capture the impact of misaligned averaging intervals of electrical measurements. A trace-driven computation of this metric would be computationally costly for measurement devices, so an online estimation procedure in the data collection platform is needed. To avoid transmitting high-resolution measurement data, this paper proposes and assesses an extension of a Markov-modulated process to model electrical traces, from which a closed-form matrix-analytic formula for the Additive Alignment Error is derived. A trace-driven assessment confirms the accuracy of the model-based approach. In addition, the paper describes practical settings where the model can be used in data management platforms, with significant reductions in the computational demands on measurement devices.
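The trace-driven baseline the abstract contrasts against can be sketched as follows. This is a hedged illustration: the metric definition used here (mean absolute difference between interval averages computed with and without a clock offset) is an assumption for demonstration, not the paper's exact formula, and the trace values are invented.

```python
# Hypothetical trace-driven sketch of an "additive alignment error":
# compare interval averages of a trace with and without a clock offset.
# The exact metric in the paper may differ; this is an assumed form.

def interval_averages(samples, interval, offset=0):
    """Average `samples` over consecutive windows of length `interval`,
    starting `offset` samples into the trace."""
    out = []
    for start in range(offset, len(samples) - interval + 1, interval):
        window = samples[start:start + interval]
        out.append(sum(window) / interval)
    return out

def additive_alignment_error(samples, interval, offset):
    # Error between perfectly aligned and clock-shifted averaging windows.
    aligned = interval_averages(samples, interval, 0)
    shifted = interval_averages(samples, interval, offset)
    n = min(len(aligned), len(shifted))
    return sum(abs(a - b) for a, b in zip(aligned[:n], shifted[:n])) / n

# Invented 8-sample power trace, 4-sample averaging interval,
# 1-sample clock misalignment.
trace = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
err = additive_alignment_error(trace, interval=4, offset=1)
```

Running this over every device's raw samples is exactly the cost the paper's closed-form model-based estimate avoids.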

    Constructing a no-reference H.264/AVC bitstream-based video quality metric using genetic programming-based symbolic regression

    In order to ensure an optimal quality of experience for end users during video streaming, automatic video quality assessment has become an important field of interest for video service providers. Objective video quality metrics try to estimate perceived quality accurately and in an automated manner. Traditional approaches model the complex properties of the human visual system; more recently, however, machine learning approaches have been shown to yield competitive results. In this paper, we present a novel no-reference bitstream-based objective video quality metric that is constructed by genetic programming-based symbolic regression. A key benefit of this approach is that it produces reliable white-box models that allow us to determine the importance of the parameters. Additionally, these models can provide human insight into the underlying principles of subjective video quality assessment. Numerical results show that perceived quality can be modeled with high accuracy using only parameters extracted from the received video bitstream.
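The white-box flavor of the approach can be illustrated with a toy model-selection step. Note the hedging: real genetic programming evolves expression trees over many generations, whereas this sketch only scores a few hand-written candidate formulas; the feature names (qp, bitrate) and the synthetic quality scores are assumptions, not the paper's actual bitstream parameters or data.

```python
# Toy sketch of white-box model selection over bitstream features.
# GP-based symbolic regression would *evolve* such expressions; here we
# merely score a fixed, invented candidate set on synthetic data.

# Synthetic "subjective quality" samples: quality falls with QP,
# rises slightly with bitrate (an invented relationship).
data = [{"qp": q, "bitrate": b, "mos": 5.0 - 0.06 * q + 0.001 * b}
        for q in range(20, 45, 5) for b in (500, 1000, 2000)]

# Candidate white-box models; each is a readable formula, so the
# contribution of every parameter is inspectable.
candidates = {
    "linear":   lambda f: 5.0 - 0.06 * f["qp"] + 0.001 * f["bitrate"],
    "qp_only":  lambda f: 5.0 - 0.06 * f["qp"],
    "constant": lambda f: 3.5,
}

def mse(model):
    return sum((model(f) - f["mos"]) ** 2 for f in data) / len(data)

best = min(candidates, key=lambda name: mse(candidates[name]))
```

Because the surviving model is an explicit formula rather than a black box, one can read off which bitstream parameters matter, which is the benefit the abstract highlights.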

    Acquiring data designs from existing data-intensive programs

    The problem area addressed in this thesis is the extraction of a data design from existing data-intensive program code. The purpose is to help a software maintainer understand a software system more easily, because a view of the system at a high level of abstraction can be obtained. Acquiring a data design from existing data-intensive program code is an important part of reverse engineering in software maintenance, and a large proportion of software systems currently needing maintenance is data intensive. The research results in this thesis can be directly used in a reverse engineering tool. A method has been developed for acquiring data designs from existing data-intensive programs, COBOL programs in particular. Program transformation is used as the main tool. Abstraction techniques and the method of crossing levels of abstraction are also studied. A prototype system has been implemented based on the method developed. This involved implementing a number of program transformations for data abstraction, thus contributing to the production of a tool. Several case studies are presented, including one using a real program with 7,000 lines of source code. The experimental results show that the Entity-Relationship Attribute Diagrams derived by the prototype can represent the data designs of the original data-intensive programs. The original contribution of the thesis is an approach that identifies and extracts data relationships from existing code by combining analysis of data with analysis of code, and it is believed to provide better capabilities than other work in the field. The method indicates that acquiring a data design from existing data-intensive program code by program transformation with human assistance is an effective method in software maintenance. Future work is suggested at the end of the thesis, including extending the method to build an industrial-strength tool.
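The first step of such an extraction, reading entities and attributes out of COBOL data declarations, can be sketched in miniature. This is a hedged toy: the record layout is invented, and the thesis's actual method uses program transformation and also analyses the procedure code, not just declarations.

```python
# Minimal sketch: pull a crude "entity" (record name plus attribute
# names) out of a COBOL-style data declaration. The real thesis method
# goes much further, combining analysis of data with analysis of code.

record = """\
01 CUSTOMER-RECORD.
   05 CUST-ID      PIC 9(6).
   05 CUST-NAME    PIC X(30).
   05 CUST-CITY    PIC X(20).
"""

def extract_entity(text):
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    # The level-01 item names the entity; level-05 items are attributes.
    entity = lines[0].split()[1].rstrip(".")
    attributes = [ln.split()[1] for ln in lines[1:]]
    return {"entity": entity, "attributes": attributes}

design = extract_entity(record)
```

An Entity-Relationship Attribute Diagram would then add relationships between such entities, which is the part that requires the code analysis the thesis contributes.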

    Semantic valence modeling: emotion recognition and affective states in context-aware systems

    Defining and describing a context requires knowledge (contextual information). Although research is addressing a wider range of potential contextual information across a diverse range of domains, this diversity has not been effectively addressed. This paper considers the implementation of context and identifies emotion (more accurately, emotional response) as a factor in the personalization of services that is under-represented in the literature. We propose semantic valence modeling, implemented in fuzzy rule-based systems, as a potential solution to the implementation of emotional responses in context-aware systems. It is concluded that the effective implementation of emotional responses based on the posited approach will improve the accuracy of personalized service provision and additionally offers the potential to improve the levels of computational intelligence in context-aware domains and systems.
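A fuzzy rule-based treatment of valence can be sketched as follows. Everything concrete here is an assumption made for illustration: the membership functions, the three rules, and the idea of mapping valence to a single "personalization weight" are invented, and the paper's semantic valence model is richer than this.

```python
# Illustrative fuzzy rule base over an emotional-valence input in
# [-1, 1]. Membership functions and rule outputs are invented; they
# stand in for whatever the context-aware system actually adapts.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def negative(v): return tri(v, -1.5, -1.0, 0.0)
def neutral(v):  return tri(v, -1.0, 0.0, 1.0)
def positive(v): return tri(v, 0.0, 1.0, 1.5)

def personalization_weight(valence):
    # Rules: IF valence is negative THEN weight 0.2; neutral -> 0.5;
    # positive -> 0.9. Sugeno-style weighted-average defuzzification.
    rules = [(negative(valence), 0.2),
             (neutral(valence), 0.5),
             (positive(valence), 0.9)]
    total = sum(mu for mu, _ in rules)
    return sum(mu * out for mu, out in rules) / total

w = personalization_weight(0.5)  # a mildly positive emotional response
```

The fuzzy formulation matters because emotional responses are graded rather than categorical, which is what makes crisp if/else rules a poor fit for this kind of contextual information.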

    Reliability Assessment and Optimization of Water Distribution Systems Explicitly Considering Isolation Valve Locations

    Water distribution systems have changed the landscape of communities through two services: 1) providing water supply for domestic and industrial use, and 2) providing the water required to fight fires. However, a substantial portion of the country's water infrastructure, like many other public assets built over 50 years ago, is now reaching the end of its useful life, which, combined with rapid growth and changing demographics, has left water distribution pipe networks in a state that requires revitalization. The aging infrastructure, along with the growing threat of natural and man-made disruptions, has led water utilities to place greater emphasis on developing strategies that minimize the impact on system users when a failure event occurs (i.e., that improve the reliability of the system). The proposed segment-based analysis considers valve locations to estimate the number of pipes taken out of service to isolate the initial pipe break or element failure. The objective of the assessment is to identify critical segments (i.e., the smallest sets of pipes that can be isolated using the closest isolation valves) and critical valves in a set of real water distribution networks. The critical elements, the segments or valves that when taken out of service cause the greatest reduction in the supply delivered and the level of service provided, are identified using performance metrics based on loss of connectivity and on failure to meet hydraulic and fire protection requirements. The assessment is intended as a simple method for identifying critical elements that accounts for the role of isolation valves, thus offering a more realistic view of the effects of a breakdown. This framework is then used to define valve locations that could offer an improvement in reliability for a given capital investment.
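The core of a segment-based analysis is that pipes sharing a junction with no isolation valve between them must be shut off together, so segments fall out as connected components of a "no valve between us" pipe graph. The tiny network and valve placement below are invented for illustration; the abstract's actual networks and hydraulic metrics are not reproduced here.

```python
from collections import defaultdict

# Hedged sketch of segment identification. Pipes are edges between
# junctions; a valve sits on a (pipe, junction) end. Pipe ends meeting
# at a junction without a valve cannot be isolated separately, so a
# segment is a connected component of that relation.

# Invented 3-pipe line network: J1 -P1- J2 -P2- J3 -P3- J4,
# with a single valve at P2's end near J3.
pipes = {"P1": ("J1", "J2"), "P2": ("J2", "J3"), "P3": ("J3", "J4")}
valves = {("P2", "J3")}

def segments(pipes, valves):
    # Collect, per junction, the pipe ends not protected by a valve.
    at_junction = defaultdict(list)
    for pipe, (a, b) in pipes.items():
        for j in (a, b):
            if (pipe, j) not in valves:
                at_junction[j].append(pipe)
    # Pipes co-listed at any junction belong to the same segment.
    adj = defaultdict(set)
    for group in at_junction.values():
        for p in group:
            adj[p].update(group)
    seen, result = set(), []
    for p in pipes:                    # flood fill over the pipe graph
        if p in seen:
            continue
        stack, comp = [p], set()
        while stack:
            q = stack.pop()
            if q in comp:
                continue
            comp.add(q)
            stack.extend(adj[q] - comp)
        seen |= comp
        result.append(comp)
    return result

segs = segments(pipes, valves)  # breaking P1 also takes P2 out of service
```

Here a break on P1 forces P2 out of service as well, because no valve separates them at J2; this is exactly the realism that a pipe-only analysis without valve locations misses.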