    Crosscutting, what is and what is not? A Formal definition based on a Crosscutting Pattern

    Crosscutting is usually described in terms of scattering and tangling. However, the distinction between these concepts is vague, which can lead to ambiguous statements. Sometimes precise definitions are required, e.g. for the formal identification of crosscutting concerns. We propose a conceptual framework for formalizing these concepts, based on a crosscutting pattern that shows the mapping between elements at two levels, e.g. concerns and representations of concerns. The definitions of the concepts are formalized in terms of linear algebra and visualized with matrices and matrix operations. In this way, crosscutting can be clearly distinguished from scattering and tangling. Using linear algebra, we demonstrate that our definition generalizes other definitions of crosscutting, such as those by Masuhara and Kiczales [21] and Tonella and Ceccato [28]. The framework can be applied across several refinement levels, assuring traceability of crosscutting concerns. The usability of the framework is illustrated by applying it to several areas, such as change impact analysis, identification of crosscutting concerns in the early phases of software development, and model-driven software development.
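
    The matrix formulation lends itself to a compact illustration. The sketch below is a minimal reading of the framework, not the paper's implementation, and the example matrix is invented: the mapping between source and target elements is encoded as a binary matrix, and scattering, tangling, and crosscutting are read off row sums, column sums, and the crosscutting product matrix.

        import numpy as np

        # Rows: source elements (e.g., concerns); columns: target elements
        # (e.g., representation elements). M[i, j] = 1 if source i maps to target j.
        M = np.array([
            [1, 1, 0, 0],   # concern 0 is scattered over targets 0 and 1
            [0, 1, 1, 0],   # concern 1 is scattered over targets 1 and 2
            [0, 0, 0, 1],   # concern 2 maps to a single target
        ])

        scattered = M.sum(axis=1) > 1   # source mapped to more than one target
        tangled = M.sum(axis=0) > 1     # target addressed by more than one source

        # Crosscutting product matrix: entry (i, k) counts targets shared by
        # sources i and k; nonzero off-diagonal entries flag crosscutting pairs.
        ccm = M @ M.T
        crosscutting_pairs = np.argwhere(np.triu(ccm, 1) > 0)

        print("scattered sources:", np.where(scattered)[0])
        print("tangled targets:  ", np.where(tangled)[0])
        print("crosscutting pairs:", crosscutting_pairs)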

    Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology

    The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as ‘deterministic components’ or ‘trends’, even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature, mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of null hypothesis significance tests (NHSTs) for slowly varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of the NHST rationale and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling. We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of record. Moreover, even though adjusting procedures accounting for correlation have been developed, some are insufficient or apply only to particular tests, while others are theoretically flawed yet still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can change dramatically if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
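
    As a concrete point of reference, the sketch below implements the classical Mann-Kendall test named above, applied to a synthetic series of annual maxima. It is a minimal illustration with invented data and deliberately applies no correction for serial correlation, which is precisely the kind of omission the paper warns about.

        import numpy as np
        from scipy.stats import norm

        def mann_kendall(x):
            """Classical Mann-Kendall trend test, with no correction for
            serial correlation (the very setting the paper cautions against)."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            # S statistic: signs of all pairwise differences, j > i
            s = sum(np.sign(x[j] - x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance under H0, no ties
            z = (s - np.sign(s)) / np.sqrt(var_s)      # continuity correction
            p = 2.0 * norm.sf(abs(z))                  # two-sided p-value
            return z, p

        # Invented annual maxima from a stationary process: even a small
        # p-value here would not, by itself, license a nonstationarity claim.
        rng = np.random.default_rng(0)
        annual_maxima = rng.gumbel(loc=100.0, scale=20.0, size=50)
        print(mann_kendall(annual_maxima))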

    DDoS Attacks with Randomized Traffic Innovation: Botnet Identification Challenges and Strategies

    Distributed Denial-of-Service (DDoS) attacks are usually launched through a botnet, an "army" of compromised nodes hidden in the network. Inferential tools for DDoS mitigation should accordingly enable an early and reliable discrimination of the normal users from the compromised ones. Unfortunately, the recent emergence of attacks performed at the application layer has multiplied the number of possibilities that a botnet can exploit to conceal its malicious activities. New challenges arise, which cannot be addressed by simply borrowing the tools that have been successfully applied so far to earlier DDoS paradigms. In this work, we offer three main contributions: i) we introduce an abstract model for the aforementioned class of attacks, where the botnet emulates normal traffic by continually learning admissible patterns from the environment; ii) we devise an inference algorithm that is shown to provide a consistent (i.e., converging to the true solution as time progresses) estimate of the botnet possibly hidden in the network; and iii) we verify the validity of the proposed inferential strategy over real network traces.
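
    The abstract does not spell out the inference algorithm, but its core intuition, that bots drawing requests from a common, environment-learned dictionary overlap with one another more than independent normal users do, can be shown with a deliberately simplified sketch. Everything below (function names, the Jaccard similarity, the threshold, the toy data) is an assumption for illustration, not the paper's estimator.

        import itertools
        import numpy as np

        def botnet_candidates(patterns_by_node, threshold=0.5):
            """Flag nodes involved in suspiciously high pairwise pattern overlap.

            patterns_by_node: dict mapping node id -> set of emitted patterns.
            Hypothetical sketch; the paper's consistent estimator is more elaborate.
            """
            flagged = set()
            for a, b in itertools.combinations(patterns_by_node, 2):
                pa, pb = patterns_by_node[a], patterns_by_node[b]
                jaccard = len(pa & pb) / max(len(pa | pb), 1)
                if jaccard > threshold:
                    flagged.update((a, b))
            return flagged

        # Toy traffic: bots sample (without replacement) from a shared learned
        # dictionary, while the normal user emits patterns of its own.
        rng = np.random.default_rng(42)
        dictionary = [f"req{i}" for i in range(15)]
        nodes = {f"bot{i}": set(rng.choice(dictionary, size=12, replace=False))
                 for i in range(3)}
        nodes["user"] = {f"own{i}" for i in range(12)}
        print(botnet_candidates(nodes))  # flags the three bot ids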

    A slot antenna array with low mutual coupling for use on small mobile terminals

    Performance analysis of a hardware accelerator of dependence management for task-based dataflow programming models

    Along with the popularity of multicore and manycore architectures, task-based dataflow programming models have attracted great attention for being able to extract high parallelism from applications without exposing the complexity to programmers. One of these pioneers is OpenMP Superscalar (OmpSs). By implementing dynamic task dependence analysis, dataflow scheduling and out-of-order execution in the runtime, OmpSs achieves high performance using coarse- and medium-granularity tasks. In theory, for the same application, the more parallel tasks that can be exposed, the higher the possible speedup. Yet this factor is limited by task granularity, up to a point where the runtime overhead outweighs the performance increase and slows down the application. To overcome this handicap, Picos was proposed to support task-based dataflow programming models like OmpSs as a fast hardware accelerator for fine-grained task and dependence management, and a simulator was developed to perform design space exploration. This paper presents the very first functional hardware prototype inspired by Picos. An embedded system based on a Zynq 7000 All-Programmable SoC is developed to study its capabilities and possible bottlenecks. Initial scalability and hardware consumption studies of different Picos designs are performed to find the one with the highest performance and lowest hardware cost. A thorough performance study is then carried out on both the prototype with the most balanced configuration and the OmpSs software-only alternative. Results show that our OmpSs runtime hardware support significantly outperforms the software-only implementation currently available in the runtime system for fine-grained tasks.

    This work is supported by the Spanish Government through Programa Severo Ochoa (SEV-2015-0493), by the Spanish Ministry of Science and Technology through the TIN2015-65316-P project, by the Generalitat de Catalunya (contracts 2014-SGR-1051 and 2014-SGR-1272), and by the European Research Council RoMoL Grant Agreement number 321253. We also thank the Xilinx University Program for its hardware and software donations.
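
    To make the accelerator's role concrete, the sketch below models, in plain software, the bookkeeping a dependence manager like Picos performs in hardware: tasks declare input and output addresses, and a task becomes ready only when the last writers of the addresses it touches have finished. This is an assumed, minimal model of OmpSs-style dependence tracking, not the Picos design itself.

        class DependenceManager:
            """Toy address-based task dependence tracking (RAW and WAW only)."""

            def __init__(self):
                self.last_writer = {}   # address -> id of the last producing task
                self.pending = {}       # blocked task id -> set of unmet task ids

            def submit(self, task, ins, outs):
                """Register a task; return True if it is immediately ready."""
                deps = {self.last_writer[a] for a in list(ins) + list(outs)
                        if a in self.last_writer}
                for a in outs:
                    self.last_writer[a] = task
                if deps:
                    self.pending[task] = deps
                    return False
                return True

            def finish(self, task):
                """Retire a finished task; return the tasks it unblocks."""
                ready = []
                for t in list(self.pending):
                    self.pending[t].discard(task)
                    if not self.pending[t]:
                        del self.pending[t]
                        ready.append(t)
                return ready

        dm = DependenceManager()
        print(dm.submit("t1", ins=[], outs=["a"]))     # True: no dependencies
        print(dm.submit("t2", ins=["a"], outs=["b"]))  # False: waits on t1 (RAW)
        print(dm.finish("t1"))                         # ['t2'] becomes ready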

    Identification of a temperature dependent FitzHugh-Nagumo model for the Belousov-Zhabotinskii reaction

    This paper describes the identification of a temperature dependent FitzHugh-Nagumo model directly from experimental observations with controlled inputs. By studying the steady states and the phase trajectories of the variables, the stability of the model is analysed and a rule for generating oscillation waves is proposed. The dependence of the oscillation frequency and propagation speed on the model parameters is then investigated to identify the appropriate control variables, which become functions of temperature in the identified model. The results show that the proposed approach can provide a good representation of the dynamics of the oscillatory behaviour of a Belousov-Zhabotinskii (BZ) reaction.
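
    For readers unfamiliar with the model, the sketch below integrates the standard FitzHugh-Nagumo equations with a temperature-dependent time-scale parameter. The functional form of the temperature dependence and all parameter values are invented placeholders; the actual identified dependence is given in the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        def fhn(t, y, I, a, b, eps):
            """Standard FitzHugh-Nagumo right-hand side."""
            v, w = y
            dv = v - v**3 / 3.0 - w + I   # fast activator variable
            dw = eps * (v + a - b * w)    # slow recovery variable
            return [dv, dw]

        def eps_of_temperature(T, eps0=0.08, k=0.01, T_ref=25.0):
            # Placeholder linear temperature dependence of the recovery time
            # scale; the paper identifies the actual relation from data.
            return eps0 * (1.0 + k * (T - T_ref))

        T = 30.0  # temperature in degrees Celsius
        sol = solve_ivp(fhn, (0.0, 400.0), [-1.0, 1.0],
                        args=(0.5, 0.7, 0.8, eps_of_temperature(T)),
                        max_step=0.5)
        print("final state (v, w):", sol.y[:, -1])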

    A Sensitivity Matrix Methodology for Inverse Problem Formulation

    We propose an algorithm to select parameter subset combinations that can be estimated using an ordinary least-squares (OLS) inverse problem formulation with a given data set. First, the algorithm selects the parameter combinations that correspond to sensitivity matrices with full rank. Second, the algorithm quantifies uncertainty using the inverse of the Fisher Information Matrix. Nominal parameter values are used to construct synthetic data sets and to explore the effects of removing certain parameters from those to be estimated with OLS procedures. We quantify these effects with a score for the parameter vector, defined as the norm of the vector of standard errors of the component estimates divided by the estimates themselves. In some cases the method reduces the standard error for a parameter to less than 1% of its estimate.
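
    The two steps described above translate directly into a short numerical sketch: compute a sensitivity matrix for a candidate parameter subset, check its rank, and score the subset with standard errors from the inverse Fisher Information Matrix. The toy model, step size, and noise level below are assumptions for illustration.

        import numpy as np

        def sensitivity_matrix(model, theta, t, h=1e-6):
            """Forward-difference sensitivities d model(t; theta) / d theta_j."""
            base = model(t, theta)
            cols = []
            for j in range(len(theta)):
                tp = theta.copy()
                tp[j] += h
                cols.append((model(t, tp) - base) / h)
            return np.column_stack(cols)   # shape: (len(t), len(theta))

        def selection_score(chi, sigma2, theta):
            """Norm of (standard error / estimate); smaller means better-estimable."""
            if np.linalg.matrix_rank(chi) < chi.shape[1]:
                return np.inf              # rank-deficient: subset not identifiable
            fim = chi.T @ chi / sigma2     # Fisher Information Matrix under OLS
            se = np.sqrt(np.diag(np.linalg.inv(fim)))
            return np.linalg.norm(se / theta)

        # Assumed toy model: y(t) = theta_0 * exp(-theta_1 * t)
        model = lambda t, th: th[0] * np.exp(-th[1] * t)
        t = np.linspace(0.0, 5.0, 50)
        theta = np.array([2.0, 0.5])
        chi = sensitivity_matrix(model, theta, t)
        print("score:", selection_score(chi, sigma2=0.01, theta=theta))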