
    Modelling Self-similar Traffic Of Multiservice Networks

    Simulation modelling is carried out that adequately describes the bursty traffic of packet-switched multiservice networks. One of the most effective methods for studying the traffic of telecommunications systems is computer simulation modelling. Using the theory of queuing systems (QS), packet flows (traffic) in modern multiservice networks are modelled as a random self-similar process. Distribution laws such as the exponential, Poisson, log-normal, Pareto and Weibull distributions have been considered. The distribution of the time intervals between packet arrivals and of the packet service durations at different system loads has been studied. The research results show that the distribution function of the time intervals between packet arrivals and of the packet service durations is in good agreement with the Pareto and Weibull distributions, with the Pareto distribution prevailing in most cases. The M/Pa/1 and Pa/M/1 queuing systems have been studied, and the effect of the fractality of the request inter-arrival intervals has been compared through estimates of the system load and the service duration. It has been found that in the Pa/M/1 system, with shape parameter a > 2, the fractality of the inter-arrival intervals does not affect the average waiting time or the load factor. However, when a ≤ 2, as in the M/Pa/1 system, both considered statistical estimates differ. The application of adequate mathematical traffic models allows the quality of service (QoS) characteristics of the network to be assessed correctly.
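    As a hedged illustration of the kind of queuing simulation described above, the sketch below feeds Pareto-distributed inter-arrival times into a single exponential server (a Pa/M/1-type queue) and reports the empirical mean waiting time and load factor. The shape parameter, service rate, sample size and seed are illustrative assumptions, not values taken from the study.

        # Minimal single-server FIFO sketch: Pareto inter-arrival times,
        # exponential service times (Pa/M/1-style). Parameter values are
        # illustrative assumptions only.
        import random

        def pareto_interval(alpha, xm=1.0):
            # Pareto(alpha, xm) sample by inverse transform; heavy-tailed for alpha <= 2
            return xm * (1.0 - random.random()) ** (-1.0 / alpha)

        def simulate(alpha=2.5, service_rate=1.5, n=100_000, seed=1):
            random.seed(seed)
            arrival = depart = 0.0
            wait_sum = busy = 0.0
            for _ in range(n):
                arrival += pareto_interval(alpha)
                start = max(arrival, depart)           # wait if the server is still busy
                service = random.expovariate(service_rate)
                wait_sum += start - arrival
                busy += service
                depart = start + service
            return wait_sum / n, busy / depart         # mean wait, load factor

        print(simulate(alpha=2.5))   # shape > 2: the heavy tail barely moves the estimates
        print(simulate(alpha=1.5))   # shape <= 2: infinite variance inflates the mean wait

    Swapping the two distributions (exponential inter-arrival times, Pareto service times) yields the corresponding M/Pa/1 sketch.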

    Multilevel Contracts for Trusted Components

    This article contributes to the design and verification of trusted components and services. The contracts are expressed at several levels so as to cover different facets, such as component consistency, compatibility or correctness. The article introduces multilevel contracts and a design and verification process for handling and analysing these contracts in component models. The approach is implemented with the COSTO platform, which supports the Kmelia component model. A case study illustrates the overall approach.
    Comment: In Proceedings WCSI 2010, arXiv:1010.233
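    As a purely hypothetical reading of how contracts can be layered on a component, the sketch below attaches checks at three levels: a well-formedness check, pre/postconditions on individual services, and an invariant maintained between calls. It is not Kmelia or COSTO code and not the paper's formalism; the component, its services and the mapping of levels to checks are invented for illustration.

        # Hypothetical illustration of layered ("multilevel") checks on a component;
        # not Kmelia/COSTO code, and not the paper's formalism.

        class BoundedBuffer:
            """Toy component offering 'put' and 'get' services."""

            def __init__(self, capacity):
                assert capacity > 0                     # level 1: well-formed component
                self.capacity = capacity
                self.items = []

            def _invariant(self):                       # level 3: invariant between calls
                assert 0 <= len(self.items) <= self.capacity

            def put(self, x):
                assert len(self.items) < self.capacity  # level 2: caller obligation (pre)
                self.items.append(x)
                self._invariant()

            def get(self):
                assert self.items                       # level 2: caller obligation (pre)
                x = self.items.pop(0)
                self._invariant()
                return x

        # level 1 at the interface: the declared services are actually provided
        assert all(callable(getattr(BoundedBuffer, s, None)) for s in ("put", "get"))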

    Book of Abstracts of the Sixth SIAM Workshop on Combinatorial Scientific Computing

    Book of Abstracts of CSC14, edited by Bora Uçar. The Sixth SIAM Workshop on Combinatorial Scientific Computing, CSC14, was organized at the Ecole Normale Supérieure de Lyon, France, from 21 to 23 July 2014. This two-and-a-half-day event marked the sixth in a series that started ten years ago in San Francisco, USA. The CSC14 Workshop's focus was on combinatorial mathematics and algorithms in high performance computing, broadly interpreted. The workshop featured three invited talks, 27 contributed talks and eight poster presentations. The invited talks focused on two fields of research: randomized algorithms for numerical linear algebra, and network analysis. The contributed talks and the posters targeted modeling, analysis, bisection, clustering, and partitioning of graphs, applied in the context of networks, sparse matrix factorizations, iterative solvers, fast multipole methods, automatic differentiation, high-performance computing, and linear programming. The workshop was held at the premises of the LIP laboratory of ENS Lyon and was generously supported by the LABEX MILYON (ANR-10-LABX-0070, Université de Lyon, within the program "Investissements d'Avenir", ANR-11-IDEX-0007, operated by the French National Research Agency), and by SIAM.

    05081 Abstracts Collection -- Foundations of Global Computing

    From 20.02.05 to 25.02.05, the Dagstuhl Seminar 05081 on "Foundations of Global Computing" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    A new method to measure complexity in binary or weighted networks and applications to functional connectivity in the human brain

    BACKGROUND: Networks or graphs play an important role in the biological sciences. Protein interaction networks and metabolic networks support the understanding of basic cellular mechanisms. In the human brain, networks of functional or structural connectivity model the information flow between cortex regions. In this context, measures of network properties are needed. We propose a new measure, Ndim, estimating the complexity of arbitrary networks. This measure is based on a fractal dimension, which is similar to recently introduced box-covering dimensions. However, box-covering dimensions are only applicable to fractal networks. The construction of these network dimensions relies on concepts proposed to measure the fractality or complexity of irregular sets in Euclidean space.
    RESULTS: The network measure Ndim grows with increasing network connectivity and is essentially determined by the cardinality of a maximum k-clique, where k is the characteristic path length of the network. Numerical applications to lattice graphs and to fractal and non-fractal graph models, together with formal proofs, show that Ndim estimates a dimension of complexity for arbitrary graphs. Box-covering dimensions for fractal graphs rely on a linear log-log plot of the minimum number of covering subgraph boxes versus the box sizes. We demonstrate the affinity between Ndim and the fractal box-covering dimensions, but also that Ndim extends the concept of a fractal dimension to networks with non-linear log-log plots. Comparisons of Ndim with topological measures of complexity (cost and efficiency) show that Ndim has greater informative power. Three different methods to apply Ndim to weighted networks are finally presented and exemplified by comparisons of the functional brain connectivity of healthy and depressed subjects.
    CONCLUSION: We introduce a new measure of complexity for networks. We show that Ndim has the properties of a dimension and overcomes several limitations of presently used topological and fractal complexity measures. It allows the comparison of the complexity of networks of different types, e.g., between fractal graphs characterized by hub repulsion and small-world graphs with strong hub attraction. The high informative power and convenient CPU time for moderately sized networks may make Ndim a valuable tool for the analysis of biological networks.
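    The log-log box-covering construction contrasted with Ndim above can be sketched in a few lines: greedily cover the graph with balls of increasing radius and read a dimension off the slope of log(box count) against log(box size). This is a generic sketch of the box-covering idea, not the authors' Ndim algorithm; the greedy covering strategy, the toy grid graph and the chosen radii are assumptions made for illustration.

        # Generic greedy box-covering sketch: estimate a graph's fractal dimension
        # from the slope of log(number of boxes) versus log(box size).
        import math
        from collections import deque

        def ball(adj, src, radius):
            """All nodes within 'radius' hops of src (breadth-first search)."""
            dist = {src: 0}
            queue = deque([src])
            while queue:
                u = queue.popleft()
                if dist[u] == radius:
                    continue
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            return set(dist)

        def box_count(adj, radius):
            """Greedy cover by balls of the given radius (approximates the minimal cover)."""
            uncovered, boxes = set(adj), 0
            while uncovered:
                center = next(iter(uncovered))
                uncovered -= ball(adj, center, radius)
                boxes += 1
            return boxes

        def box_dimension(adj, radii=(1, 2, 3, 4)):
            """Negative least-squares slope of log N_B versus log(2r + 1)."""
            xs = [math.log(2 * r + 1) for r in radii]
            ys = [math.log(box_count(adj, r)) for r in radii]
            mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
            slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                     / sum((x - mx) ** 2 for x in xs))
            return -slope

        # Toy example: a 6x6 grid graph; the estimate approaches 2 as the grid grows.
        grid = {(i, j): [(i + di, j + dj)
                         for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if 0 <= i + di < 6 and 0 <= j + dj < 6]
                for i in range(6) for j in range(6)}
        print(round(box_dimension(grid), 2))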