222 research outputs found

    Methods for performance evaluation of VBR video traffic models

    Get PDF
    Abstract: Models for predicting the performance of multiplexed variable bit rate video sources are important for engineering a network. However, models of a single source are also important for parameter negotiations and call admittance algorithms. In this paper we propose to model a single video source as a Markov renewal process whose states represent different bit rates. We also propose two novel goodness-of-fit metrics which are directly related to the specific performance aspects that we want to predict from the model. The first is a leaky bucket contour plot which can be used to quantify the burstiness of any traffic type. The second measure applies only to video traffic and measures how well the model can predict the compressed video quality. I. INTRODUCTION It is well recognized that the viability of B-ISDN/ATM depends on the development of effective and implementable congestion control schemes. While many frameworks and techniques are under discussion (see, e.g., [1]), at least two capabilities have been agreed to as necessary in any framework that might arise. The first is a connection admission control (CAC) by which the network will decide to accept or reject a new connection based on a set of agreed-to traffic descriptors and on available resources. Once a connection is accepted, a second necessary control is some form of usage parameter control (UPC) which will ensure that connections stay within their negotiated resource parameters. A popular UPC would involve a leaky bucket monitor of traffic entering the system, where traffic deemed as excessive by the monitor could either be dropped or tagged as low priority and allowed to proceed through the network to take advantage of potentially unused resources. Performance modeling is necessary to determine which techniques or set of techniques will be appropriate for eventual implementation in a B-ISDN network. 
Such models need to take into account traffic characteristics from realistic services that would be carried in a B-ISDN network. In particular, we need traffic models which will accurately represent the statistical nature of very high-speed, bursty services. Two classes of traffic models need to be developed: multiplexed source models and single source models. Although the same traffic model might be used in both cases, some models might be more suitable for one than the other. Multiplexed models will capture the effects of statistically multiplexing bursty sources and will predict to what extent the superposition of bursty streams is "smoothed". These models will be useful in traffic engineering the network (e.g., deciding how many links or virtual paths to put between different locations) and in traffic management (e.g., designing connection admission control algorithms). Several models have already been proposed in this direction. There are several areas where single source models are useful. They could be used to study what types of traffic descriptors make sense for parameter negotiation with the network at call setup. For example, if leaky bucket monitoring is used as a traffic descriptor, the negotiation might consist of the source specifying what parameters could be used in the leaky bucket for a given connection. Single source models can help in the selection of these parameters. Also, some applications may do some end-to-end rate control to ensure that minimal traffic is lost during periods of network congestion. Source models could be used in testing various rate control algorithms. Finally, these models are also useful in predicting the quality-of-service (QOS) that a particular application might experience during different levels of congestion. In deriving traffic models, we need metrics which can determine how "close" the model is to the actual traffic. 
Standard statistical measures such as means, variances, and other goodness-of-fit tests may not be appropriate here since they may not be measuring the characteristics of the process that are most important for either predicting the effect of the source on the resources in the network or the performance the source will experience. Instead, the goodness-of-fit metrics need to be directly related to the specific aspects of performance that we want to predict from the model; see, e.g., [6]. In this paper, we propose two criteria for judging the appropriateness of a traffic model for bursty services. The first one applies to any high speed bursty data service and the second is specific to a variable-bit-rate (VBR) video application. To illustrate these measures we compare a previous model of VBR video with a new model proposed here. II. MODELING VARIABLE-BIT-RATE VIDEO The data we are modeling was recorded at an actual teleconference meeting. Each scene depicts the head and shoulders of one person, and is 5 min, or 9000 frames, long. Since each 5 min of video required approximately one week to encode using software, the motivation for developing accurate models with a low computational burden is clear.
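The leaky bucket monitor described in the abstract above can be sketched in a few lines. This is a minimal illustration under assumed per-slot semantics (the bucket drains at the negotiated rate each slot, and each arriving cell that would overflow the bucket is flagged); the function and parameter names are hypothetical, not taken from the paper:

```python
def leaky_bucket(arrivals, rate, bucket_size):
    """Count cells a leaky-bucket monitor would flag as excessive.

    arrivals    -- number of cells arriving in each time slot
    rate        -- bucket drain per slot (negotiated sustained rate)
    bucket_size -- bucket capacity (tolerated burst size)
    """
    level = 0.0
    violations = 0
    for cells in arrivals:
        level = max(0.0, level - rate)   # bucket drains each slot
        for _ in range(cells):
            if level + 1 <= bucket_size:
                level += 1               # conforming cell fills the bucket
            else:
                violations += 1          # excess cell: dropped or tagged low priority
    return violations
```

A leaky bucket contour plot, in the paper's sense, could then be obtained by evaluating such a monitor over a grid of (rate, bucket_size) pairs and contouring the violation counts.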

    A device for moving sensors in the magnetic field of a small-size betatron

    Get PDF
    The possibility of improving the accuracy of measurements of magnetic field characteristics by means of more precise positioning of the sensors at the point under investigation is considered

    Just Click Here: A Brief Glance at Absurd Electronic Contracts and the Law Failing to Protect Consumers

    Get PDF
    As e-commerce explodes around the world, consumers’ rights have been left behind. Before the completion of virtually every transaction on the Internet, the onus is placed on consumers to read and agree to an onslaught of terms and conditions. Often hidden in the middle of this extremely lengthy list of terms are massive exemption and limitation of liability clauses that deny consumers most if not all of their rights as “equal” trading partners. The common law principle that all onerous clauses in a contract need to be brought to the attention of the consumer for them to be binding seems to have been lost with the invention of the “click here to agree” button for signing online contracts. As the courts in Canada have not provided clear guidance on this issue thus far, other means must be pursued in order to protect consumers from the near-tyrannical control of unencumbered electronic standard form contracts in e-commerce. This paper will describe the principle of sufficiency of notice as it applies to paper contracts, and then contrast it with the newer jurisprudence that has refused to apply the principle to electronic contracts. The reasons for the refusal will be explored, followed by an examination of why the principle of sufficiency of notice needs to be applied and strengthened to respond to the increasingly onerous provisions hidden in electronic contracts. Finally, some other options for achieving the goal of consumer protection from hidden onerous clauses will be briefly explored. These other options include introducing stiffer consumer protection legislation domestically, the creation of international treaties, developing voluntary standards of contracting, and relying on Internet self-regulation

    The macroseismic survey: methodology, earthquake parameters, open questions

    Get PDF
    Immediately after the event of 6 April 2009, as is customary, a long and complex macroseismic survey was carried out, promoted by the QUEST operational group. Its initial objective was to delimit the damage area in support of the emergency-response activities of the Protezione Civile; subsequently, the goal was to classify, as accurately and extensively as possible, the effects produced by the event, particularly in the damaged areas. For this purpose an estimate was produced using the MCS scale (Sieberg, 1930); the survey was later refined for about fifty localities in the most heavily damaged area (Is MCS>VII), collecting and processing the data in terms of the EMS98 macroseismic scale (Grünthal, 1998). Given the complexity and scale of the problems addressed, this earthquake was a test bench of great importance for Italian macroseismology. This text describes the work carried out, discussing in particular some aspects that challenged traditional survey methodologies (systematic irregularities of the monitored settlements, strong divergences of the damage scenarios from those predicted by the scales, difficult comparability with historical scenarios, etc.) and presenting its results, in relation to the resulting epicentral parameters and their most direct contribution to the overall understanding of the seismicity of the area

    Non-Hodgkin's lymphoma, obesity and energy homeostasis polymorphisms

    Get PDF
    A population-based case–control study of lymphomas in England collected height and weight details from 699 non-Hodgkin's lymphoma (NHL) cases and 914 controls. Obesity, defined as a body mass index (BMI) over 30 kg m−2 at five years before diagnosis, was associated with an increased risk of NHL (OR=1.5, 95% CI 1.1–2.1). The excess was most pronounced for diffuse large B-cell lymphoma (OR=1.9, 95% CI 1.3–2.8). Genetic variants in the leptin (LEP 19G>A, LEP −2548G>A) and leptin receptor genes (LEPR 223Q>R), previously shown to modulate NHL risk, as well as a polymorphism in the energy regulatory gene adiponectin (APM1 276G>T), were investigated. Findings varied with leptin genotype, the risks being decreased with LEP 19AA (OR=0.7, 95% CI 0.5–1.0) and increased with LEP −2548GA (OR=1.3, 95% CI 1.0–1.7) and −2548AA (OR=1.4, 95% CI 1.0–1.9), particularly for follicular lymphoma. These genetic findings, which were independent of BMI, were stronger for men than women
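The odds ratios and confidence intervals reported above follow the standard 2×2-table calculation. A short sketch of that arithmetic (using the usual Woolf logit method; the cell counts in the test are illustrative, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI from a 2x2 table.

    a -- exposed cases        b -- exposed controls
    c -- unexposed cases      d -- unexposed controls
    """
    or_ = (a * d) / (b * c)                     # cross-product odds ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # standard error of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper
```

A CI whose lower bound exceeds 1.0, as for the obesity–NHL association (1.5, 95% CI 1.1–2.1), indicates an excess risk at the 5% level.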

    Novel, Meso-Substituted Cationic Porphyrin Molecule for Photo-Mediated Larval Control of the Dengue Vector Aedes aegypti

    Get PDF
    Dengue is a life-threatening viral disease of growing importance, transmitted by Aedes mosquito vectors. The control of mosquito larvae is crucial to contain or prevent disease outbreaks, and the discovery of new larvicides able to increase the efficacy and the flexibility of the vector control approach is highly desirable. Porphyrins are a class of molecules which generate reactive oxygen species if excited by visible light, thus inducing oxidative cell damage and cell death. In this study we aimed at assessing the potential of this photo-mediated cytotoxic mechanism to kill Aedes (Stegomyia) aegypti mosquito larvae. The selected porphyrin molecule, meso-tri(N-methylpyridyl),meso-mono(N-tetradecylpyridyl)porphine (C14 for simplicity), killed the larvae at doses lower than 1 µM, and at light intensities 50–100 times lower than those typical of natural sunlight, by damaging their intestinal tissues. The physicochemical properties of C14 make it easily adsorbed into organic material, and we exploited this feature to prepare an ‘insecticidal food’ which efficiently killed the larvae and remained active for at least 14 days after its dispersion in water. This study demonstrated that photo-sensitizing agents are promising tools for the development of new larvicides against mosquito vectors of dengue and other human and animal diseases

    PROBING GRAVITY IN NEO'S WITH HIGH-ACCURACY LASER-RANGED TEST MASSES

    Get PDF
    Received 9 August 2006. Communicated by S. G. Turyshev. Gravity can be studied in detail in near Earth orbits (NEOs) using laser-ranged test masses tracked with few-mm accuracy by the ILRS. The two LAGEOS satellites have been used to measure frame dragging (a truly rotational effect predicted by GR) with a 10% error. A new mission and an optimized, second generation satellite, LARES (I. Ciufolini, PI), is in preparation to reach an accuracy of 1% or less on frame dragging, to measure some PPN parameters, to test the
