
    Subsidization Competition: Vitalizing the Neutral Internet

    Unlike telephone operators, which pay termination fees to reach the users of another network, Internet Content Providers (CPs) do not pay the Internet Service Providers (ISPs) of the users they reach. While the resulting cross-subsidization to CPs has nurtured content innovation at the edge of the Internet, it weakens the incentives of access ISPs to invest in capacity expansion. Because charges for terminating CPs' traffic are criticized in the net neutrality debate, we propose to allow CPs to voluntarily subsidize the usage-based fees that their content traffic induces for end-users. We model the regulated subsidization competition among CPs under a neutral network and show how deregulating subsidization could increase an access ISP's utilization and revenue, strengthening its investment incentives. Although the competition might harm certain CPs, we find that the main cause is high access prices rather than the existence of subsidization. Our results suggest that subsidization competition would increase the competitiveness and welfare of the Internet content market; however, regulators might need to regulate access prices if the access ISP market is not sufficiently competitive. We envision that subsidization competition could become a viable model for the future Internet.

    Paid Peering, Settlement-Free Peering, or Both?

    With the rapid growth of congestion-sensitive and data-intensive applications, traditional settlement-free peering agreements with best-effort delivery often do not meet the QoS requirements of content providers (CPs). Meanwhile, Internet access providers (IAPs) feel that revenues from end-users are not sufficient to recoup the upgrade costs of network infrastructure. Consequently, some IAPs have begun to offer CPs a new type of peering agreement, called paid peering, under which they provide CPs with better data delivery quality for a fee. In this paper, we model a network platform where an IAP decides which peering types to offer CPs and what prices to charge CPs and end-users. We study the optimal peering schemes for the IAP, i.e., whether to offer CPs both paid and settlement-free peering to choose from or only one of them, under profit or welfare maximization. Our results show that 1) the IAP should always offer paid peering under the profit-optimal scheme and settlement-free peering under the welfare-optimal scheme, 2) whether to simultaneously offer the other peering type is largely driven by the type of data traffic, e.g., text or video, and 3) regulators might want to encourage the IAP to allocate more network capacity to settlement-free peering to increase user welfare.
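The two-sided pricing decision described above can be illustrated with a deliberately toy model (not the paper's model): the IAP picks an end-user price and a paid-peering fee, each side's demand falls linearly in its own price, and the IAP searches for the profit-maximizing pair. All functional forms and numbers here are illustrative assumptions.

```python
import itertools

def profit(s, p):
    """Toy IAP profit: user price s and paid-peering fee p,
    with linear demand on each side (illustrative only)."""
    users = max(0.0, 1.0 - s)   # end-user demand at price s
    cps = max(0.0, 1.0 - p)     # CP demand for paid peering at fee p
    return s * users + p * cps

# Grid search over both prices on [0, 1].
grid = [i / 100 for i in range(101)]
best = max(itertools.product(grid, grid), key=lambda sp: profit(*sp))
# With symmetric linear demand, the toy optimum is s = p = 0.5.
```

In this toy setting the IAP monetizes both sides; the paper's richer model is what determines when only one peering type should be offered.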

    A Cosmic Microwave Background Radiation Polarimeter Using Superconducting Bearings

    Measurements of the polarization of the cosmic microwave background (CMB) radiation are expected to significantly increase our understanding of the early universe. We present a design for a CMB polarimeter in which a cryogenically cooled half-wave plate rotates by means of a high-temperature superconducting (HTS) bearing. The design is optimized for implementation in MAXIPOL, a balloon-borne CMB polarimeter. A prototype bearing, consisting of a commercially available ring-shaped permanent magnet and an array of YBCO bulk HTS material, has been constructed. We measured the coefficient of friction as a function of several parameters, including temperature between 15 and 80 K, rotation frequency between 0.3 and 3.5 Hz, levitation distance between 6 and 10 mm, and ambient pressure between 10^{-7} and 1 torr. The low rotational drag of the HTS bearing allows rotation for long periods of time with minimal input power and negligible wear and tear, making this technology suitable for a future satellite mission. Comment: 6 pages, IEEE Transactions on Applied Superconductivity, 2003, Vol. 13, in press

    Heavy quarkonium 2S states in light-front quark model

    We study the charmonium 2S states $\psi'$ and $\eta_c'$, and the bottomonium 2S states $\Upsilon'$ and $\eta_b'$, using the light-front quark model with the 2S harmonic-oscillator wave function as an approximation to the 2S quarkonium wave function. The decay constants, transition form factors, and masses of these mesons are calculated and compared with experimental data. Predictions of quantities such as Br$(\psi' \to \gamma \eta_c')$ are made. The 2S wave function may help us learn more about the structure of these heavy quarkonia. Comment: 5 latex pages, final version for journal publication
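For reference, the 2S harmonic-oscillator wave function invoked above has the standard textbook form for the $n_r = 1$, $\ell = 0$ level (here $\beta$ is the oscillator parameter; the overall normalization and conventions may differ from those used in the paper):

```latex
% 2S (one radial node) isotropic harmonic-oscillator wave function,
% momentum-space form, up to normalization:
\phi_{2S}(\mathbf{k}) \;\propto\;
\left(\frac{3}{2} - \frac{k^{2}}{\beta^{2}}\right)
\exp\!\left(-\frac{k^{2}}{2\beta^{2}}\right)
```

The single radial node (at $k^{2} = \tfrac{3}{2}\beta^{2}$) is what distinguishes the 2S from the 1S Gaussian and drives the differences in the computed decay constants and form factors.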

    Axial vector form factor of nucleons in a light-cone diquark model

    The nucleon axial vector form factor is investigated in a light-cone quark spectator-diquark model, in which Melosh rotations are applied to both the quark and the vector diquark. It is found that this model gives a very good description of the available experimental data, and the results depend very little on the parameters of the model. The relation between the nucleon axial constant and the anomalous magnetic moment of nucleons is also discussed. Comment: 8 pages, Revtex4, 1 figure, version to be published in Phys. Rev.

    On Optimal Service Differentiation in Congested Network Markets

    As Internet applications have become more diverse in recent years, users with heavy demand for online video services are more willing than light users, who mainly use e-mail and instant messaging, to pay higher prices for better service. This encourages Internet Service Providers (ISPs) to explore service differentiation so as to optimize their profits and the allocation of network resources. Much prior work has focused on the viability of network service differentiation by comparison with single-class service. However, the optimal service differentiation for an ISP subject to resource constraints has remained unsolved. In this work, we establish an optimal control framework to derive the analytical solution to an ISP's optimal service differentiation, i.e., the optimal service qualities and associated prices. By analyzing the structure of the solution, we reveal how an ISP should adjust service qualities and prices to meet varying capacity constraints and user characteristics. We also obtain the conditions under which ISPs have strong incentives to implement service differentiation, and discuss whether regulators should encourage such practices.

    Sampling Online Social Networks via Heterogeneous Statistics

    Most sampling techniques for online social networks (OSNs) are based on a particular sampling method on a single graph, which is referred to as a statistic. However, multiple sampling methods on different graphs can be used in the same OSN, and they may lead to different sampling efficiencies, i.e., asymptotic variances. To utilize multiple statistics for accurate measurements, we formulate a mixture sampling problem, through which we construct a mixture unbiased estimator that minimizes the asymptotic variance. Given fixed sampling budgets for the different statistics, we derive the optimal weights to combine the individual estimators; given a fixed total budget, we show that a greedy allocation towards the most efficient statistic is optimal. In practice, the sampling efficiencies of the statistics can be quite different for various targets and are unknown before sampling. To solve this problem, we design a two-stage framework that adaptively spends part of the budget to test the different statistics and allocates the remaining budget to the inferred best statistic. We show that our two-stage framework generalizes both 1) randomly choosing a statistic and 2) evenly allocating the total budget among all available statistics, and that our adaptive algorithm achieves higher efficiency than these benchmark strategies both in theory and in experiments.
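The fixed-budget combination step can be illustrated with standard inverse-variance weighting, which is the textbook way to minimize the variance of a weighted average of unbiased estimators; the paper's derived weights may differ in detail, and the estimate and variance values below are made-up illustrations.

```python
import numpy as np

def mixture_estimate(estimates, variances):
    """Combine unbiased estimators of the same target by
    inverse-variance weighting, which minimizes the variance
    of the weighted average."""
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    w /= w.sum()                              # normalized weights
    combined = float(np.dot(w, estimates))    # mixture estimate
    combined_var = 1.0 / np.sum(1.0 / v)      # mixture variance
    return combined, combined_var, w

# Three hypothetical statistics estimating the same quantity:
est = [10.2, 9.8, 10.5]    # individual estimates
var = [4.0, 1.0, 16.0]     # their asymptotic variances
value, var_mix, weights = mixture_estimate(est, var)
# The mixture variance is never worse than the best single statistic.
assert var_mix <= min(var)
```

The more efficient a statistic (smaller variance), the more weight it receives, which is consistent with the greedy total-budget allocation described above.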

    An integrated wind risk warning model for urban rail transport in Shanghai, China

    The integrated wind risk warning model for rail transport presented here has four elements: background wind data, a wind field model, a vulnerability model, and a risk model. In this study, the background wind data are taken from observations. Using the wind field model with effective surface roughness lengths, the background wind data are interpolated to a 30-m resolution grid. In the vulnerability model, the aerodynamic characteristics of railway vehicles are analyzed with CFD (Computational Fluid Dynamics) modelling. In the risk model, the maximum value of three aerodynamic forces is used as the criterion to evaluate rail safety and to quantify the risk level under extreme wind conditions. The full model is tested for Shanghai Metro Line 16 using wind conditions during Typhoon Chan-hom. The proposed approach enables quick quantification of real-time safety risk levels during typhoon landfall, providing sophisticated warning information for rail vehicle operation safety.
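The risk-model step, in which the worst of three aerodynamic forces sets the warning level, can be sketched as follows. The general scaling (forces proportional to dynamic pressure, 0.5·rho·v²) is standard aerodynamics, but the coefficients, reference areas, force limits, and level thresholds below are placeholder assumptions, not values from the study.

```python
RHO = 1.225  # air density, kg/m^3 (standard sea-level value)

def wind_risk_level(v, coeffs=(1.2, 0.9, 0.6), areas=(30.0, 75.0, 30.0),
                    limits=(40e3, 120e3, 35e3)):
    """Return (risk_ratio, level) for cross-wind speed v in m/s.
    The three entries stand in for three aerodynamic forces on the
    vehicle (e.g. side, lift, rolling); all numbers are placeholders."""
    q = 0.5 * RHO * v ** 2                                # dynamic pressure, Pa
    ratios = [c * a * q / lim for c, a, lim in zip(coeffs, areas, limits)]
    r = max(ratios)                                       # worst force-to-limit ratio
    level = "safe" if r < 0.5 else ("caution" if r < 1.0 else "stop")
    return r, level
```

Taking the maximum ratio means a single exceeded force limit is enough to trigger a warning, mirroring the "maximum of three aerodynamic forces" criterion in the abstract.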