Subsidization Competition: Vitalizing the Neutral Internet
Unlike telephone operators, which pay termination fees to reach the users of
another network, Internet Content Providers (CPs) do not pay the Internet
Service Providers (ISPs) of users they reach. While the consequent
cross-subsidization of CPs has nurtured content innovation at the edge of the
Internet, it reduces the investment incentives for the access ISPs to expand
capacity. As potential charges for terminating CPs' traffic are criticized
under the net neutrality debate, we propose to allow CPs to voluntarily
subsidize the usage-based fees induced by their content traffic for end-users.
We model the regulated subsidization competition among CPs under a neutral
network and show how deregulation of subsidization could increase an access
ISP's utilization and revenue, strengthening its investment incentives.
Although the competition might harm certain CPs, we find that the harm stems
mainly from high access prices rather than from subsidization itself. Our
results suggest that subsidization competition will increase the
competitiveness and welfare of the Internet content market; however, regulators
might need to regulate access prices if the access ISP market is not
competitive enough. We envision that subsidization competition could become a
viable model for the future Internet.
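
A minimal numerical sketch of the mechanism (illustrative only; the linear
demand form, parameter values, and names below are assumptions, not the
paper's model): a CP covers a fraction s of the usage-based fee its traffic
induces, which lowers the effective price end-users face, raises traffic, and
hence raises the ISP's usage revenue.

    # Toy linear demand: a CP's traffic falls in the effective per-unit
    # price users face, (1 - s) * p, where s is the subsidized fraction.
    def traffic(p, s, a=10.0, b=2.0):   # a, b are illustrative constants
        return max(a - b * (1.0 - s) * p, 0.0)

    p = 2.0                             # ISP's usage-based fee per unit
    for s in (0.0, 0.5, 1.0):
        t = traffic(p, s)
        print(f"subsidy {s:.0%}: traffic {t:.1f}, ISP usage revenue {p * t:.1f}")

A higher subsidy raises both utilization and usage revenue, mirroring the
investment-incentive effect the abstract claims.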
Analysis of the contour structural irregularity of skin lesions using wavelet decomposition
The boundary irregularity of skin lesions is of clinical significance for the early detection of
malignant melanomas and to distinguish them from other lesions such as benign moles. The
structural components of the contour are of particular importance. To extract the structure from
the contour, wavelet decomposition was used as these components tend to locate in the lower
frequency sub-bands. Lesion contours were modeled as signatures with scale normalization to
give position and frequency resolution invariance. Energy distributions among different wavelet
sub-bands were then analyzed to extract those with significant levels and differences to enable
maximum discrimination.
Based on the coefficients in the significant sub-bands, structural components
of the original contours were modeled, and a set of statistical and geometric
irregularity descriptors was developed and applied at each of the significant
sub-bands. The effectiveness of the descriptors was
measured using the Hausdorff distance between sets of data from melanoma and mole contours.
The best descriptor outputs were fed into a back-propagation neural network to
construct a combined classifier system. Experimental results showed that
thirteen features from four sub-bands produced the best discrimination between
sets of melanomas and moles, and that a small training set of nine melanomas
and nine moles was optimal.
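
The pipeline the abstract describes (contour signature, multi-level wavelet
decomposition, sub-band energy analysis) can be sketched as follows; the
wavelet family (db4), the decomposition depth, and the synthetic contour are
assumptions for illustration, not the paper's choices.

    import numpy as np
    import pywt  # PyWavelets

    # Synthetic lesion contour signature: radial distance versus angle for a
    # circle with a low-frequency bulge plus high-frequency noise.
    theta = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
    signature = 1.0 + 0.2 * np.sin(4 * theta) + 0.05 * np.random.randn(theta.size)
    signature = signature / np.abs(signature).max()   # scale normalization

    # Multi-level wavelet decomposition; structural components concentrate
    # in the lower-frequency sub-bands (cA4, cD4, ...).
    coeffs = pywt.wavedec(signature, "db4", level=4)  # [cA4, cD4, cD3, cD2, cD1]

    energies = np.array([np.sum(c ** 2) for c in coeffs])
    for name, share in zip(["cA4", "cD4", "cD3", "cD2", "cD1"],
                           energies / energies.sum()):
        print(f"{name}: {share:.3f} of contour energy")

Comparing such per-band energy shares between melanoma and mole contours is
the kind of discrimination the abstract's descriptors build on.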
Paid Peering, Settlement-Free Peering, or Both?
With the rapid growth of congestion-sensitive and data-intensive
applications, traditional settlement-free peering agreements with best-effort
delivery often do not meet the QoS requirements of content providers (CPs).
Meanwhile, Internet access providers (IAPs) feel that revenues from end-users
are not sufficient to recoup the upgrade costs of network infrastructures.
Consequently, some IAPs have begun to offer CPs a new type of peering
agreement, called paid peering, under which they provide CPs with better data
delivery quality for a fee. In this paper, we model a network platform where an
IAP makes decisions on the peering types offered to CPs and the prices charged
to CPs and end-users. We study the optimal peering schemes for the IAP, i.e.,
to offer CPs both the paid and settlement-free peering to choose from or only
one of them, as the objective is profit or welfare maximization. Our results
show that 1) the IAP should always offer the paid and settlement-free peering
under the profit-optimal and welfare-optimal schemes, respectively, 2) whether
to simultaneously offer the other peering type is largely driven by the type of
data traffic, e.g., text or video, and 3) regulators might want to encourage
the IAP to allocate more network capacity to the settlement-free peering for
increasing user welfare.
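
A stylized sketch of the IAP's menu decision (hypothetical valuations and
revenue terms; the paper's actual model is richer): CPs with valuation v for
premium delivery accept paid peering at fee q if v >= q; without a
settlement-free tier, the remaining CPs disconnect, costing the IAP its
user-side revenue from their traffic.

    import numpy as np

    v = np.array([1.0, 2.0, 3.0, 5.0, 8.0])  # illustrative CP valuations of QoS
    u = 1.5                                   # assumed user-side revenue per connected CP

    def iap_profit(q, offer_free_tier):
        paid = v >= q                         # CPs choosing paid peering at fee q
        connected = np.ones_like(paid) if offer_free_tier else paid
        return q * paid.sum() + u * connected.sum()

    # Compare menus (paid-only versus paid plus settlement-free) over candidate fees.
    for offer_free_tier in (False, True):
        best = max((iap_profit(q, offer_free_tier), q) for q in v)
        print(f"free tier {offer_free_tier}: profit {best[0]:.1f} at fee {best[1]:.1f}")

In this toy setting, keeping the free tier preserves user-side revenue from
low-valuation CPs while still monetizing high-valuation ones.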
On Optimal Service Differentiation in Congested Network Markets
As Internet applications have become more diverse in recent years, users with
heavy demand for online video services are more willing to pay higher prices
for better service than light users who mainly use e-mail and instant
messaging. This encourages Internet Service Providers (ISPs) to explore
service differentiations so as to optimize their profits and allocation of
network resources. Much prior work has focused on the viability of network
service differentiation by comparing with the case of a single-class service.
However, the optimal service differentiation for an ISP subject to resource
constraints has remained unsolved. In this work, we establish an optimal
control framework to derive the analytical solution to an ISP's optimal service
differentiation, i.e., the optimal service qualities and associated prices. By
analyzing the structures of the solution, we reveal how an ISP should adjust
the service qualities and prices in order to meet varying capacity constraints
and users' characteristics. We also obtain the conditions under which ISPs have
strong incentives to implement service differentiation and whether regulators
should encourage such practices.
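
A brute-force stand-in for the flavor of the problem (the paper derives the
solution analytically via an optimal control framework; the uniform user-type
distribution, the linear utility theta * quality - price, and the fixed
quality pair here are assumptions): users self-select the class that maximizes
their surplus, and the ISP searches over prices for maximum revenue.

    import numpy as np

    thetas = np.linspace(0.0, 1.0, 501)   # user types: willingness to pay for quality
    q_lo, q_hi = 0.5, 1.5                 # assumed qualities of the two service classes

    def revenue(p_lo, p_hi):
        s_lo = thetas * q_lo - p_lo       # surplus from the low class
        s_hi = thetas * q_hi - p_hi       # surplus from the high class
        buy_hi = (s_hi >= s_lo) & (s_hi >= 0.0)
        buy_lo = (s_lo > s_hi) & (s_lo >= 0.0)
        return p_hi * buy_hi.mean() + p_lo * buy_lo.mean()

    grid = np.linspace(0.0, 1.5, 31)
    best = max((revenue(pl, ph), pl, ph) for pl in grid for ph in grid)
    print(f"revenue {best[0]:.3f} at prices (low {best[1]:.2f}, high {best[2]:.2f})")

Re-running the search under different quality pairs imitates, crudely, how the
optimal prices shift with capacity constraints and user characteristics.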
Sampling Online Social Networks via Heterogeneous Statistics
Most sampling techniques for online social networks (OSNs) are based on a
particular sampling method over a single graph, referred to as a statistic.
However, various methods realized on different graphs can be used in the same
OSN, and they may lead to different sampling efficiencies, i.e., asymptotic
variances. To utilize multiple statistics for
accurate measurements, we formulate a mixture sampling problem, through which
we construct a mixture unbiased estimator which minimizes asymptotic variance.
Given fixed sampling budgets for different statistics, we derive the optimal
weights to combine the individual estimators; given fixed total budget, we show
that a greedy allocation towards the most efficient statistics is optimal. In
practice, the sampling efficiencies of statistics can be quite different for
various targets and are unknown before sampling. To solve this problem, we
design a two-stage framework which adaptively spends a partial budget to test
different statistics and allocates the remaining budget to the inferred best
statistic. We show that our two-stage framework generalizes both 1) randomly
choosing a single statistic and 2) evenly allocating the total budget among
all available statistics, and that our adaptive algorithm achieves higher
efficiency than these benchmark strategies in both theory and experiments.
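
The core combination step has a classical form: for unbiased estimators of the
same target, the minimum-variance unbiased mixture weights each estimator
inversely to its variance. A small sketch with synthetic data (the names and
setup are illustrative, not the paper's):

    import numpy as np

    rng = np.random.default_rng(0)
    target = 5.0   # the quantity every statistic estimates without bias

    def run_statistic(n_samples, noise_sd):
        """Return an unbiased estimate and its (known) variance."""
        samples = rng.normal(target, noise_sd, n_samples)
        return samples.mean(), noise_sd ** 2 / n_samples

    estimates, variances = zip(*(run_statistic(400, sd) for sd in (1.0, 3.0)))
    estimates, variances = np.array(estimates), np.array(variances)

    # Inverse-variance weights minimize the variance of the unbiased mixture.
    w = (1.0 / variances) / (1.0 / variances).sum()
    print("mixture estimate:", w @ estimates, "weights:", w)

The two-stage framework in the abstract effectively spends part of the budget
estimating those unknown variances before committing the rest.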
Video-based tutorial on web design for the technophobic teacher
This project aims to trace the factors affecting teachers' use of technology,
with a focus on Internet usage, and to offer steps that help teachers move
toward integrating the Internet into their curriculum.