
    Subscriber churn in the Australian ISP market

    Rapid growth in Internet use, combined with easy market entry by Internet service providers (ISPs), has resulted in a highly competitive supply of Internet services. Australian ISPs range in size from a few large national operators to niche ISPs focused on specialised services. With many ISPs currently not profitable, subscriber retention is an important aspect of survival. This study develops a model which relates the probability of subscriber churn to various service attributes and subscriber characteristics. Estimation results show that churn probability is positively associated with monthly ISP expenditure, but inversely related to household income. Pricing also matters, with subscribers preferring ISPs which offer flat-rate pricing arrangements.
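    The kind of binary-choice model the abstract describes can be sketched as a logistic regression of churn on expenditure, income, and a flat-rate pricing dummy. Everything below is an illustrative assumption, not the study's data or estimates: the sample is simulated and the coefficient values are placeholders whose signs simply mirror the reported findings.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000

    # Simulated subscriber data (placeholders, not the study's sample).
    spend = rng.gamma(2.0, 15.0, n)       # monthly ISP expenditure, $
    income = rng.gamma(3.0, 20.0, n)      # household income, $'000
    flat_rate = rng.integers(0, 2, n)     # 1 if the ISP offers flat-rate pricing

    # Assumed true coefficients mirror the reported signs: churn rises with
    # expenditure, falls with income, and falls under flat-rate pricing.
    eta = -1.0 + 0.03 * spend - 0.01 * income - 0.5 * flat_rate
    churn = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

    # Fit the logit by Newton's method (iteratively reweighted least squares).
    X = np.column_stack([np.ones(n), spend, income, flat_rate])
    beta = np.zeros(4)
    for _ in range(15):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        w = p * (1.0 - p)
        beta += np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (churn - p))

    print(dict(zip(["const", "spend", "income", "flat_rate"], beta.round(3))))
    ```

    With enough simulated subscribers the fitted signs recover the assumed ones, which is the qualitative pattern the abstract reports.
    
    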

    Plant species effects on soil organic matter turnover and nutrient release in forests and grasslands

    Fall 1996. Includes bibliographic references (pages 23-27). Although feedbacks between plant species and ecosystem dynamics have been demonstrated in a variety of terrestrial ecosystems, little research has examined the mechanistic relationship between plant species characteristics, the formation and turnover of soil carbon and nitrogen pools, and ecosystem processes such as net N mineralization. My objective was to examine two possible effects of species on soil C and N dynamics: changes in organic matter quality and changes in soil aggregation. For several forest ecosystems, litter lignin:N ratio correlated negatively (non-linearly) with net N mineralization, but the relationship did not apply to grass species. Climatic factors (temperature, precipitation) explained little of the variation in net N mineralization. The relationship between litter lignin:N ratio and net N mineralization from mineral soil and the forest floor was similar, suggesting that plant litter quality affects both forest floor and mineral soil organic matter quality. For tree species monocultures in Wisconsin, net N mineralization during 387-day laboratory incubations indicated that species alter the quality of readily decomposable pools of soil N, and have little effect on more recalcitrant soil N. Changes in the quality of soil N correlated positively with in situ net N mineralization. Grass species did not influence N mineralization. Neither grass nor tree species influenced soil C dynamics, but differences in soil characteristics between sites influenced soil C dynamics. Soil microbes appear to act as a “decay filter”, converting heterogeneous plant material into relatively homogeneous soil humus. Changes in soil aggregate size distribution should alter whole-soil C and N quality because different size aggregates contain organic matter of different quality. Although tree species slightly altered aggregate size distribution, aggregate size distribution related poorly to whole-soil C and net N mineralization. Tree species had no effect on the physical protection of organic matter in soil aggregates or on organic matter quality of different size aggregates. Species characteristics had little effect on soil C mineralization, but species-related changes in the quality of readily decomposable soil N pools (not the pool size) influenced net N mineralization. This suggests that the feedbacks between plant species and soil N cycling occur rapidly, ensuring an adequate nutrient supply when plant community structure changes.

    Advanced communications policy and adoption in rural Western Australia

    Recent moves toward contestable universal service markets for rural areas raise the issue of measuring the net cost of service provision. Measurement of net cost requires estimates of latent demand for advanced communications. This paper seeks for the first time to provide quantitative estimates of the magnitude of latent income pools available to carriers in rural Western Australia (WA). Estimates of latent expenditure on broadband services in rural WA are obtained using a combination of stated-preference and survey data. These expenditures increase with computer ownership, community isolation and information need. Further, the statistical model supports the commonly held belief that more distant populations have stronger information demands and are willing to pay for services. This finding suggests that carrier aversion to providing services to rural regions may not be justified on commercial grounds. Keywords: advanced communications; broadband service; internet rural access; universal service obligations.

    A new VLA/e-MERLIN limit on central images in the gravitational lens system CLASS B1030+074

    We present new VLA 22-GHz and e-MERLIN 5-GHz observations of CLASS B1030+074, a two-image strong gravitational lens system whose background source is a compact flat-spectrum radio quasar. In such systems we expect a third image of the background source to form close to the centre of the lensing galaxy. The existence and brightness of such images is important for investigation of the central mass distributions of lensing galaxies, but only one secure detection has been made so far in a galaxy-scale lens system. The noise levels achieved in our new B1030+074 images reach 3 μJy/beam and represent an improvement in central image constraints of nearly an order of magnitude over previous work, with correspondingly better resulting limits on the shape of the central mass profile of the lensing galaxy. Simple models with an isothermal outer power law slope now require either the influence of a central supermassive black hole, or an inner power law slope very close to isothermal, in order to suppress the central image below our detection limit. Using the central mass profiles inferred from light distributions in Virgo galaxies, moved to z=0.5, and matching to the observed Einstein radius, we now find that 45% of such mass profiles should give observable central images, 10% should give central images with a flux density still below our limit, and the remaining systems have extreme demagnification produced by the central SMBH. Further observations of similar objects will therefore allow proper statistical constraints to be placed on the central properties of elliptical galaxies at high redshift.
    Comment: Accepted by Monthly Notices of the Royal Astronomical Society. 16 pages, 8 figures.
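    The central-image suppression argument can be illustrated with a toy axisymmetric lens. The non-singular isothermal sphere used below is an assumption for illustration only, not the paper's actual lens model: it yields the familiar bright image pair plus a faint third image near the centre, and the third image's magnification collapses as the core shrinks, which is the limit a central SMBH or a near-isothermal inner cusp enforces.

    ```python
    import numpy as np

    THETA_E = 1.0  # Einstein radius sets the angular unit

    def alpha(theta, theta_c):
        """Deflection of a non-singular isothermal sphere (odd in theta)."""
        return THETA_E * (np.sqrt(theta**2 + theta_c**2) - theta_c) / theta

    def images(beta, theta_c):
        """Solve the 1-D lens equation beta = theta - alpha(theta) by
        scanning for sign changes and bisecting each bracket."""
        g = lambda t: t - alpha(t, theta_c) - beta
        grid = np.concatenate([np.linspace(-3, -1e-6, 3000),
                               np.linspace(1e-6, 3, 3000)])  # skip theta = 0
        roots = []
        for a, b in zip(grid[:-1], grid[1:]):
            if g(a) * g(b) < 0:
                for _ in range(60):  # bisection
                    m = 0.5 * (a + b)
                    a, b = (a, m) if g(a) * g(m) <= 0 else (m, b)
                roots.append(0.5 * (a + b))
        return roots

    def magnification(theta, theta_c, h=1e-7):
        """1/mu = (beta/theta) * d(beta)/d(theta) for an axisymmetric lens."""
        f = lambda t: t - alpha(t, theta_c)
        dbeta = (f(theta + h) - f(theta - h)) / (2 * h)
        return 1.0 / ((f(theta) / theta) * dbeta)

    beta_src = 0.1  # source offset in Einstein-radius units
    for theta_c in (0.10, 0.05, 0.02):
        th = images(beta_src, theta_c)
        central = min(th, key=abs)  # the image nearest the lens centre
        print(f"core {theta_c:4.2f}: {len(th)} images, "
              f"central |mu| = {abs(magnification(central, theta_c)):.4f}")
    ```

    Each core radius gives three images, and halving the core cuts the central image's magnification by roughly the square of the inner deflection slope, showing why a flux limit on the third image constrains the inner mass profile.
    
    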

    Approximate Data Structures with Applications

    In this paper we introduce the notion of approximate data structures, in which a small amount of error is tolerated in the output. Approximate data structures trade error of approximation for faster operation, leading to theoretical and practical speedups for a wide variety of algorithms. We give approximate variants of the van Emde Boas data structure, which support the same dynamic operations as the standard van Emde Boas data structure [28, 20], except that answers to queries are approximate. The variants support all operations in constant time provided the error of approximation is 1/polylog(n), and in O(log log n) time provided the error is 1/polynomial(n), for n elements in the data structure. We consider the tolerance of prototypical algorithms to approximate data structures. We study in particular Prim’s minimum spanning tree algorithm, Dijkstra’s single-source shortest paths algorithm, and an on-line variant of Graham’s convex hull algorithm. To obtain output which approximates the desired output with the error of approximation tending to zero, Prim’s algorithm requires only linear time, Dijkstra’s algorithm requires O(m log log n) time, and the on-line variant of Graham’s algorithm requires constant amortized time per operation.
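    A minimal sketch of the core trade-off, under stated assumptions: this is an illustrative bucketed structure, not the paper's van Emde Boas construction. Rounding keys in [0, U) into m equal-width buckets bounds the additive query error by U/m while keeping updates constant time; a real implementation would answer predecessor queries from a word-packed bitmap, where the linear scan below is used only for clarity.

    ```python
    class ApproxPredecessor:
        """Keys in [0, universe) are rounded down into `buckets` equal-width
        buckets; queries are answered to within one bucket width, trading
        approximation error for speed."""

        def __init__(self, universe, buckets):
            self.width = universe / buckets
            self.counts = [0] * buckets

        def insert(self, x):
            self.counts[int(x / self.width)] += 1    # O(1)

        def delete(self, x):
            self.counts[int(x / self.width)] -= 1    # O(1)

        def predecessor(self, x):
            # Returns the floor of the largest occupied bucket at or below x,
            # so the answer is at most one bucket width below the true
            # predecessor. (A word-packed bitmap would make this scan
            # effectively constant time.)
            for b in range(int(x / self.width), -1, -1):
                if self.counts[b]:
                    return b * self.width
            return None


    d = ApproxPredecessor(universe=1024, buckets=64)   # bucket width 16
    for key in (100, 500, 900):
        d.insert(key)
    print(d.predecessor(600))   # true predecessor is 500 -> prints 496.0
    ```

    Algorithms such as Dijkstra's, run on keys stored this way, inherit an output error proportional to the bucket width, which is the tolerance analysis the paper carries out for its van Emde Boas variants.
    
    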

    Design Drivers for a Viable Commercial Remote Sensing Space Architecture

    Private sector investment into new commercial remote sensing constellations over the past five years has exceeded $1B. The capabilities of these systems—many still in development or at pre-initial operational capability (IOC) thresholds—are intended to address a combination of new or underserved global markets. Additionally, the combination of ever-increasing technical collection requirements with static programmatic resources has now driven historically traditional space system operators to embrace hybrid architectures that leverage commercially-sourced data into their service baselines. While enticing to many new entrants, many considerations must be practically addressed to field an enduring, commercially viable space architecture solution. Foremost, it must deliver the expected type of data at sufficient quality that can be directly utilized by established users already sourcing other (typically exquisite) collections. Delivery of this capability must also necessarily be resilient, with business continuity secured through a mixture of customers that transcends venture-backed investments to a posture of sustained profitability. In 2017, Maxar (then operating as DigitalGlobe) decided to proceed with the self-financed development of a new $600M Earth observation constellation comprised of six high-resolution satellites that are only 30% of the weight of the prior generation, but leverage technological advances for affordability and performance. Before doing so, however, a rigorous system engineering and business analysis study was undertaken to thoroughly understand customer key performance parameters (KPPs) and design drivers to be addressed to ensure a delivered combination of product-market fit, flexibility/adaptability to evolving requirements, and overall capital efficiency. In this paper, we describe this effort to develop our design baseline and the corresponding operational commercial remote sensing constellation that will achieve its new IOC in 2021 to directly support both dedicated commercial and hybrid mission operator architectures.

    Experimental Investigation of the NASA Common Research Model with a Natural Laminar Flow Wing in the NASA Langley National Transonic Facility

    A test of the new NASA Common Research Model with a Natural Laminar Flow (CRM-NLF) semispan wing in the NASA Langley National Transonic Facility (NTF) was completed in October 2018. The main focus of this test was the evaluation of the extent of laminar flow on the CRM-NLF wing at various Reynolds numbers and test conditions. During this test, data were acquired at chord Reynolds numbers from 10 to 30 million and at Mach numbers ranging from 0.84 to 0.86. This investigation provided valuable insight into the necessary procedures for laminar flow testing in the NTF. It also significantly advanced the new carbon-based heating layer technique to improve the quality of transition visualization data from temperature sensitive paint (TSP) in a cryogenic wind tunnel.

    A Global Fit of Non-Relativistic Effective Dark Matter Operators Including Solar Neutrinos

    We perform a global fit of dark matter interactions with nucleons using a non-relativistic effective operator description, considering both direct detection and neutrino data. We examine the impact of combining the direct detection experiments CDMSlite, CRESST-II, CRESST-III, DarkSide-50, LUX, LZ, PandaX-II, PandaX-4T, PICO-60, SIMPLE, SuperCDMS, XENON100, and XENON1T along with neutrino data from IceCube and ANTARES. While current neutrino telescope data lead to increased sensitivity compared to underground nuclear scattering experiments for dark matter masses above 100 GeV, our future projections show that the next generation of underground experiments will significantly outpace solar searches for most dark matter-nucleon elastic scattering interactions.
    Comment: 12+9 pages, 26 figures. Likelihoods available at https://zenodo.org/records/1003221
