
    On requirements for a satellite mission to measure tropical rainfall

    Tropical rainfall data are crucial in determining the role of tropical latent heating in driving the circulation of the global atmosphere. The data are also particularly important for testing the realism of climate models and their ability to simulate and predict climate accurately on the seasonal time scale. Other scientific issues, such as the effects of El Niño on climate, could be addressed with a reliable, extended time series of tropical rainfall observations. A passive microwave sensor is planned to provide information on the integrated column precipitation content, its areal distribution, and its intensity. An active microwave sensor (radar) will define the layer depth of the precipitation and provide information about the intensity of rain reaching the surface, the key to determining the latent heat input to the atmosphere. A visible/infrared sensor will provide very high resolution information on cloud coverage, type, and top temperatures, and will also serve as the link between these data and the long and virtually continuous coverage of the geosynchronous meteorological satellites. The unique combination of sensor wavelengths, coverages, and resolving capabilities, together with the low-altitude, non-Sun-synchronous orbit, provides a sampling capability that should yield monthly precipitation amounts to a reasonable accuracy over a 500- by 500-km grid.

    Accurate and efficient SLA compliance monitoring


    Measuring And Improving Internet Video Quality Of Experience

    Streaming multimedia content over the IP network is poised to be the dominant Internet traffic for the coming decade, predicted to account for more than 91% of all consumer traffic in the coming years. Streaming multimedia content ranges from Internet television (IPTV), video on demand (VoD), and peer-to-peer streaming to 3D television over IP. Widespread acceptance, growth, and subscriber retention are contingent upon network providers assuring superior Quality of Experience (QoE) on top of today's Internet. This work presents the first empirical understanding of the Internet's video-QoE capabilities, and tools and protocols to efficiently infer and improve them. To infer video-QoE at arbitrary nodes in the Internet, we design and implement MintMOS: a lightweight, real-time, no-reference framework for capturing perceptual quality. We demonstrate that MintMOS's projections closely match subjective surveys in assessing perceptual quality. We use MintMOS to characterize Internet video-QoE both at the link level and the end-to-end path level. As an input to our study, we use extensive measurements from a large number of Internet paths obtained from various measurement overlays deployed using PlanetLab. Link-level degradations of intra- and inter-ISP Internet links are studied to create an empirical understanding of their shortcomings and ways to overcome them. Our studies show that intra-ISP links are often poorly engineered compared to peering links, and that degradations are induced by transient network load imbalance within an ISP. Initial results also indicate that overlay networks could be a promising way to avoid such ISPs in times of degradation. A large number of end-to-end Internet paths are probed, and we measure delay, jitter, and loss rates.
The measurement data are analyzed offline to identify ways to enable a source to select alternate paths in an overlay network to improve video-QoE, without the need for background monitoring or a priori knowledge of path characteristics. We establish that for any unstructured overlay of N nodes, it is sufficient to reroute key frames using a random subset of k nodes in the overlay, where k is bounded by O(ln N). We analyze various properties of such random subsets to derive a simple, scalable, and efficient path selection strategy that results in a k-fold increase in path options for any source-destination pair; options that consistently outperform Internet path selection. Finally, we design a prototype called source-initiated frame restoration (SIFR) that employs random subsets to derive alternate paths and demonstrate its effectiveness in improving Internet video-QoE.
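The O(ln N) relay-subset rule described above can be sketched in a few lines. The constant factor c, the node names, and the fixed seed are illustrative assumptions, not values from the dissertation:

```python
import math
import random

def candidate_relays(overlay_nodes, c=2.0, seed=None):
    """Pick a uniform random subset of k = ceil(c * ln N) relay nodes
    from an unstructured overlay of N nodes, following the O(ln N)
    bound on the subset size."""
    n = len(overlay_nodes)
    k = min(n, math.ceil(c * math.log(n)))
    rng = random.Random(seed)
    return rng.sample(overlay_nodes, k)

nodes = [f"node-{i}" for i in range(100)]
relays = candidate_relays(nodes, seed=42)
print(len(relays))  # ceil(2 * ln 100) = 10
```

A source would then probe only these k candidates, rather than all N nodes, when a key frame needs an alternate path, which is what keeps the strategy scalable.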

    New insights from low-temperature thermochronology into the tectonic and geomorphologic evolution of the south-eastern Brazilian highlands and passive margin

    The South Atlantic passive margin along the south-eastern Brazilian highlands exhibits a complex landscape, including a northern inselberg area and a southern elevated plateau, separated by the Doce River valley. This landscape is set on the Proterozoic to early Paleozoic rocks of the region that was once the hot core of the Araçuaí orogen, in Ediacaran to Ordovician times. Due to the break-up of Gondwana and the consequent opening of the South Atlantic during the Early Cretaceous, those rocks of the Araçuaí orogen became the basement of a portion of the South Atlantic passive margin and the related south-eastern Brazilian highlands. Our goal is to provide a new set of constraints on the thermo-tectonic history of this portion of the south-eastern Brazilian margin and the related surface processes, and to propose a hypothesis on the geodynamic context since break-up. To this end, we combine the apatite fission track (AFT) and apatite (U-Th)/He (AHe) methods as input for inverse thermal history modelling. All our AFT and AHe central ages are Late Cretaceous to early Paleogene. The AFT ages vary between 62 Ma and 90 Ma, with mean track lengths between 12.2 μm and 13.6 μm. AHe ages are found to be equivalent to AFT ages within uncertainty, albeit with the AHe ages exhibiting a lesser degree of confidence. We relate this Late Cretaceous-Paleocene basement cooling to uplift with accelerated denudation at that time. Spatial variation of the denudation time can be linked to differential reactivation of the Precambrian structural network and differential erosion due to a complex interplay with the drainage system. We argue that subsequent large-scale sedimentation in the offshore basins may be a result of flexural isostasy combined with an expansion of the drainage network.
We put forward the combined compression of the Mid-Atlantic ridge and the Peruvian phase of the Andean orogeny, potentially augmented by thermal weakening of the lower crust by the Trindade thermal anomaly, as a probable cause for the uplift.

    Mixing and Deposition in a Jack Pine Forest Canopy

    To study how aerosols mix in and deposit to forests, a tower was erected in a jack pine forest as part of the York Athabasca Jack Pine project. The tower is surrounded by anthropogenic pollution sources from the Alberta Oil Sands operations. From previous studies, we expected that canopies inhibit mixing and deposition. During the study, the air within the forest was often decoupled from the air above. Mixing at the study site took up to 40 minutes during periods when the canopy was decoupled, compared to less than 2 minutes when the canopy was coupled. At different times during the campaign, the forest was either a sink or a source of aerosols. The mean aerosol deposition velocity, an important parameter used by deposition models, was measured in this boreal forest. A local minimum of v_d (with respect to particle diameter) of 0.16 cm/s was observed at D = 150 nm.
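As a point of reference, the deposition velocity reported above relates ambient concentration to the flux of particles removed by the canopy via the standard definition F = v_d · C. The sketch below applies that definition; the concentration value is an assumed illustrative input, not a measurement from the study:

```python
def deposition_flux(v_d_cm_per_s, conc_per_cm3):
    """Particle number flux to the surface, F = v_d * C, the standard
    definition relating deposition velocity to ambient concentration."""
    return v_d_cm_per_s * conc_per_cm3

# v_d = 0.16 cm/s (the local minimum at D = 150 nm reported above)
# with an assumed ambient concentration of 1000 particles/cm^3:
print(deposition_flux(0.16, 1000.0))  # about 160 particles cm^-2 s^-1
```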

    Applying a Dynamical Systems Model and Network Theory to Major Depressive Disorder

    Mental disorders like major depressive disorder can be seen as complex dynamical systems. In this study we investigate the dynamic behaviour of individuals to see whether or not we can expect a transition to another mood state. We introduce a mean-field model of a binomial process, where we reduce a dynamic multidimensional system (a stochastic cellular automaton) to a one-dimensional system to analyse the dynamics. Using maximum likelihood estimation, we can estimate the parameter of interest which, in combination with a bifurcation diagram, reflects the expectancy that someone will transition to another mood state. After validating the proposed method with simulated data, we apply it to two empirical examples, showing its use in a clinical sample of patients diagnosed with major depressive disorder and in a general population sample. Results showed that the majority of the clinical sample was categorized as having an expectancy for a transition, while the majority of the general population sample did not. We conclude that the mean-field model has great potential in assessing the expectancy of a transition between mood states. With some extensions it could, in the future, aid clinical therapists in the treatment of depressed patients.
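The maximum likelihood step in the abstract is simple to illustrate: for a binomial observation model, the MLE of the success parameter is the sample proportion. The function and data below are a toy sketch, not the authors' implementation:

```python
def binomial_mle(k_successes, n_trials):
    """MLE of the binomial parameter p given k successes in n trials:
    p_hat = k / n, which maximizes the binomial log-likelihood."""
    if n_trials <= 0:
        raise ValueError("need at least one trial")
    if not 0 <= k_successes <= n_trials:
        raise ValueError("k must lie in [0, n]")
    return k_successes / n_trials

# toy example: 37 of 50 observed symptom states are "active"
print(binomial_mle(37, 50))  # 0.74
```

In the paper's setting, an estimate like this, read against a bifurcation diagram of the one-dimensional mean-field system, indicates whether a transition to another mood state is to be expected.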

    Real world evaluation of techniques for mitigating the impact of packet losses on TCP performance

    The real-world impact of network losses on the performance of the Transmission Control Protocol (TCP), the dominant transport protocol used for Internet data transfer, is not well understood. A detailed understanding of this impact, and of TCP's efficiency in dealing with losses, would prove useful for optimizing TCP design. Past work in this area is limited in its accuracy, depth of analysis, and scale. In this dissertation, we make three main contributions to address these issues: (i) we design a methodology for in-depth and accurate passive analysis of TCP traces, (ii) we systematically evaluate the impact of design parameters associated with TCP loss detection/recovery mechanisms on its performance, and (iii) we systematically evaluate the ability of Delay-Based Congestion Estimators (DBCEs) to predict losses and help avoid them. We develop a passive analysis tool, TCPdebug, which accurately tracks TCP sender state for many prominent OSes (Windows, Linux, Solaris, and FreeBSD/MacOS) and accurately classifies segments that appear out of sequence in a TCP trace. This tool has been extensively validated using controlled lab experiments as well as against real Internet connections. Its accuracy exceeds 99%, which is double the accuracy of current loss classification tools. Using TCPdebug, we analyze traces of more than 2.8 million Internet connections to study the efficiency of current TCP loss detection/recovery mechanisms. Using models to capture the impact of the configuration of these mechanisms on the durations of TCP connections, we find that the recommended as well as the widely implemented configurations for these mechanisms are fairly sub-optimal. Our analysis suggests that the durations of up to 40% of Internet connections can be reduced by more than 10% by reconfiguring prominent TCP stacks. Finally, we investigate the ability of several popular DBCEs to predict (and help avoid) losses using estimates of network queuing delay.
We find that aggressive predictors work much better than conservative predictors. We also study the impact of connection characteristics, such as packet loss rate, flight size, and throughput, on the performance of a DBCE. We find that high-throughput connections benefit the most from any DBCE. This indicates that DBCEs hold significant promise for future high-speed networks.
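As a rough illustration of how a delay-based congestion estimator works, the sketch below flags impending loss when the current queuing delay (the latest RTT sample minus the minimum observed RTT) exceeds a fraction of the observed delay range. The specific rule and threshold are assumptions for illustration, not one of the estimators evaluated in the dissertation:

```python
def dbce_predict(rtt_samples_ms, threshold=0.5):
    """Flag congestion when the most recent queuing delay exceeds
    `threshold` of the observed delay range (min RTT to max RTT)."""
    base = min(rtt_samples_ms)   # proxy for the propagation delay
    peak = max(rtt_samples_ms)
    if peak == base:
        return False             # no queuing observed yet
    queuing = rtt_samples_ms[-1] - base
    return queuing > threshold * (peak - base)

# RTT climbs from 40 ms toward 60 ms; the latest sample (55 ms)
# implies 15 ms of queuing, above half of the 20 ms range:
print(dbce_predict([40, 42, 41, 60, 58, 55]))  # True
```

A more aggressive predictor (a lower threshold) reacts earlier, which matches the finding above that aggressive predictors outperform conservative ones.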

    Assessment of air quality in Northern China by using the COSMO-ART model in conjunction with satellite and ground-based data

    Aerosol air pollution is one of the biggest environmental problems in the Chinese capital, Beijing. In particular, mineral dust, which is frequently transported into the urban area from the vast Asian drylands, leads to a drastic deterioration of air quality. This work is a detailed study of the spatio-temporal dynamics of this transported mineral dust and of its physical interaction with locally produced anthropogenic particles.