
    Analyzing Peer Selection Policies for BitTorrent Multimedia On-Demand Streaming Systems in Internet

    The adaptation of the BitTorrent protocol to multimedia on-demand streaming systems essentially lies in the modification of its two core algorithms: the piece selection policy and the peer selection policy. Much more attention, however, has been given to the piece selection policy. Within this context, this article proposes three novel peer selection policies for the design of BitTorrent-like protocols targeted at such systems: the Select Balanced Neighbour Policy (SBNP), the Select Regular Neighbour Policy (SRNP), and the Select Optimistic Neighbour Policy (SONP). These proposals are validated through a competitive analysis based on simulations encompassing a variety of multimedia scenarios, defined in terms of key characterization parameters such as content type, content size, and client interactivity profile. Service time, number of clients served, and efficiency retrieving coefficient are the performance metrics assessed in the analysis. The results show that the novel proposals constitute scalable solutions that may be considered for real project designs. Future work is outlined in the conclusion of the paper. Comment: 19 pages
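    The abstract names the three policies without detailing their mechanics, so the following is a purely illustrative sketch of where a peer selection policy plugs into a BitTorrent-like streaming client; the load-balancing rule is a hypothetical stand-in for SBNP, not the paper's algorithm.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Neighbour:
        peer_id: str
        pending_requests: int = 0  # outstanding piece requests to this peer

    def select_balanced_neighbours(neighbours: list[Neighbour], k: int) -> list[Neighbour]:
        """Hypothetical balancing rule: pick the k least-loaded neighbours,
        spreading piece requests evenly across the swarm."""
        return sorted(neighbours, key=lambda n: n.pending_requests)[:k]

    peers = [Neighbour("a", 3), Neighbour("b", 0), Neighbour("c", 1)]
    print([p.peer_id for p in select_balanced_neighbours(peers, 2)])  # ['b', 'c']
    ```

    Under the same assumption, SRNP and SONP would slot into this hook with different ranking rules.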

    Speculative Concurrency Control for Real-Time Databases

    In this paper, we propose a new class of concurrency control algorithms especially suited to real-time database applications. Our approach relies on the use of (potentially) redundant computations to ensure that serializable schedules are found and executed as early as possible, thus increasing the chances of a timely commitment of transactions with strict timing constraints. Owing to this use of speculation, we collectively term these Speculative Concurrency Control (SCC) algorithms. SCC algorithms combine the advantages of both Pessimistic and Optimistic Concurrency Control (PCC and OCC) algorithms while avoiding their disadvantages. On the one hand, SCC resembles PCC in that conflicts are detected as early as possible, making alternative schedules available in a timely fashion in case they are needed. On the other hand, SCC resembles OCC in that it allows conflicting transactions to proceed concurrently, thus avoiding unnecessary delays that may jeopardize their timely commitment.
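    As a minimal sketch of the speculative idea (our own simplification, not the paper's algorithms): conflicts are detected eagerly, as under PCC, but the conflicting transaction keeps running, as under OCC, while a speculative alternative, serialized after the conflicting peer, is recorded so it can be adopted quickly if the optimistic execution must be discarded.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Transaction:
        tid: str
        read_set: set = field(default_factory=set)
        write_set: set = field(default_factory=set)
        shadows: list = field(default_factory=list)  # speculative alternative schedules

    def conflicts(a: Transaction, b: Transaction) -> bool:
        # Read-write or write-write overlap between two concurrent transactions.
        return bool(a.read_set & b.write_set or
                    a.write_set & b.read_set or
                    a.write_set & b.write_set)

    def on_access(running: list[Transaction], t: Transaction) -> None:
        """Detect conflicts eagerly (as PCC would) but let t proceed (as OCC would),
        recording a shadow that serializes t after each conflicting transaction."""
        for other in running:
            if other is not t and conflicts(t, other):
                t.shadows.append(f"rerun {t.tid} after {other.tid} commits")

    t1 = Transaction("T1", read_set={"x"}, write_set={"y"})
    t2 = Transaction("T2", write_set={"x"})
    on_access([t1, t2], t1)
    print(t1.shadows)  # ['rerun T1 after T2 commits']
    ```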

    IPOs cycle and investment in high-tech industries

    This paper analyses the effects of the Initial Public Offering (IPO) market on real investment decisions in emerging industries. We first propose a model of IPO timing based on divergence of opinion among investors and short-sale constraints. Using a real-options approach, we show that firms are more likely to go public when the ratio of overvaluation to profits is high, that is, after stock market run-ups. Because initial returns increase with the demand from optimistic investors at the time of the offer, the model provides an explanation for the observed positive causality between average initial returns and IPO volume. Second, we discuss the possibility of real overinvestment in high-tech industries. We argue that investing in the industry gives agents an option to sell the project on the stock market at an overvalued price, thereby enabling the financing of positive-NPV projects that would not be undertaken otherwise. We show, however, that the IPO market can also lead to overinvestment in new industries. Finally, we present econometric results supporting the idea that funds committed to the financing of high-tech industries may respond positively to optimistic stock market valuations.
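    As a stylized gloss of the timing rule, in our own notation rather than the paper's: the real-option solution takes the form of a trigger, with the firm exercising its option to go public the first time overvaluation relative to profits crosses a threshold, $\tau^* = \inf\{t \ge 0 : M_t/\Pi_t \ge \theta^*\}$, where $M_t$ is the optimists' valuation of the firm, $\Pi_t$ its current profits, and $\theta^*$ the threshold delivered by the option problem. Since market run-ups push $M_t/\Pi_t$ upward, such a rule implies that IPOs cluster after run-ups, consistent with the abstract's claim.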

    PDDLStream: Integrating Symbolic Planners and Blackbox Samplers via Optimistic Adaptive Planning

    Many planning applications involve complex relationships defined on high-dimensional, continuous variables. For example, robotic manipulation requires planning with kinematic, collision, visibility, and motion constraints involving robot configurations, object poses, and robot trajectories. These constraints typically require specialized procedures to sample satisfying values. We extend PDDL to support a generic, declarative specification for these procedures that treats their implementation as black boxes. We provide domain-independent algorithms that reduce PDDLStream problems to a sequence of finite PDDL problems. We also introduce an algorithm that dynamically balances exploring new candidate plans and exploiting existing ones. This enables the algorithm to greedily search the space of parameter bindings to more quickly solve tightly-constrained problems, as well as to locally optimize and produce low-cost solutions. We evaluate our algorithms on three simulated robotic planning domains as well as several real-world robotic tasks. Comment: International Conference on Automated Planning and Scheduling (ICAPS) 2020
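    A rough sketch of the control loop the abstract describes, with hypothetical names throughout (problem.optimistic(), plan.streams, and plan.bind() are our stand-ins, not the PDDLStream API): plan over optimistic placeholder facts, then try to ground the placeholders by calling the blackbox samplers, randomly balancing the exploration of new candidate plans against re-binding existing ones.

    ```python
    import random

    def solve(problem, planner, samplers, max_iters=100, p_explore=0.5):
        """Alternate between exploring new optimistic plans and exploiting
        previously found candidates whose stream placeholders are still unbound."""
        candidates = []  # plans whose placeholder values still need bindings
        for _ in range(max_iters):
            if candidates and random.random() > p_explore:
                plan = candidates.pop()               # exploit an existing candidate
            else:
                plan = planner(problem.optimistic())  # explore: plan over optimistic facts
                if plan is None:
                    return None                       # infeasible even optimistically
            # Try to bind each placeholder by calling its blackbox sampler.
            bindings = {s: samplers[s]() for s in plan.streams}
            if all(v is not None for v in bindings.values()):
                return plan.bind(bindings)            # fully grounded, executable plan
            candidates.append(plan)                   # retry the binding later
        return None
    ```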

    microPhantom: Playing microRTS under uncertainty and chaos

    This competition paper presents microPhantom, a bot that plays microRTS and participated in the 2020 microRTS AI competition. microPhantom is based on our previous bot, POAdaptive, which won the partially observable track of the 2018 and 2019 microRTS AI competitions. In this paper, we focus on decision-making under uncertainty by tackling the Unit Production Problem with a method based on a combination of Constraint Programming and decision theory. We show that using our method to decide which units to train significantly improves the win rate against the second-best microRTS bot from the partially observable track. We also show that our method is resilient in chaotic environments, with only a very small loss of efficiency. To allow for replicability and to facilitate further research, the source code of microPhantom is available, as is the Constraint Programming toolkit it uses.
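    For flavor only, here is a toy version of the Unit Production Problem under uncertainty (our own model, solved by brute force; microPhantom's actual formulation uses Constraint Programming and is richer): choose unit counts under a resource budget to maximize expected value against a belief distribution over hidden enemy strategies.

    ```python
    from itertools import product

    UNIT_COST = {"worker": 1, "light": 2, "heavy": 3}   # training cost per unit type
    VALUE = {"rush":   {"worker": 0, "light": 3, "heavy": 1},
             "turtle": {"worker": 1, "light": 1, "heavy": 3}}
    BELIEF = {"rush": 0.6, "turtle": 0.4}  # belief over the hidden enemy strategy
    BUDGET = 6
    UNITS = list(UNIT_COST)

    def expected_value(counts):
        # Expected army value, averaging over the belief distribution.
        return sum(BELIEF[s] * sum(counts[u] * VALUE[s][u] for u in UNITS)
                   for s in BELIEF)

    feasible = (dict(zip(UNITS, c))
                for c in product(range(BUDGET + 1), repeat=len(UNITS))
                if sum(n * UNIT_COST[u] for u, n in zip(UNITS, c)) <= BUDGET)
    print(max(feasible, key=expected_value))  # {'worker': 0, 'light': 3, 'heavy': 0}
    ```

    A real CP formulation would post the budget as a constraint and let a solver search the space, rather than enumerating combinations as this toy does.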

    What Next-Generation 21 cm Power Spectrum Measurements Can Teach Us About the Epoch of Reionization

    A number of experiments are currently working towards a measurement of the 21 cm signal from the Epoch of Reionization (EoR). Whether or not these experiments deliver a detection of cosmological emission, their limited sensitivity will prevent them from providing detailed information about the astrophysics of reionization. In this work, we consider what types of measurements will be enabled by a next generation of larger 21 cm EoR telescopes. To calculate the type of constraints that will be possible with such arrays, we use simple models for the instrument, foreground emission, and the reionization history. We focus primarily on an instrument modeled after the $\sim 0.1~\mathrm{km}^2$ collecting area Hydrogen Epoch of Reionization Array (HERA) concept design, and parameterize the uncertainties with regard to foreground emission by considering different limits to the recently described "wedge" footprint in k-space. Uncertainties in the reionization history are accounted for using a series of simulations which vary the ionizing efficiency and minimum virial temperature of the galaxies responsible for reionization, as well as the mean free path of ionizing photons through the IGM. Given various combinations of models, we consider the significance of the possible power spectrum detections, the ability to trace the power spectrum evolution versus redshift, the detectability of salient power spectrum features, and the achievable level of quantitative constraints on astrophysical parameters. Ultimately, we find that $0.1~\mathrm{km}^2$ of collecting area is enough to ensure a very high significance ($\gtrsim 30\sigma$) detection of the reionization power spectrum in even the most pessimistic scenarios. This sensitivity should allow for meaningful constraints on the reionization history and astrophysical parameters, especially if foreground subtraction techniques can be improved and successfully implemented. Comment: 27 pages, 18 figures, updated SKA numbers in appendix
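    For context on how such figures are typically quoted (our gloss; the paper's exact estimator may differ), the overall detection significance accumulates in quadrature over independently measured power spectrum bins, $\mathrm{SNR}_{\mathrm{tot}} = [\sum_i (\hat{P}(k_i)/\sigma_i)^2]^{1/2}$, where $\sigma_i$ is the combined thermal-noise and sample-variance error in bin $k_i$; more pessimistic wedge limits exclude more low-$k_\parallel$ modes from the sum and so lower the achievable significance.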

    The American consumer: reforming, or just resting?

    American households have received a triple dose of bad news since the beginning of the current recession: the greatest collapse in asset values since the Great Depression, a sharp tightening in credit availability, and a large increase in unemployment risk. We present measures of the size of these shocks and discuss what a benchmark theory says about their immediate and ultimate consequences. We then provide a forecast based on a simple empirical model that captures the effects of wealth shocks and unemployment fears. Our short-term forecast calls for somewhat weaker spending, and somewhat higher saving rates, than the Consensus survey of macroeconomic forecasters. Over the longer term, our best guess is that the personal saving rate will eventually approach the levels that preceded the period of financial liberalization that began in the late 1970s. Classification: C61, D11, E2

    A Lazy Bailout Approach for Dual-Criticality Systems on Uniprocessor Platforms

    A challenge in the design of cyber-physical systems is to integrate the scheduling of tasks of different criticality while still providing service guarantees for the higher-criticality tasks in case of resource shortages caused by faults. While standard real-time scheduling is agnostic to the criticality of tasks, the scheduling of tasks with different criticalities is called mixed-criticality scheduling. In this paper we present the Lazy Bailout Protocol (LBP), a mixed-criticality scheduling method in which low-criticality jobs overrunning their time budget cannot threaten the timeliness of high-criticality jobs, while at the same time the method tries to complete as many low-criticality jobs as possible. The key principle of LBP is that, instead of immediately abandoning low-criticality jobs when a high-criticality job overruns its optimistic WCET estimate, they are put in a low-priority queue for later execution. To compare mixed-criticality scheduling methods we introduce a formal quality criterion which, above all else, compares the schedulability of high-criticality jobs and only afterwards the schedulability of low-criticality jobs. Based on this criterion we prove that LBP behaves better than the original Bailout Protocol (BP). We show that LBP can be further improved by slack-time exploitation and by gain-time collection at runtime, resulting in LBPSG. We also show that these improvements of LBP perform better than the analogous improvements based on BP.
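    A minimal sketch of the lazy-bailout idea as we read the abstract (bailout-fund accounting, slack exploitation, and gain-time collection are simplified away; this is not the paper's full protocol): on a high-criticality overrun, low-criticality jobs are demoted to a background queue instead of being abandoned, and one is resumed whenever slack appears.

    ```python
    from collections import deque

    class LazyBailoutScheduler:
        def __init__(self):
            self.ready = deque()       # normal-priority ready queue
            self.background = deque()  # demoted low-criticality jobs

        def on_overrun(self, high_job):
            """A high-criticality job exceeded its optimistic WCET budget:
            demote low-criticality jobs instead of abandoning them (lazy bailout)."""
            for job in [j for j in self.ready if j["criticality"] == "LO"]:
                self.ready.remove(job)
                self.background.append(job)

        def on_idle(self):
            """Slack detected: resume one demoted low-criticality job, if any."""
            if self.background:
                self.ready.append(self.background.popleft())

    sched = LazyBailoutScheduler()
    sched.ready.extend([{"id": 1, "criticality": "HI"}, {"id": 2, "criticality": "LO"}])
    sched.on_overrun(sched.ready[0])
    print([j["id"] for j in sched.background])  # [2]
    ```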

    More Democracy Is Not Better Democracy: Cain's Case for Reform Pluralism

    This article is part of a symposium on Bruce Cain's "Democracy More or Less: America's Political Reform Quandary." It identifies the basic normative framework of Cain's skeptical "reform pluralism" as a form of democratic instrumentalism rather than political realism, and then argues that a more optimistic instrumentalist alternative is available. The instrumentalist can accept that more democracy need not entail better democracy. But the instrumentalist account of better democracy also gives us reason to believe that significant reform efforts remain worth pursuing, for the simple reason that some of them have worked in the past.