Mitigating the Effect of Free-Riders in BitTorrent using Trusted Agents
Even though Peer-to-Peer (P2P) systems present a cost-effective and scalable solution to content distribution, most entertainment, media, and software content providers continue to rely on expensive, centralized solutions such as Content Delivery Networks. One of the main reasons is that current P2P systems cannot guarantee reasonable performance, as they depend on the willingness of users to contribute bandwidth. Moreover, even systems like BitTorrent, which employ a tit-for-tat protocol to encourage fair bandwidth exchange between users, are prone to free-riding (i.e. peers that do not upload). Our experiments on PlanetLab extend previous research (e.g. LargeViewExploit, BitTyrant) demonstrating that such selfish behavior can seriously degrade the performance of regular users in many more scenarios beyond simple free-riding: we observed an overhead of up to 430% when 80% of identities are free-riders, identities that a small set of selfish users can easily generate. To mitigate the effects of selfish users, we propose a new P2P architecture that classifies peers with the help of a small number of trusted nodes that we call Trusted Auditors (TAs). TAs participate in P2P downloads like regular clients and detect free-riding identities by observing their neighbors' behavior. Using TAs, we can place compliant users in a separate service pool, resulting in better performance. Furthermore, we show that TAs are more effective at ensuring the performance of the system than a mere increase in bandwidth capacity: with 80% free-riding identities, a single-TA system has a 6% download-time overhead, whereas without the TA, and with three times the bandwidth capacity, we measure a 100% overhead.
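To make the auditing idea above concrete, here is a minimal sketch, in Python, of how a Trusted Auditor might label neighbors as compliant or free-riding from observed traffic; the share-ratio threshold, record format, and class names are illustrative assumptions, not the detection algorithm used in the paper.

    # Hypothetical sketch: a Trusted Auditor (TA) observes how much each neighbor
    # uploads versus downloads during a swarm session and flags identities whose
    # contribution stays below a threshold. Threshold and record layout are
    # assumptions for illustration only.
    from dataclasses import dataclass

    @dataclass
    class PeerRecord:
        peer_id: str
        bytes_uploaded_to_us: int = 0      # data this neighbor sent the TA
        bytes_downloaded_from_us: int = 0  # data the TA sent this neighbor

    class TrustedAuditor:
        def __init__(self, min_share_ratio: float = 0.1):
            self.min_share_ratio = min_share_ratio  # assumed cutoff for "free-rider"
            self.records: dict[str, PeerRecord] = {}

        def observe(self, peer_id: str, uploaded: int, downloaded: int) -> None:
            rec = self.records.setdefault(peer_id, PeerRecord(peer_id))
            rec.bytes_uploaded_to_us += uploaded
            rec.bytes_downloaded_from_us += downloaded

        def classify(self) -> dict[str, str]:
            """Label each observed neighbor as 'compliant' or 'free-rider'."""
            labels = {}
            for rec in self.records.values():
                taken = max(rec.bytes_downloaded_from_us, 1)
                ratio = rec.bytes_uploaded_to_us / taken
                labels[rec.peer_id] = (
                    "compliant" if ratio >= self.min_share_ratio else "free-rider"
                )
            return labels

    ta = TrustedAuditor()
    ta.observe("peer-A", uploaded=0, downloaded=4_000_000)        # never reciprocates
    ta.observe("peer-B", uploaded=2_500_000, downloaded=3_000_000)
    print(ta.classify())  # {'peer-A': 'free-rider', 'peer-B': 'compliant'}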
Aequitas: A Trusted P2P System for Paid Content Delivery
P2P file-sharing has been recognized as a powerful and efficient distribution model due to its ability to leverage users' upload bandwidth. However, companies that sell digital content on-line are hesitant to rely on P2P models for paid content distribution due to the free file-sharing inherent in P2P models. In this paper we present Aequitas, a P2P system in which users share paid content anonymously via a layer of intermediate nodes. We argue that with the extra anonymity in Aequitas, vendors could leverage P2P bandwidth while effectively maintaining the same level of trust towards their customers as in traditional models of paid content distribution. As a result, a content provider could reduce its infrastructure costs and subsequently lower the costs for the end-users. The intermediate nodes are incentivized to contribute their bandwidth via electronic micropayments. We also introduce techniques that prevent the intermediate nodes from learning the content of the files they help transmit. We present the design of our system, an analysis of its properties, and an implementation and experimental evaluation. We quantify the value of the intermediate nodes, both in terms of efficiency and in terms of their effect on anonymity. We argue in support of the economic and technological merits of the system.
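The abstract does not describe how intermediate nodes are kept from learning file contents; one generic way to obtain that property, sketched below purely as an illustration (not Aequitas' actual protocol), is for the vendor to encrypt each chunk and hand the key only to the paying user, so relays forward ciphertext they cannot read. The sketch assumes the third-party Python cryptography package.

    # Illustrative only: a generic way an intermediary can relay paid content
    # without learning it -- the vendor encrypts each chunk, the intermediate node
    # forwards ciphertext, and the buyer gets the key directly from the vendor.
    # This is NOT the Aequitas protocol; it uses the third-party `cryptography`
    # package (pip install cryptography) for the symmetric cipher.
    from cryptography.fernet import Fernet

    # Vendor side: encrypt the content chunk and keep the key.
    key = Fernet.generate_key()
    vendor_cipher = Fernet(key)
    chunk = b"paid content chunk #17"
    ciphertext = vendor_cipher.encrypt(chunk)

    # Intermediate node: sees and forwards only ciphertext (earning a micropayment).
    relayed = ciphertext  # the relay cannot recover `chunk` without `key`

    # Buyer side: receives the key from the vendor over a separate channel
    # (e.g. at purchase time) and decrypts the relayed chunk.
    buyer_cipher = Fernet(key)
    assert buyer_cipher.decrypt(relayed) == chunk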
A Case for P2P Delivery of Paid Content
P2P file sharing provides a powerful content distribution model by leveraging users' computing and bandwidth resources. However, companies have been reluctant to rely on P2P systems for paid content distribution due to their inability to limit the exploitation of these systems for free file sharing. We present TP2, a system that combines the more cost-effective and scalable distribution capabilities of P2P systems with a level of trust and control over content distribution similar to direct-download content delivery networks. TP2 uses two key mechanisms that can be layered on top of existing P2P systems. First, it provides strong authentication to prevent free file sharing in the system. Second, it introduces a new notion of trusted auditors to detect and limit malicious attempts to gain information about participants in the system in order to facilitate additional out-of-band free file sharing. We analyze TP2 by modeling it as a novel game between malicious users who try to form free file-sharing clusters and trusted auditors who curb the growth of such clusters. Our analysis shows that a small fraction of trusted auditors is sufficient to protect the P2P system against unauthorized file sharing. Using a simple economic model, we further show that TP2 provides a more cost-effective content distribution solution, resulting in higher profits for a content provider even in the presence of a large percentage of malicious users. Finally, we implemented TP2 on top of BitTorrent and used PlanetLab to show that our system can provide trusted P2P file sharing with negligible performance overhead.
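A toy Monte Carlo sketch of the intuition behind the auditor game (not the paper's actual game model): a malicious peer that solicits k other peers for out-of-band sharing is caught as soon as one of them turns out to be an auditor, so with auditor fraction p the detection probability is 1 - (1 - p)^k. All parameter values are illustrative assumptions.

    # Toy simulation of the auditor-detection intuition; the real paper models a
    # richer game. Each solicited peer is independently an auditor with
    # probability p, and any auditor among the solicited peers means detection.
    import random

    def detection_rate(auditor_fraction: float, solicitations: int,
                       trials: int = 100_000) -> float:
        detected = 0
        for _ in range(trials):
            if any(random.random() < auditor_fraction for _ in range(solicitations)):
                detected += 1
        return detected / trials

    for p in (0.01, 0.05, 0.10):
        print(f"auditors={p:.0%}  k=20 solicitations  "
              f"simulated~{detection_rate(p, 20):.2%}  analytic={1 - (1 - p) ** 20:.2%}")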
Can P2P Replace Direct Download for Content Distribution?
While peer-to-peer (P2P) file-sharing is a powerful and cost-effective content distribution model, most paid-for digital-content providers (CPs) rely on direct download to deliver their content. CPs such as Apple iTunes that command a large base of paying users are hesitant to use a P2P model that could easily degrade their user base into yet another free file-sharing community. We present TP2, a system that makes P2P file sharing a viable delivery mechanism for paid digital content by providing the same security properties as the currently used direct-download model. TP2 introduces the novel notion of trusted auditors (TAs) -- P2P peers that are controlled by the system operator. TAs monitor the behavior of other peers and help detect and prevent the formation of illegal file-sharing clusters among the CP's user base. TAs both complement and exploit the strong authentication and authorization mechanisms that are used in TP2 to control access to content. It is important to note that TP2 does not attempt to solve the out-of-band file-sharing or DRM problems, which also exist in the direct-download systems currently in use. We analyze TP2 by modeling it as a novel game between misbehaving users who try to form unauthorized file-sharing clusters and TAs who curb the growth of such clusters. Our analysis shows that a small fraction of TAs is sufficient to protect the P2P system against unauthorized file sharing. In a system with as many as 60% misbehaving users, even a small fraction of TAs can detect 99% of unauthorized cluster formation. We developed a simple economic model to show that even with such a large fraction of malicious nodes, TP2 can improve the CP's profits (which could translate to user savings) by 62 to 122%, even under conservative estimates of content and bandwidth costs. We implemented TP2 as a layer on top of BitTorrent and demonstrated experimentally using PlanetLab that our system provides trusted P2P file sharing with negligible performance overhead.
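The profit argument can be illustrated with a back-of-the-envelope calculation: shifting most of the bandwidth to peers removes a large cost term. The prices, file size, and peer-served fraction below are made-up assumptions for illustration, not the estimates used in the paper.

    # Back-of-the-envelope sketch of why P2P delivery can raise a content
    # provider's profit: part of the bandwidth cost shifts from direct download
    # to users' upload capacity. All numbers are illustrative assumptions.
    def profit(downloads: int, price: float, content_cost: float,
               file_size_gb: float, bandwidth_cost_per_gb: float,
               fraction_served_by_peers: float) -> float:
        revenue = downloads * price
        licensing = downloads * content_cost
        gb_served_by_cp = downloads * file_size_gb * (1.0 - fraction_served_by_peers)
        bandwidth = gb_served_by_cp * bandwidth_cost_per_gb
        return revenue - licensing - bandwidth

    common = dict(downloads=1_000_000, price=9.99, content_cost=7.00,
                  file_size_gb=4.0, bandwidth_cost_per_gb=0.10)
    direct = profit(**common, fraction_served_by_peers=0.0)
    p2p = profit(**common, fraction_served_by_peers=0.8)
    print(f"direct download: ${direct:,.0f}   P2P (80% peer-served): ${p2p:,.0f}")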
Statistical criteria for characterizing irradiance time series.
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or simulated using models. Simulations of irradiance are often calibrated to, or generated from, statistics for observed irradiance, and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
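The abstract does not name the three statistics examined; the sketch below computes a few statistics commonly used to characterize irradiance time series (mean level, variability of ramp rates, lag-1 autocorrelation) as stand-in examples only.

    # Illustrative statistics for an irradiance time series. These are common
    # choices, not necessarily the three criteria examined in the paper.
    import statistics

    ghi = [0, 45, 180, 420, 650, 805, 890, 860, 720, 510, 260, 60, 0]  # W/m^2, hourly

    ramps = [b - a for a, b in zip(ghi, ghi[1:])]   # step-to-step changes
    mean_ghi = statistics.fmean(ghi)
    ramp_variability = statistics.pstdev(ramps)     # spread of ramp rates

    # Lag-1 autocorrelation: how strongly each value predicts the next one.
    mu = mean_ghi
    num = sum((a - mu) * (b - mu) for a, b in zip(ghi, ghi[1:]))
    den = sum((x - mu) ** 2 for x in ghi)
    lag1_autocorr = num / den

    print(f"mean GHI = {mean_ghi:.1f} W/m^2")
    print(f"ramp variability (std of deltas) = {ramp_variability:.1f} W/m^2")
    print(f"lag-1 autocorrelation = {lag1_autocorr:.2f}")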
Global horizontal irradiance clear sky models: implementation and analysis.
Clear sky models estimate the terrestrial solar radiation under a cloudless sky as a function of the solar elevation angle, site altitude, aerosol concentration, water vapor, and various atmospheric conditions. This report provides an overview of a number of global horizontal irradiance (GHI) clear sky models, from very simple to complex. Validation of clear-sky models requires comparison of model results to measured irradiance during clear-sky periods. To facilitate validation, we present a new algorithm for automatically identifying clear-sky periods in a time series of GHI measurements. We evaluate the performance of selected clear-sky models using measured data from 30 different sites, totaling about 300 site-years of data, and we analyze the variation of model errors across time and location. In terms of error averaged over all locations and times, we find that complex models that correctly account for all the atmospheric parameters are slightly more accurate than other models, but, primarily at low elevations, comparable accuracy can be obtained from some simpler models. However, simpler models often exhibit errors that vary with time of day and season, whereas the errors for complex models vary less over time.
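As a concrete example from the "very simple" end of the model spectrum, the Haurwitz model estimates clear-sky GHI from the solar zenith angle alone; the coefficients below follow the commonly cited form and should be checked against the report, which may evaluate a different formulation.

    # Haurwitz clear-sky model (commonly cited form): GHI depends only on the
    # solar zenith angle. Coefficients are illustrative of that form.
    import math

    def haurwitz_clear_sky_ghi(zenith_deg: float) -> float:
        """Clear-sky global horizontal irradiance (W/m^2) for a solar zenith angle."""
        cos_z = math.cos(math.radians(zenith_deg))
        if cos_z <= 0.0:          # sun at or below the horizon
            return 0.0
        return 1098.0 * cos_z * math.exp(-0.059 / cos_z)

    for z in (0, 30, 60, 80):
        print(f"zenith {z:2d} deg -> clear-sky GHI ~ {haurwitz_clear_sky_ghi(z):6.1f} W/m^2")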
Introductory Clifford analysis
In this chapter an introduction is given to Clifford analysis and the underlying Clifford algebras. The functions under consideration are defined on Euclidean space and take values in the universal real or complex Clifford algebra, the structure and properties of which are also recalled in detail. The function theory is centered around the notion of a monogenic function, which is a null solution of a generalized Cauchy–Riemann operator that is rotation invariant and factorizes the Laplace operator. In this way, Clifford analysis may be considered both a generalization to higher dimension of the theory of holomorphic functions in the complex plane and a refinement of classical harmonic analysis. A notion of monogenicity may also be associated with the vectorial part of the Cauchy–Riemann operator, which is called the Dirac operator; some attention is paid to the intimate relation between both notions. Since a product of monogenic functions is, in general, no longer monogenic, it is crucial to possess some tools for generating monogenic functions: such tools are provided by Fueter's theorem on the one hand and the Cauchy–Kovalevskaya extension theorem on the other. A cornerstone in this function theory is the Cauchy integral formula for the representation of a monogenic function in the interior of its domain of monogenicity. Starting from this representation formula and related integral formulae, it is possible to consider integral transforms such as the Cauchy, Hilbert, and Radon transforms, which are important both within the theoretical framework and in view of possible applications.
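For readers unfamiliar with the notation, the central objects can be written down explicitly; the conventions below are the standard ones, and the chapter's own notation may differ in details (sign of the metric, naming of operators).

    % Standard conventions: Clifford generators satisfy e_j e_k + e_k e_j = -2\delta_{jk}.
    \[
    D = \partial_{x_0} + \sum_{j=1}^{m} e_j\,\partial_{x_j}
    \quad\text{(generalized Cauchy--Riemann operator)},
    \qquad
    \underline{\partial} = \sum_{j=1}^{m} e_j\,\partial_{x_j}
    \quad\text{(Dirac operator)}.
    \]
    % A Clifford-algebra-valued function f on an open set Omega in R^{m+1} is
    % (left) monogenic if D f = 0 on Omega. The factorization of the Laplacian reads
    \[
    D\overline{D} = \overline{D}D = \Delta_{m+1},
    \qquad
    \underline{\partial}^{\,2} = -\Delta_m,
    \qquad
    \overline{D} = \partial_{x_0} - \sum_{j=1}^{m} e_j\,\partial_{x_j}.
    \]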
A Standardised Procedure for Evaluating Creative Systems: Computational Creativity Evaluation Based on What it is to be Creative
Computational creativity is a flourishing research area, with a variety of creative systems being produced and developed. Creativity evaluation, however, has not kept pace with system development: there is an evident lack of systematic evaluation of the creativity of these systems in the literature. This is partially due to difficulties in defining what it means for a computer to be creative; indeed, there is no consensus on this for human creativity, let alone its computational equivalent. This paper proposes a Standardised Procedure for Evaluating Creative Systems (SPECS). SPECS is a three-step process: stating what it means for a particular computational system to be creative, and then deriving and performing tests based on these statements. To assist this process, the paper offers a collection of key components of creativity, identified empirically from discussions of human and computational creativity. Using this approach, the SPECS methodology is demonstrated through a comparative case study evaluating computational creativity systems that improvise music.
A proposal for a coordinated effort for the determination of brainwide neuroanatomical connectivity in model organisms at a mesoscopic scale
In this era of complete genomes, our knowledge of neuroanatomical circuitry remains surprisingly sparse. Such knowledge is, however, critical both for basic and clinical research into brain function. Here we advocate for a concerted effort to fill this gap, through systematic, experimental mapping of neural circuits at a mesoscopic scale of resolution suitable for comprehensive, brain-wide coverage, using injections of tracers or viral vectors. We detail the scientific and medical rationale and briefly review existing knowledge and experimental techniques. We define a set of desiderata, including brain-wide coverage; validated and extensible experimental techniques suitable for standardization and automation; a centralized, open-access data repository; compatibility with existing resources; and tractability with current informatics technology. We discuss a hypothetical but tractable plan for mouse, additional efforts for the macaque, and technique development for human. We estimate that the mouse connectivity project could be completed within five years with a comparatively modest budget.