    Di Bao: a guaranteed minimum income in urban China?

    Concerns about incentives and targeting naturally arise when cash transfers are used to fight poverty. The authors address these concerns in the context of China's Di Bao program, which uses means-tested transfers to try to ensure that no registered urban resident has an income below a stipulated poverty line. There is little sign in the data of poverty traps due to high benefit withdrawal rates. Targeting performance is excellent by various measures; indeed, Di Bao appears to be better targeted than any other program in the developing world. However, all but one measure of targeting performance is found to be uninformative, or even deceptive, about impacts on poverty. The authors find that the majority of the poor are not receiving help, even with a generous allowance for measurement error. While on paper Di Bao would eliminate urban poverty, it falls well short of that ideal in practice.
    Keywords: Services & Transfers to Poor; Poverty Monitoring & Analysis; Poverty Impact Evaluation; Inequality; Poverty Diagnostics
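
    The poverty-trap concern follows directly from the on-paper design: a pure guaranteed minimum income tops incomes up to the line, which implies a 100% benefit withdrawal rate on earnings below it. A minimal sketch of that rule, with a hypothetical poverty line and hypothetical incomes (not values from the study):

        # On-paper guaranteed-minimum-income rule: top income up to the line.
        # The poverty line and incomes below are hypothetical illustrations.
        def dibao_benefit(income: float, poverty_line: float) -> float:
            return max(0.0, poverty_line - income)

        poverty_line = 300.0  # hypothetical yuan per month
        for income in (0.0, 150.0, 299.0, 400.0):
            benefit = dibao_benefit(income, poverty_line)
            # Below the line, each extra yuan earned withdraws one yuan of
            # benefit (a 100% withdrawal rate), the classic work disincentive
            # that the authors test for, and find little evidence of, in the data.
            print(f"income={income:6.1f} benefit={benefit:6.1f} total={income + benefit:6.1f}")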

    Crawling Deep Web using a GA-based set covering algorithm

    An ever-increasing amount of information on the web today is available only through search interfaces: users have to type a set of keywords into a search form in order to access the pages of certain web sites. These pages are often referred to as the Hidden Web or the Deep Web. According to recent studies, the content provided by hidden web sites is often of very high quality and can be extremely valuable to many users. This calls for deep web crawlers to excavate the data so that it can be reused, indexed, and searched in an integrated environment. Crawling the deep web is the process of collecting data from search interfaces by issuing queries. It often requires selecting an appropriate set of queries so that they cover most of the documents in the data source at low cost. This can be modeled as a set covering problem, which has been extensively studied in graph theory. The conventional set covering algorithms, however, do not work well when applied to deep web crawling because of special features of this application domain. In particular, most set covering algorithms do not take into account the distribution of the elements being covered, whereas in deep web crawling both the document sizes and the document frequencies of the queries follow a power-law distribution. This thesis introduces a new GA-based algorithm that targets deep web crawling of databases with this power-law distribution. Experiments show that it outperforms the straightforward greedy algorithm previously introduced in the literature.
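
    To make the set covering formulation concrete, here is a minimal sketch of the straightforward greedy baseline referred to above: at each step, pick the query that retrieves the most not-yet-covered documents. All identifiers and the toy data are hypothetical, not taken from the thesis.

        # Greedy set-cover baseline for query selection. Each candidate query is
        # mapped to the set of document IDs it would retrieve; the goal is to
        # cover the corpus while issuing as few queries as possible.
        def greedy_query_cover(query_results, corpus):
            chosen, covered = [], set()
            while covered != corpus:
                best = max(query_results, key=lambda q: len(query_results[q] - covered))
                gain = query_results[best] - covered
                if not gain:  # remaining documents are unreachable via these queries
                    break
                chosen.append(best)
                covered |= gain
            return chosen

        corpus = {1, 2, 3, 4, 5}
        query_results = {          # document frequencies skewed like a power law:
            "data":  {1, 2, 3},    # a few broad queries ...
            "web":   {1, 2},
            "ga":    {4},          # ... and many narrow ones
            "cover": {5},
        }
        print(greedy_query_cover(query_results, corpus))  # ['data', 'ga', 'cover']

    A GA, by contrast, evolves whole candidate query sets and scores them by coverage per cost, which, per the abstract, copes better with power-law document frequencies than this myopic step-by-step choice.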

    A Simplified Min-Sum Decoding Algorithm for Non-Binary LDPC Codes

    Non-binary low-density parity-check (LDPC) codes are robust to various channel impairments. However, with the existing decoding algorithms, decoder implementations are expensive because of their excessive computational complexity and memory usage. Drawing on combinatorial optimization, we present an approximation method for the check node processing. Simulation results demonstrate that our scheme suffers only a small performance loss on the additive white Gaussian noise channel and on an independent Rayleigh fading channel. Furthermore, the proposed reduced-complexity realization provides significant savings in hardware, so it yields a good performance-complexity tradeoff and can be implemented efficiently.
    Comment: Partially presented at ICNC 2012 (International Conference on Computing, Networking and Communications); accepted by IEEE Transactions on Communications.
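
    The check node step is the decoder's bottleneck because, in the exact sum-product rule, each outgoing message must combine all other incoming messages. The min-sum family approximates this with sign products and magnitude minima. As background only (the paper's algorithm operates on GF(q) message vectors, which this sketch does not show), here is the standard binary min-sum check-node update:

        # Binary min-sum check-node update: the message sent back on edge i has
        # the sign product and the minimum magnitude of all *other* incoming
        # log-likelihood ratios. Background illustration only; the paper's
        # simplified algorithm generalizes this idea to non-binary codes.
        def min_sum_check_node(llrs):
            out = []
            for i in range(len(llrs)):
                others = llrs[:i] + llrs[i + 1:]
                sign = 1.0
                for v in others:
                    if v < 0:
                        sign = -sign
                out.append(sign * min(abs(v) for v in others))
            return out

        print(min_sum_check_node([2.5, -0.8, 1.3]))  # [-0.8, 1.3, -0.8]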

    IS 218-001: Building Web Applications


    IS 218-002: Building Web Applications
