
    Optimal design of water distribution systems based on entropy and topology

    A new multi-objective evolutionary optimization approach for joint topology and pipe size design of water distribution systems is presented. The proposed algorithm simultaneously considers the adequacy of flow and pressure at the demand nodes; the initial construction cost; the network topology; and a measure of hydraulic capacity reliability. The optimization procedure is based on a general measure of hydraulic performance that combines statistical entropy, network connectivity and hydraulic feasibility. The topological properties of the solutions are accounted for, and no arbitrary assumptions regarding the quality of infeasible solutions are applied. In other words, both feasible and infeasible solutions participate in the evolutionary processes; solutions survive and reproduce or perish strictly according to their Pareto-optimality. Removing artificial barriers in this way frees the algorithm to evolve optimal solutions quickly. Furthermore, any redundant binary codes that result from crossover or mutation are eliminated gradually in a seamless and generic way that avoids the arbitrary loss of potentially useful genetic material and preserves the quality of the information transmitted from one generation to the next. The approach proposed is entirely generic: no additional parameters requiring case-by-case calibration are introduced. Detailed and extensive results for two test problems are included that suggest the approach is highly effective. In general, the frontier-optimal solutions achieved include topologies that are fully branched, partially and fully looped and, for networks with multiple sources, completely separate sub-networks.
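    As an illustration of two ingredients named above (all names and values here are hypothetical, not the paper's actual formulation): a Shannon-entropy measure over pipe flows, and the Pareto-dominance test by which solutions "survive and reproduce or perish" in a multi-objective evolutionary algorithm.

```python
import math

def flow_entropy(flows):
    """Shannon entropy of a set of pipe flow rates (illustrative sketch only)."""
    total = sum(flows)
    return -sum((q / total) * math.log(q / total) for q in flows if q > 0)

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Evenly distributed flow maximizes entropy for a fixed number of pipes:
uniform = flow_entropy([10, 10, 10, 10])  # ln(4), about 1.386
skewed = flow_entropy([37, 1, 1, 1])      # lower: flow concentrated in one pipe
```

    Higher entropy is commonly used as a surrogate for reliability because it rewards evenly spread flows, which leave spare capacity on alternative paths.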

    Rate Splitting for MIMO Wireless Networks: A Promising PHY-Layer Strategy for LTE Evolution

    MIMO processing has played a central part in the recent increase in the spectral and energy efficiencies of wireless networks. MIMO has grown beyond the original point-to-point channel and nowadays refers to a diverse range of centralized and distributed deployments. The fundamental bottleneck to realizing large spectral and energy efficiency gains in multiuser MIMO networks is the demand for accurate channel state information at the transmitter (CSIT). This demand has become increasingly difficult to satisfy as next-generation wireless networks rely on dense heterogeneous deployments and transmitters equipped with large numbers of antennas. CSIT inaccuracy results in a multi-user interference problem that is the primary bottleneck of MIMO wireless networks. Historically, the approach has been to apply techniques designed for perfect CSIT to scenarios with imperfect CSIT. In this paper, we depart from this conventional approach and introduce the reader to a promising strategy based on rate splitting. Rate splitting relies on the transmission of common and private messages and is shown to provide significant benefits in terms of spectral and energy efficiencies, reliability and CSI feedback overhead reduction over conventional strategies used in LTE-A that rely exclusively on private message transmissions. Open problems, impact on standard specifications and operational challenges are also discussed. Comment: accepted to IEEE Communications Magazine, special issue on LTE Evolution.
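    The common/private split can be sketched numerically (a toy model with made-up SNR values, not the paper's system model): the common stream must be decodable by every user, so its rate is set by the weakest user, while each private stream is decoded only by its intended user.

```python
import math

def rate_splitting_sum_rate(common_snrs, private_sinrs):
    """Toy two-layer sum rate: one common stream plus per-user private streams.

    The common stream's rate is limited by the user with the worst SNR for it;
    each private stream contributes log2(1 + SINR) for its own user.
    """
    r_common = math.log2(1 + min(common_snrs))
    r_private = sum(math.log2(1 + s) for s in private_sinrs)
    return r_common + r_private

# Two users: common-stream SNRs of 3 and 7, private-stream SINRs of 1 each.
rate = rate_splitting_sum_rate([3, 7], [1, 1])  # log2(4) + 2*log2(2) = 4.0
```

    A private-only scheme would have to treat residual multi-user interference from imperfect CSIT as noise; the common layer lets part of that interference be decoded and removed instead.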

    Testability enhancement of a basic set of CMOS cells

    Testing should be evaluated as the ability of the test patterns to cover realistic faults, and high-quality IC products demand high-quality testing. We use a test strategy based on physical design for testability, intended to uncover both open and short faults, which are otherwise difficult or even impossible to detect. Consequently, layout-level design-for-testability (LLDFT) rules have been developed that prevent these faults, or at least reduce the chance of their appearing. The main purpose of this work is to apply a practical set of LLDFT rules to the library cells designed by the Centre Nacional de Microelectrònica (CNM) and obtain a highly testable cell library. The main results of applying the LLDFT rules (area overheads and performance degradation) are summarized. The results are significant because IC design is highly repetitive: a small effort to improve cell layout can bring about great improvement in the overall design.

    Limits on Fundamental Limits to Computation

    An indispensable part of our lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the last fifty years. Such Moore scaling now requires increasingly heroic efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and enrich our understanding of integrated-circuit scaling, we review fundamental limits to computation: in manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, we recall how some limits were circumvented and compare loose and tight limits. We also point out that engineering difficulties encountered by emerging technologies may indicate yet-unknown limits. Comment: 15 pages, 4 figures, 1 table.
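    The "periodic doubling" above compounds dramatically; a minimal sketch (assuming an idealized two-year doubling period, a figure not stated in the abstract):

```python
def moore_density_factor(years, doubling_period=2.0):
    """Idealized Moore-scaling density multiplier after `years` years."""
    return 2 ** (years / doubling_period)

# Fifty years of two-year doublings: 2**25 = 33,554,432x density growth
factor = moore_density_factor(50)
```

    It is this exponential compounding that makes any hard physical ceiling, however distant, a matter of decades rather than centuries.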

    A path model “why-what-how-when” to Implement an IC reporting

    The purpose of this paper is to present the results of an empirical study and the critical success factors for implementing Intellectual Capital (IC) reporting. Selecting an IC model to be implemented in a specific context at a particular time depends on several contingent factors. In light of this, we propose the following "why-what-how-when" agenda, which is applied in the case study: 1. Why implement IC reporting in a specific context? 2. What IC approach/tool is suitable to satisfy users' informational needs? 3. What is the quality of the information? 4. When is the information available? The research is qualitative and focused on a case study in order to understand the dynamics of a given process. The company analyzed designs and develops Large Systems for Homeland Protection. The case study shows that there is no "one best way" to report on intangibles. The main critical factors of the process investigated are the accurate identification of the actors involved in the decision-making process and the quality and availability of information. The case study allows us to analyze how changes in decision maker(s), users' informational needs and information quality can affect the selection of the framework and the associated artifact/tool used to report on intangibles.

    Evaluation of crime prevention initiatives

    This third toolbox in the series published by the EUCPN Secretariat focuses on the main theme of the Irish Presidency: the evaluation of crime prevention initiatives. The theme is explored and elaborated in various ways through: a literature review; two workshops with international experts and practitioners, during which the strengths and weaknesses of programme evaluation were discussed in detail; a screening of existing guidelines and manuals on evaluation; and finally, a call launched by the EUCPN Secretariat to the Member States to collect practices on the evaluation of crime prevention initiatives.
    • 

    corecore