
    Why should governments intervene in education, and how effective is education policy

    This paper reviews arguments for government interference in the education sector and discusses the effectiveness of commonly used policy instruments. There are both efficiency and equity reasons for government intervention. Particular attention is paid to education spillovers (an efficiency motive). The empirical literature shows that there is little reason to argue for additional policy efforts to correct for externalities. There is some promising evidence, however, for non-pecuniary spillovers in the form of crime reduction and health improvements. With regard to the effectiveness of policy instruments, the paper discusses studies with a (quasi-)experimental design so that the causal impact of the policy can be identified. Early childhood interventions appear to be more effective than interventions in later stages of the education cycle.

    System-on-chip Computing and Interconnection Architectures for Telecommunications and Signal Processing

    This dissertation proposes novel architectures and design techniques targeting SoC building blocks for telecommunications and signal processing applications. Hardware implementation of Low-Density Parity-Check (LDPC) decoders is approached at both the algorithmic and the architectural level. LDPC codes are a promising coding scheme for future communication standards due to their outstanding error correction performance. This work proposes a methodology for analyzing the effects of finite-precision arithmetic on error correction performance and hardware complexity, and the methodology is employed throughout for co-designing the decoder. First, a low-complexity check node based on the P-output decoding principle is designed and characterized on a CMOS standard-cell library. Results demonstrate an implementation loss below 0.2 dB down to a BER of 10^{-8} and complexity savings of up to 59% with respect to other works in the recent literature. High-throughput and low-latency issues are addressed with modified single-phase decoding schedules. A new "memory-aware" schedule is proposed that requires as little as 20% of the memory of traditional two-phase flooding decoding. Additionally, throughput is doubled and logic complexity is reduced by 12%. These advantages are traded off against error correction performance, making the solution attractive only for long codes, such as those adopted in the DVB-S2 standard. The "layered decoding" principle is extended to codes not specifically conceived for this technique. The proposed architectures exhibit complexity savings on the order of 40% in both area and power consumption, while the implementation loss is smaller than 0.05 dB.

    Most modern communication standards employ Orthogonal Frequency Division Multiplexing (OFDM) as part of their physical layer. The core of OFDM is the Fast Fourier Transform (FFT) and its inverse, in charge of symbol (de)modulation. Requirements on throughput and energy efficiency call for FFT hardware implementation, while the ubiquity of the FFT suggests the design of parametric, re-configurable and re-usable IP hardware macrocells. In this context, this thesis describes an FFT/IFFT core compiler particularly suited for the implementation of OFDM communication systems. The tool employs an accuracy-driven configuration engine which automatically profiles the internal arithmetic and generates a core with minimum operand bit-width and thus minimum circuit complexity. The engine performs a closed-loop optimization over three different internal arithmetic models (fixed-point, block floating-point and convergent block floating-point) using the numerical accuracy budget given by the user as a reference point. The flexibility and re-usability of the proposed macrocell are illustrated through several case studies which encompass all current state-of-the-art OFDM communication standards (WLAN, WMAN, xDSL, DVB-T/H, DAB and UWB). Implementation results are presented for two deep sub-micron standard-cell libraries (65 and 90 nm) and commercially available FPGA devices. Compared with other FFT core compilers, the proposed environment produces macrocells with lower circuit complexity and the same system-level performance (throughput, transform size and numerical accuracy).

    The final part of this dissertation focuses on the Network-on-Chip (NoC) design paradigm, whose goal is building scalable communication infrastructures connecting hundreds of cores. A low-complexity link architecture for mesochronous on-chip communication is discussed. The link enables relaxed skew constraints in clock tree synthesis, frequency speed-up, power consumption reduction and faster back-end turnarounds. The proposed architecture reaches a maximum clock frequency of 1 GHz on a 65 nm low-leakage CMOS standard-cell library. In a complex test case with a full-blown NoC infrastructure, the link overhead is only 3% of chip area and 0.5% of leakage power consumption. Finally, a new methodology, named metacoding, is proposed. Metacoding generates correct-by-construction, technology-independent RTL codebases for NoC building blocks. The RTL coding phase is abstracted and modeled with an Object-Oriented framework, integrated within a commercial tool for IP packaging (Synopsys CoreTools suite). Compared with traditional coding styles based on pre-processor directives, metacoding produces 65% smaller codebases and reduces the number of configurations to verify by up to three orders of magnitude.
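    The accuracy-driven bit-width search described in the abstract can be sketched in a few lines. This is a minimal illustration, not the dissertation's compiler: the fixed-point format (2 integer bits) and the use of SQNR as the accuracy metric are assumptions made here for the sake of the example.

    ```python
    import numpy as np

    def quantize(x, bits, frac_bits):
        """Round x to a saturating fixed-point format with the given word length."""
        scale = 2.0 ** frac_bits
        lo = -(2 ** (bits - 1)) / scale
        hi = (2 ** (bits - 1) - 1) / scale
        return np.clip(np.round(x * scale) / scale, lo, hi)

    def min_bitwidth(signal, accuracy_db, max_bits=32):
        """Smallest word length whose quantization SQNR meets the accuracy budget."""
        power = np.mean(signal ** 2)
        for bits in range(2, max_bits + 1):
            q = quantize(signal, bits, bits - 2)  # assume 2 integer bits
            noise = np.mean((signal - q) ** 2)
            if noise == 0 or 10 * np.log10(power / noise) >= accuracy_db:
                return bits
        return max_bits

    rng = np.random.default_rng(0)
    sig = rng.uniform(-1.0, 1.0, 10_000)
    print(min_bitwidth(sig, 40.0))  # smallest width meeting a 40 dB budget
    ```

    A real core compiler would run such a loop per internal operand and per arithmetic model (fixed-point, block floating-point, ...), but the closed-loop idea is the same: grow the word length only until the user's accuracy budget is met.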

    Models of atypical development must also be models of normal development

    Functional magnetic resonance imaging studies of developmental disorders and normal cognition that include children are becoming increasingly common and represent part of the newly expanding field of developmental cognitive neuroscience. These studies have illustrated the importance of the process of development in understanding the brain mechanisms underlying cognition, and of including children in the study of the etiology of developmental disorders.

    Are developmental disorders like cases of adult brain damage? Implications from connectionist modelling

    It is often assumed that similar domain-specific behavioural impairments found in cases of adult brain damage and developmental disorders correspond to similar underlying causes, and can serve as convergent evidence for the modular structure of the normal adult cognitive system. We argue that this correspondence is contingent on an unsupported assumption that atypical development can produce selective deficits while the rest of the system develops normally (Residual Normality), and that this assumption tends to bias data collection in the field. Based on a review of connectionist models of acquired and developmental disorders in the domains of reading and past tense, as well as on new simulations, we explore the computational viability of Residual Normality and the potential role of development in producing behavioural deficits. Simulations demonstrate that damage to a developmental model can produce very different effects depending on whether it occurs prior to or following the training process. Because developmental disorders typically involve damage prior to learning, we conclude that the developmental process is a key component of the explanation of end-state impairments in such disorders. Further simulations demonstrate that in simple connectionist learning systems, the assumption of Residual Normality is undermined by processes of compensation or alteration elsewhere in the system. We outline the precise computational conditions required for Residual Normality to hold in development, and suggest that in many cases it is an unlikely hypothesis. We conclude that in developmental disorders, inferences from behavioural deficits to underlying structure crucially depend on developmental conditions, and that the process of ontogenetic development cannot be ignored in constructing models of developmental disorders.
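    The contrast between damage before and after training can be illustrated with a toy simulation. This is a minimal sketch in the spirit of the connectionist models the abstract reviews, not the authors' actual simulations: the linear network, the redundant input coding and the lesion pattern are all assumptions chosen to keep the example small.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Redundant inputs: 10 observed features driven by 4 latent sources,
    # so several connections carry overlapping information.
    Z = rng.normal(size=(200, 4))
    A = rng.normal(size=(4, 10))
    X = Z @ A
    y = Z @ rng.normal(size=4)           # target depends only on the latent sources

    def train(mask, steps=5000, lr=0.02):
        """Gradient descent on squared error; lesioned weights stay frozen at zero."""
        w = np.zeros(10)
        for _ in range(steps):
            grad = X.T @ (X @ w - y) / len(y)
            w -= lr * grad * mask
        return w

    def mse(w):
        return float(np.mean((X @ w - y) ** 2))

    mask = np.ones(10)
    mask[:3] = 0.0                        # lesion 3 of the 10 connections

    w_dev = train(mask)                   # damage BEFORE learning: training compensates
    w_adult = train(np.ones(10)) * mask   # damage AFTER learning: no re-adaptation
    print(mse(w_dev), mse(w_adult))
    ```

    Because the remaining connections carry redundant information, training routed around the early lesion, whereas the same lesion applied to the trained "adult" network leaves a large residual error: the pre-training lesion does not produce the selective deficit that Residual Normality would predict.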

    Group project work from the outset: an in-depth teaching experience report

    This article is an extended version of a paper submitted to the 24th IEEE Conference on Software Engineering Education and Training, Honolulu, May 2011. CONTEXT - we redesigned our undergraduate computing programmes to address problems of motivation and outdated content. METHOD - the primary vehicle for the new curriculum was the group project, which formed a central spine for the entire degree right from the first year. RESULTS - so far this programme has been run successfully once. Failures, drop-outs and students required to retake modules have been halved (from an average of 21.6% over the previous 4 years to 9.5%), and students obtaining the top two grades have increased from 25.2% to 38.9%. CONCLUSIONS - whilst we cannot be certain that all of the improvement is due to the group projects, informally the change has been well received. However, we are looking for areas to improve, including the possibility of more structured support for students' metacognitive awareness.

    Infrastructure and growth in developing countries : recent advances and research challenges

    This paper presents a survey of recent research on the economics of infrastructure in developing countries. Energy, transport, telecommunications, and water and sanitation are considered. The survey covers two main sets of issues: the linkages between infrastructure and economic growth (at the economy-wide, regional and sectoral levels), and the composition, sequencing and efficiency of alternative infrastructure investments, including the arbitrage between new investments and maintenance expenditures (OPEX versus CAPEX) and between public and private investment. Following the introduction, section 2 discusses the theoretical foundations (growth theory and new economic geography). Section 3 assesses the analysis of 140 specifications from 64 recent empirical papers (examining the type of data used, level of aggregation, econometric techniques and nature of the sample) and discusses both the macro-econometric and micro-econometric contributions of these papers. Finally, section 4 discusses directions for future research and suggests priorities in data development.
