
    Grappling with Issues of Learning Science from Everyday Experiences: An Illustrative Case Study

    Researchers hold differing views on the infusion of everyday experience into the teaching of science: (1) it hinders the learning of science concepts; or (2) it increases the participation and motivation of students in science learning. This article examines these differing perspectives on everyday knowledge in science classrooms through the use of everyday contexts to teach grade 3 science in Singapore. In this study, two groups of grade 3 students were presented with a scenario that required them to apply the concept of properties of materials to design a shoe. The transcripts of the ensuing classroom discussions and interactions were then analyzed using the framework of sociocultural learning and an interpretative analytic lens. Our analysis suggests that providing an authentic everyday context is insufficient to move young learners of science from their everyday knowledge to scientific knowledge. Further, group interactions among young learners working to solve an everyday problem need to be scaffolded to ensure meaningful, focused, and sustained learning. Implications for research on science learning among younger students are discussed.

    Scaling Exponent and Moderate Deviations Asymptotics of Polar Codes for the AWGN Channel

    This paper investigates polar codes for the additive white Gaussian noise (AWGN) channel. The scaling exponent $\mu$ of polar codes for a memoryless channel $q_{Y|X}$ with capacity $I(q_{Y|X})$ characterizes the closest gap between the capacity and non-asymptotic achievable rates in the following way: for a fixed $\varepsilon \in (0, 1)$, the gap between the capacity $I(q_{Y|X})$ and the maximum non-asymptotic rate $R_n^*$ achieved by a length-$n$ polar code with average error probability $\varepsilon$ scales as $n^{-1/\mu}$, i.e., $I(q_{Y|X}) - R_n^* = \Theta(n^{-1/\mu})$. It is well known that the scaling exponent $\mu$ for any binary-input memoryless channel (BMC) with $I(q_{Y|X}) \in (0, 1)$ is bounded above by $4.714$, a bound shown via an explicit construction of polar codes. Our main result shows that $4.714$ remains a valid upper bound on the scaling exponent for the AWGN channel. Our proof technique involves two ideas: (i) the capacity of the AWGN channel can be achieved within a gap of $O(n^{-1/\mu}\sqrt{\log n})$ by using an input alphabet consisting of $n$ constellations and restricting the input distribution to be uniform; (ii) the capacity of a multiple access channel (MAC) with an input alphabet consisting of $n$ constellations can be achieved within a gap of $O(n^{-1/\mu}\log n)$ by using a superposition of $\log n$ binary-input polar codes. In addition, we investigate the performance of polar codes in the moderate deviations regime, where both the gap to capacity and the error probability vanish as $n$ grows. An explicit construction of polar codes is proposed that obeys a certain tradeoff between the gap to capacity and the decay rate of the error probability for the AWGN channel. Comment: 24 pages
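
    For intuition, the scaling law above yields back-of-the-envelope numbers. The sketch below is a minimal illustration, not the paper's construction; the hidden constants in $\Theta(n^{-1/\mu})$ and $O(n^{-1/\mu}\sqrt{\log n})$ are unknown here and set to 1 purely for illustration. It evaluates both gap orders with $\mu = 4.714$ at a few blocklengths:

        import math

        MU = 4.714  # known upper bound on the scaling exponent of polar codes

        def gap_order(n: int) -> float:
            # Order of the gap to capacity for a length-n polar code: n^(-1/mu).
            # Constant factor assumed to be 1 for illustration only.
            return n ** (-1.0 / MU)

        def awgn_gap_order(n: int) -> float:
            # Order of the AWGN gap from idea (i): n^(-1/mu) * sqrt(log n).
            return gap_order(n) * math.sqrt(math.log(n))

        for n in (2**10, 2**15, 2**20):
            print(f"n = {n:>7}: n^(-1/mu) ~ {gap_order(n):.4f}, "
                  f"AWGN order ~ {awgn_gap_order(n):.4f}")

    Doubling the blocklength shrinks the gap order only by a factor of $2^{-1/4.714} \approx 0.86$, which is why the size of the scaling exponent matters at practical blocklengths.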

    Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach

    This paper establishes that the strong converse holds for some classes of discrete memoryless multimessage multicast networks (DM-MMNs) whose corresponding cut-set bounds are tight, i.e., coincide with the set of achievable rate tuples. The strong converse for these classes of DM-MMNs implies that all sequences of codes with rate tuples belonging to the exterior of the cut-set bound have average error probabilities that necessarily tend to one (and are not simply bounded away from zero). Examples in these classes of DM-MMNs include wireless erasure networks, DM-MMNs consisting of independent discrete memoryless channels (DMCs), and single-destination DM-MMNs consisting of independent DMCs with destination feedback. Our elementary proof technique leverages properties of the Rényi divergence. Comment: Submitted to IEEE Transactions on Information Theory, Jul 18, 2014. Revised on Jul 31, 201
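
    For reference, the key quantity in the proof is the Rényi divergence. The abstract does not fix the order $\alpha$ used in the argument; the standard definition for distributions on a finite alphabet (a fact about the quantity itself, not a claim about the paper's specific choice) is

        \[
          D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,
          \log \sum_{x \in \mathcal{X}} P(x)^{\alpha}\, Q(x)^{1 - \alpha},
          \qquad \alpha \in (0, 1) \cup (1, \infty),
        \]
        \[
          \lim_{\alpha \to 1} D_\alpha(P \,\|\, Q) \;=\; D(P \,\|\, Q)
          \quad \text{(the Kullback--Leibler divergence)}.
        \]

    Because $D_\alpha$ is nondecreasing in $\alpha$, strong converse arguments of this kind typically work with orders $\alpha > 1$, which give bounds strong enough to force the error probability of any code above the cut-set bound to one.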