
    Algebra knowledge in early elementary school supporting later mathematics ability.

    The current study explored the impact that algebra knowledge in 1st and 2nd grade can have on growth and achievement in mathematics through 5th grade. Previous studies have shown the positive impact of teaching algebra in middle school (e.g., Smith, 1996). Other studies have shown that students can learn and apply fundamental algebra concepts even earlier, including in the early elementary grades (e.g., Schifter et al., 2008; Brizuela and Earnest, 2008). The current study aimed to expand upon this research by showing that students' knowledge of early algebra concepts can predict positive longitudinal outcomes, which would support cognitive and educational theories holding that students can use algebraic concepts to structure their overall mathematics knowledge. The study utilized an archival dataset with five years of student data from one district. District assessments measured student knowledge of algebra in 1st and 2nd grade, and students' standardized mathematics test scores and district mathematics assessments were collected for 3rd, 4th, and 5th grade. Algebra knowledge in 1st and 2nd grade predicted students' mathematics ability on the state standardized assessment in 5th grade, as well as growth in scores from 3rd through 5th grade. Algebra was a significant predictor in a model that included students' abilities in other areas of mathematics, reading ability, and race; the model also included school-level socioeconomic data. Parallel models were fit using the district assessments in 3rd through 5th grade as the outcome measure: algebra knowledge in 1st and 2nd grade was a significant predictor of 5th grade mathematics knowledge on these assessments, but it did not predict growth from 3rd through 5th grade. Overall, this study underlines the importance of including algebra in early elementary teaching, standards, and assessment. Early algebra may help students structure their mathematics knowledge from the beginning of their education, which can lead to improved longitudinal mathematics knowledge and performance.
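    The abstract does not give the exact model specification, so the following is only an illustrative regression form of the covariate-adjusted prediction it describes; every symbol below is shorthand of mine, not the study's actual variable coding.

\[
\mathrm{Math}^{(5)}_i \;=\; \beta_0 + \beta_1\,\mathrm{Algebra}^{(1\text{--}2)}_i + \beta_2\,\mathrm{OtherMath}_i + \beta_3\,\mathrm{Reading}_i + \beta_4\,\mathrm{Race}_i + \beta_5\,\mathrm{SES}_{\mathrm{school}(i)} + \varepsilon_i
\]

    The central claim corresponds to \(\beta_1\) remaining significantly positive once the other covariates are included; the growth analyses use the change in scores from 3rd through 5th grade as the outcome instead.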

    Paper Session I-A - The Navy Nuclear Program as an Analogue for Long Duration, Nuclear Powered, Manned Space Missions

    During the past five decades, the US Navy has successfully operated a number of nuclear thermal propulsion systems with characteristics similar to those required for long duration, nuclear powered, space missions. If nuclear reactors are to be utilized for space propulsion, they will embody many characteristics, such as size, mobility, environmental security, crew safety, and long-duration independent-operation capabilities, which have already been demonstrated by their Navy counterparts. The authors present a brief overview of Project ROVER, NASA's most extensive nuclear propulsion program to date, which resulted in a total firing time of 1,020 minutes at power levels above 1.0 megawatt. This is contrasted with Navy operational nuclear reactor experience over significantly longer periods of time at high average power levels. Technical issues central to the operation of Navy nuclear reactors which are directly applicable to nuclear powered, manned space missions are explored. The Navy's nearly perfect safety record, enviable environmental record, and significant design and operational experience achieved during approximately 3,800 reactor-years of operation make its experience and corporate opinion both authoritative and convincing in nuclear matters, while providing a database of extreme value that should not be ignored in the development of future space nuclear systems.

    On Chiral Symmetry Restoration at Finite Density in Large N_c QCD

    At large N_c, cold nuclear matter is expected to form a crystal and thus spontaneously break translational symmetry. The description of chiral symmetry breaking and translational symmetry breaking can become intertwined. Here, the focus is on aspects of chiral symmetry breaking and its possible restoration that are, by construction, independent of the nature of translational symmetry breaking, namely spatial averages of chiral order parameters. A system will be considered chirally restored provided all spatially averaged chiral order parameters are zero. A critical question is whether chiral restoration in this sense is possible for phases in which chiral order parameters are locally non-zero but whose spatial averages all vanish. We show that this is not possible unless all chirally invariant observables are spatially uniform. This result is first derived for Skyrme-type models, which are based on a nonlinear sigma model and by construction break chiral symmetry on a point-by-point basis. A no-go theorem for chiral restoration (in the average sense) for all models of this type is obtained by showing that in these models there exist chirally symmetric order parameters which cannot be spatially uniform. Next we show that the no-go theorem applies to large N_c QCD in any phase which has a non-zero but spatially varying chiral condensate. The theorem is demonstrated by showing that in a putative chirally restored phase, the field configuration can be reduced to that of a nonlinear sigma model.
    Comment: 12 pages, 1 table
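    As a hedged illustration of the averaging criterion (notation mine, not the paper's), take the chiral condensate as a representative order parameter; a phase is chirally restored in the average sense only if every such spatial average vanishes,

\[
\overline{\langle \bar{q}q \rangle} \;\equiv\; \lim_{V\to\infty}\frac{1}{V}\int_V d^3x\,\langle \bar{q}(x)\,q(x)\rangle \;=\; 0 ,
\]

    even though \(\langle \bar{q}(x)q(x)\rangle\) may be non-zero point by point, as in a crystalline phase; the no-go theorem states that this cannot hold for all chiral order parameters at once unless all chirally invariant observables are spatially uniform.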

    The Effect of Interpolation of Performance Test Items on Stanford-Binet Scores

    The clinical psychologist frequently wishes to obtain both performance and verbal I.Q.'s on a child at a single session. To maintain interest and rapport, examiners have sometimes adopted the practice of interpolating the performance sub-tests between the verbal test levels. Since this constitutes a deviation from the conditions under which the Binet test has been standardized, the question has arisen as to whether this practice affects the scores obtained on the two tests. The present experiment was designed to provide an answer to this question.

    Perspectives for Positron Emission Tomography with RPCs

    In this study we address the feasibility and main properties of a positron emission tomograph (PET) based on RPCs. The concept, making use of the converter-plate principle, takes advantage of the intrinsic layered structure of RPCs and their simple and economical construction. The extremely good time and position resolutions of RPCs also allow the TOF-PET imaging technique to be considered. Monte-Carlo simulations, supported by experimental data, are presented and the main advantages and drawbacks for applications of potential interest are discussed.
    Comment: Presented at "RPC2001-VI Workshop on Resistive Plate Chambers and Related Detectors", Coimbra, Portugal, 26-27 November 2001 (5 pages)
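    As a brief aside on why RPC timing resolution matters for TOF-PET (a standard relation, not a result of this paper): a coincidence timing resolution \(\Delta t\) localizes the annihilation point along the line of response to within

\[
\Delta x \;=\; \frac{c\,\Delta t}{2},
\]

    so that, for example, \(\Delta t \approx 300\ \mathrm{ps}\) gives \(\Delta x \approx 4.5\ \mathrm{cm}\), constraining each event's position and improving the effective signal-to-noise of the reconstruction.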

    Parallelization of Kinetic Theory Simulations

    Numerical studies of shock waves in large scale systems via kinetic simulations with millions of particles are too computationally demanding to be processed in serial. In this work we focus on optimizing the parallel performance of a kinetic Monte Carlo code for astrophysical simulations such as core-collapse supernovae. Our goal is to attain a flexible program that scales well with the architecture of modern supercomputers. This approach requires a hybrid model of programming that combines a message passing interface (MPI) with a multithreading model (OpenMP) in C++. We report on our approach to implementing the hybrid design in the kinetic code and show first results which demonstrate a significant gain in performance when many processors are used.
    Comment: 10 pages, 3 figures, conference proceeding
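    The paper's implementation is C++ with MPI and OpenMP; the fragment below is only an illustrative Python analogue of the same decomposition pattern (mpi4py for rank-level particle distribution, a thread pool standing in for the OpenMP worker threads). The particle counts, the toy collision kernel, and all names are hypothetical.

```python
# Illustrative analogue (Python/mpi4py), not the paper's C++/MPI/OpenMP code.
# Particles are split across MPI ranks; each rank farms its local particles
# out to a thread pool, the role OpenMP threads play in the original design.
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from mpi4py import MPI


def collide(chunk):
    """Toy stand-in for the per-particle Monte Carlo collision kernel."""
    return chunk + np.random.normal(scale=0.01, size=chunk.shape)


def step(local_particles, n_threads=4):
    """Advance the rank-local particles one step using a thread pool."""
    chunks = np.array_split(local_particles, n_threads)
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        return np.concatenate(list(pool.map(collide, chunks)))


if __name__ == "__main__":
    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_total = 1_000_000                          # particles across all ranks
    local = np.random.rand(n_total // size, 6)   # positions + momenta per rank

    local = step(local)                          # node-level (threaded) work
    # Periodic global reduction, e.g. to monitor a conserved quantity.
    energy_proxy = comm.allreduce(float(np.sum(local[:, 3:] ** 2)), op=MPI.SUM)
    if rank == 0:
        print(f"step done on {size} ranks, energy proxy = {energy_proxy:.3e}")
```

    Launched with something like `mpiexec -n 4 python hybrid_sketch.py` (assuming mpi4py and an MPI installation), each rank owns a disjoint share of the particles and only synchronizes at the reduction, which is the property that lets the design scale across nodes.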

    By Shepherd, et al., posted on November 29th, 2013 in Articles, Climate

    Earth is increasingly an “urbanized” planet. The “World Population Clock” registered a population of 7,175,309,538 at 8:30 pm (LST) on Oct. 6, 2013. Current and future trends suggest that this population will increasingly reside in cities. Currently, 52 percent of the world population is urban, which means we are a majority “urbanized” society. Figure 1 indicates this trend will continue.

    Efficient execution in an automated reasoning environment

    We describe a method that permits the user of a mechanized mathematical logic to write elegant logical definitions while allowing sound and efficient execution. In particular, the features supporting this method allow the user to install, in a logically sound way, alternative executable counterparts for logically defined functions. These alternatives are often much more efficient than the logically equivalent terms they replace. These features have been implemented in the ACL2 theorem prover, and we discuss several applications of the features in ACL2.
    Ministerio de Educación y Ciencia TIN2004-0388
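    The mechanism itself is ACL2-specific, and in ACL2 the replacement is only sound once the two definitions are proved equal. The sketch below is merely a loose, language-neutral analogue of the idea in Python (not ACL2 syntax), with a sampled equivalence check standing in for that proof obligation; all names are hypothetical.

```python
# A clear "logical" definition alongside an efficient executable counterpart.
# In ACL2 one proves the two equal before the fast version is used; here a
# spot check stands in for that proof obligation.

def fib_logic(n: int) -> int:
    """Clear 'logical' definition: directly mirrors the recurrence."""
    if n < 2:
        return n
    return fib_logic(n - 1) + fib_logic(n - 2)


def fib_exec(n: int) -> int:
    """Efficient executable counterpart: linear-time iteration."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a


# Stand-in for the equivalence proof: the two definitions agree on samples.
assert all(fib_logic(n) == fib_exec(n) for n in range(20))

print(fib_exec(200))  # feasible; the naive recursion would be astronomically slow
```

    The point mirrored from the abstract is that reasoning can continue to use the simple definition while execution silently runs the efficient counterpart.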

    Can we explain machine learning-based prediction for rupture status assessments of intracranial aneurysms?

    Although applying machine learning (ML) algorithms to rupture status assessment of intracranial aneurysms (IA) has yielded promising results, the opaqueness of some ML methods has limited their clinical translation. We presented the first explainability comparison of six commonly used ML algorithms: multivariate logistic regression (LR), support vector machine (SVM), random forest (RF), extreme gradient boosting (XGBoost), multi-layer perceptron neural network (MLPNN), and Bayesian additive regression trees (BART). A total of 112 IAs with known rupture status were selected for this study. The ML-based classification used two anatomical features, nine hemodynamic parameters, and thirteen morphologic variables. We utilized permutation feature importance, local interpretable model-agnostic explanations (LIME), and SHapley Additive exPlanations (SHAP) algorithms to explain and analyze the six ML algorithms. All models performed comparably: LR area under the curve (AUC) was 0.71; SVM AUC was 0.76; RF AUC was 0.73; XGBoost AUC was 0.78; MLPNN AUC was 0.73; BART AUC was 0.73. Our interpretability analysis demonstrated consistent results across all the methods; i.e., the utility of the top 12 features was broadly consistent. Furthermore, the contributions of 9 important features (aneurysm area, aneurysm location, aneurysm type, wall shear stress maximum during systole, ostium area, the size ratio between aneurysm width, (parent) vessel diameter, one standard deviation among time-averaged low shear area, and one standard deviation of temporally averaged low shear area less than 0.4 Pa) were nearly the same. This research suggested that ML classifiers can provide explainable predictions consistent with general domain knowledge concerning IA rupture. With an improved understanding of ML algorithms, clinicians' trust in ML algorithms will be enhanced, accelerating their clinical translation.
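    The study's data and exact pipeline are not reproduced here; the sketch below only illustrates a permutation-importance comparison across several of the named classifier families using scikit-learn on synthetic data. XGBoost and BART are omitted to keep the example to one library, and the dataset, features, and settings are placeholders.

```python
# Hypothetical sketch of comparing permutation feature importance across models.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Stand-in for 112 aneurysms with 24 anatomical/hemodynamic/morphologic features.
X, y = make_classification(n_samples=112, n_features=24, n_informative=9,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=5000),
    "SVM": SVC(probability=True),
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
    "MLPNN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    # Permutation importance: drop in AUC when each feature is shuffled.
    result = permutation_importance(model, X_te, y_te, scoring="roc_auc",
                                    n_repeats=30, random_state=0)
    top = np.argsort(result.importances_mean)[::-1][:5]
    print(name, "top features:", top, result.importances_mean[top].round(3))
```

    Comparing the ranked feature indices across the models is the kind of cross-method consistency check the abstract reports; LIME and SHAP would be applied analogously with their own packages.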