
    Assessing Inter-rater Agreement for Compositional Data

    Compositional data are non-negative vectors whose elements sum to one (e.g., [0.1, 0.5, 0.4]). This type of data occurs in many research areas where the relative magnitudes of the vector’s elements are of primary interest. In this dissertation we propose novel methodology for assessing inter-rater agreement based on compositional data. This is needed because existing agreement measures either convert the vector to a univariate value, thereby losing information, or fail to account for the sum-to-one restriction. We propose a novel Bayesian approach, enabled by Markov chain Monte Carlo, to investigate differences in the pattern of compositional vector scores. We extend our model to handle discrete compositional scores, comparisons involving more than two raters, and studies that involve replicate scores on the same subjects. Numerous simulation studies demonstrate the validity of our model and the advantages of our approach. Both simulated data and a real scoring data set are analyzed to illustrate our method and compare it to traditional agreement indices. The application of this new methodology is focused on pathology, where pathologists rate immunohistochemistry (IHC) assays using compositional scores. To enhance the use of this methodology and help with the design of future agreement studies, an R Shiny package designed for IHC agreement analysis is developed.
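
    As a rough, hedged illustration of the kind of data involved (not the dissertation’s Bayesian MCMC model), the sketch below simulates two raters’ Dirichlet-distributed compositional scores and summarises their disagreement with a simple Aitchison distance; the Dirichlet parameters and the distance-based summary are assumptions made for demonstration only.

```python
# Illustrative sketch only: simulate compositional scores from two raters and
# summarise their disagreement with a simple Aitchison distance. The Dirichlet
# parameters and the distance summary are assumptions for demonstration; this
# is not the Bayesian MCMC model proposed in the dissertation.
import numpy as np

rng = np.random.default_rng(0)
n_subjects = 50

# Each subject has a latent 3-part composition; each rater reports a noisy version.
latent = rng.dirichlet(alpha=[2.0, 5.0, 3.0], size=n_subjects)
rater_a = np.vstack([rng.dirichlet(alpha=60 * p) for p in latent])
rater_b = np.vstack([rng.dirichlet(alpha=60 * p) for p in latent])

def aitchison_distance(x, y):
    """Distance between two compositions after a centred log-ratio transform."""
    clr = lambda v: np.log(v) - np.log(v).mean()
    return np.linalg.norm(clr(x) - clr(y))

distances = [aitchison_distance(a, b) for a, b in zip(rater_a, rater_b)]
print(f"mean Aitchison distance between raters: {np.mean(distances):.3f}")
```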

    A content analysis study of Portable Assisted Study Sequence mathematics curricular materials for migrant students using the National Council of Teachers of Mathematics Standards

    The need for change in the mathematics curricula in our public schools has been well documented (Kirwan, 1990; National Commission on Excellence in Education, 1983; National Research Council, 1989; Overby, 1993). Testing surveys show low overall performance at every age throughout the K-12 levels. The Curriculum and Evaluation Standards for School Mathematics (Standards), issued by the National Council of Teachers of Mathematics (NCTM) in 1989, are designed to move mathematics curricula forward to meet the needs of students for the future. The analysis of new curricular materials is essential in order to produce materials that meet recommended standards. Migrant students represent one segment of the student population with deficiencies in mathematics training at the K-12 level. The Portable Assisted Study Sequence (P.A.S.S.) Program serves migrant students in 165 schools in California, and must provide materials which comply with the Standards. This study analyzed and compared this compliance in two mathematics courses written in 1989 and 1995 for P.A.S.S. An evaluative instrument was designed to measure the extent to which reform ideas in the Standards are represented in the curricular materials. Content analysis procedures were used to analyze the curricula with the assistance of NUD*IST software. Research design for the instrument included procedures for content validation and interrater reliability. The results of this study showed that the 1995 P.A.S.S. curricular materials measurably improve upon the 1989 curricular materials with respect to the Standards. The evaluative instrument was found to effectively and reliably measure the extent to which curricular materials meet the Standards. This study provides guidance and direction for teachers, students, curriculum developers, and future researchers at local, state, and national levels. Standards are a major component of education, and this study represents a pioneering effort to quantify the changes that can help our society meet these goals. This process can be replicated in other disciplines, which increases the potential for social change. Significant curricular reform will have significant social impact.
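
    As a hedged illustration of the interrater-reliability component of such a research design (not the study’s own instrument or data), the sketch below computes Cohen’s kappa for two hypothetical coders assigning curriculum passages to NCTM-style content categories; the coder labels and categories are invented.

```python
# Illustrative only: Cohen's kappa for two hypothetical coders assigning
# curriculum passages to NCTM-style content categories. The labels are invented
# example data, not the study's coding results.
from collections import Counter

coder_1 = ["problem_solving", "reasoning", "connections", "problem_solving", "communication"]
coder_2 = ["problem_solving", "reasoning", "problem_solving", "problem_solving", "communication"]

def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n                     # observed agreement
    freq_a, freq_b = Counter(a), Counter(b)
    categories = set(a) | set(b)
    expected = sum(freq_a[c] * freq_b[c] for c in categories) / n ** 2   # chance agreement
    return (observed - expected) / (1 - expected)

print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")
```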

    Towards a Robuster Interpretive Parsing

    The input data to grammar learning algorithms often consist of overt forms that do not contain full structural descriptions. This lack of information may contribute to the failure of learning. Past work on Optimality Theory introduced Robust Interpretive Parsing (RIP) as a partial solution to this problem. We generalize RIP and suggest replacing the winner candidate with a weighted mean violation of the potential winner candidates. A Boltzmann distribution is introduced on the winner set, and the distribution’s parameter T is gradually decreased. Finally, we show that GRIP, the Generalized Robust Interpretive Parsing Algorithm, significantly improves the learning success rate in a model with standard constraints for metrical stress assignment.
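
    As a hedged sketch of the Boltzmann-weighting idea (not the full GRIP algorithm), the code below weights the candidates in a winner set by exp(-energy/T) and forms a weighted mean violation profile while the temperature T is gradually decreased; the violation profiles, constraint weights, energy function, and cooling schedule are invented for illustration.

```python
# Minimal sketch of the Boltzmann-weighting idea, not the full GRIP algorithm:
# candidates in the winner set are weighted by exp(-energy / T), and a weighted
# mean violation profile replaces the single interpretive winner. The violation
# profiles, constraint weights, energy function, and cooling schedule are
# invented here purely for illustration.
import numpy as np

# Rows: candidate parses compatible with one overt form.
# Columns: violation counts on hypothetical constraints C1..C3.
violations = np.array([
    [0.0, 2.0, 1.0],
    [1.0, 0.0, 3.0],
    [2.0, 1.0, 0.0],
])

constraint_weights = np.array([3.0, 2.0, 1.0])   # stand-in for a ranking/weighting
energies = violations @ constraint_weights       # one scalar "energy" per candidate

def weighted_mean_violations(energies, violations, T):
    """Boltzmann weights on the candidate set at temperature T."""
    w = np.exp(-(energies - energies.min()) / T)  # shift for numerical stability
    w /= w.sum()
    return w @ violations

# As T decreases, the weighted mean approaches the best candidate's profile.
for T in (10.0, 2.0, 0.5, 0.1):
    print(T, weighted_mean_violations(energies, violations, T))
```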

    Evaluating the success of a workplace health and wellbeing intervention using a small group of repeat-respondents from a large repeated cross-sectional survey

    The Healthy@Work intervention in the Tasmanian State Service was responsible for increased availability of, and participation in, health and wellbeing activities, but there was little evidence of improvement in health-related factors for this group of respondents over the three-year period of this study. Changes in the health-related factors were expected outcomes of the intervention, but a study duration of just three years is possibly too short for change to become manifest.

    The National Student Assessment 2022 in Bangladesh. Grades 3 and 5

    The National Student Assessment (NSA) 2022 was a sample-based assessment in Bangladesh measuring the learning levels of Bangladeshi students in Bangla and Mathematics, covering 3,357,233 students in grade 3 and 3,604,586 students in grade 5. It was the seventh round of the national assessments in primary education in Bangladesh, conducted after a five-year interval, and the most comprehensive national-level learning assessment since the global COVID-19 pandemic. The sampling method at the school level was a multistage stratified probability-proportional-to-size (PPS) design, and at the student level equal-probability systematic random sampling drawn from the Annual Primary School Census (APSC) 2021 database. The report outlines assessment and content expectations, reliability estimates, mean scale scores by division, district, school type, and geographical location, as well as proficiency bands and performance standards and performance by key content area and cognitive level. It includes findings relating to learning and equity, together with the student, teacher, and headteacher questionnaires.
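
    As a hedged sketch of the two-stage design described above (not the NSA 2022 implementation), the code below draws schools with probability proportional to size and then draws students within a sampled school by equal-probability systematic sampling; the school enrolments and sample sizes are invented.

```python
# Illustrative sketch of the two-stage design described above, not the NSA 2022
# implementation: schools drawn with probability proportional to size (PPS),
# then students within a sampled school drawn by equal-probability systematic
# sampling. School enrolments and sample sizes are invented for demonstration.
import numpy as np

rng = np.random.default_rng(2022)

# Stage 1: PPS draw of schools (size measure = enrolment from a census frame).
# rng.choice with weights is a simple stand-in for a formal PPS-without-replacement design.
enrolments = rng.integers(low=50, high=800, size=200)   # hypothetical school frame
p_select = enrolments / enrolments.sum()
sampled_schools = rng.choice(200, size=20, replace=False, p=p_select)

# Stage 2: equal-probability systematic sample of students within one school.
def systematic_sample(n_students, n_sample, rng):
    interval = n_students / n_sample
    start = rng.uniform(0, interval)                     # random start in the first interval
    return np.floor(start + interval * np.arange(n_sample)).astype(int)

school = sampled_schools[0]
students = systematic_sample(enrolments[school], n_sample=25, rng=rng)
print(f"school {school}: first sampled student indices {students[:5]}")
```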

    Reason to believe


    The reverse mathematics of elementary recursive nonstandard analysis: a robust contribution to the foundations of mathematics

    Reverse Mathematics (RM) is a program in the Foundations of Mathematics founded by Harvey Friedman in the Seventies ([17, 18]). The aim of RM is to determine the minimal axioms required to prove a certain theorem of ‘ordinary’ mathematics. In many cases one observes that these minimal axioms are also equivalent to this theorem. This phenomenon is called the ‘Main Theme’ of RM, and theorem 1.2 is a good example thereof. In practice, most theorems of everyday mathematics are equivalent to one of the four systems WKL0, ACA0, ATR0 and Π11-CA0, or provable in the base theory RCA0. An excellent introduction to RM is Stephen Simpson’s monograph [46]. Nonstandard Analysis has always played an important role in RM ([32, 52, 53]). One of the open problems in the literature is the RM of theories of first-order strength IΔ0 + exp ([46, p. 406]). In Chapter I, we formulate a solution to this problem in theorem 1.3. This theorem shows that many of the equivalences from theorem 1.2 remain correct when we replace equality by infinitesimal proximity ‘≈’ from Nonstandard Analysis. The base theory is now ERNA, a nonstandard extension of IΔ0 + exp. The principle that corresponds to ‘Weak König’s lemma’ is the Universal Transfer Principle (see axiom schema 1.57). In particular, one can say that the RM of ERNA+Π1-TRANS is a ‘copy up to infinitesimals’ of the RM of WKL0. This implies that RM is ‘robust’ in the sense in which this term is used in Statistics and Computer Science ([25, 35]). Furthermore, we obtain applications of our results in Theoretical Physics in the form of the ‘Isomorphism Theorem’ (see theorem 1.106). This philosophical excursion is the first application of RM outside of Mathematics and implies that ‘whether reality is continuous or discrete is undecidable because of the way mathematics is used in Physics’ (see paragraph 3.2.4, p. 53). We briefly explore a connection with the program ‘Constructive Reverse Mathematics’ ([30, 31]), and in the rest of Chapter I we consider the RM of ACA0 and related systems. In particular, we prove theorem 1.161, which is a first step towards a ‘copy up to infinitesimals’ of the RM of ACA0. However, one major aesthetic problem with these results is the introduction of extra quantifiers in many of the theorems listed in theorem 1.3 (see e.g. theorem 1.94). To overcome this hurdle, we explore Relative Nonstandard Analysis in Chapters II and III. This new framework involves many degrees of infinity instead of the classical ‘binary’ picture in which only two degrees, ‘finite’ and ‘infinite’, are available. We extend ERNA to a theory of Relative Nonstandard Analysis called ERNAA and show how this theory and its extensions allow for a completely quantifier-free development of analysis. We also study the metamathematics of ERNAA, motivated by RM. Several ERNA-theorems would not have been discovered without considering ERNAA first.
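
    To make the notions of infinitesimal proximity and Π1 transfer referred to above concrete, the following is a hedged sketch in standard nonstandard-analysis notation; the exact ERNA formulations (e.g. axiom schema 1.57) should be taken from the thesis itself.

```latex
% Hedged sketch only; the precise ERNA axioms (e.g. axiom schema 1.57) are in the thesis.
% Infinitesimal proximity: x is infinitely close to y when their difference is
% smaller than every positive standard rational.
\[
  x \approx y \;:\Longleftrightarrow\; |x - y| < \varepsilon
  \quad\text{for every standard rational } \varepsilon > 0 .
\]

% General shape of a \Pi_1 transfer principle: a quantifier-free property that
% holds for all standard numbers transfers to all numbers.
\[
  \bigl(\forall^{\mathrm{st}} n \; \varphi(n)\bigr) \;\rightarrow\; \bigl(\forall n \; \varphi(n)\bigr),
  \qquad \varphi \text{ quantifier-free}.
\]
```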

    A First Course in Causal Inference

    I developed these lecture notes based on my “Causal Inference” course at the University of California, Berkeley, over the past seven years. Since half of the students were undergraduates, my lecture notes require only basic knowledge of probability theory, statistical inference, and linear and logistic regression.

    Mechanised metamathematics: an investigation of first-order logic and set theory in constructive type theory

    In this thesis, we investigate several key results in the canon of metamathematics, applying the contemporary perspective of formalisation in constructive type theory and mechanisation in the Coq proof assistant. Concretely, we consider the central completeness, undecidability, and incompleteness theorems of first-order logic as well as properties of the axiom of choice and the continuum hypothesis in axiomatic set theory. Due to their fundamental role in the foundations of mathematics and their technical intricacies, these results have a long tradition in the codification as standard literature and, in more recent investigations, increasingly serve as a benchmark for computer mechanisation. With the present thesis, we continue this tradition by uniformly analysing the aforementioned cornerstones of metamathematics in the formal framework of constructive type theory. This programme offers novel insights into the constructive content of completeness, a synthetic approach to undecidability and incompleteness that largely eliminates the notorious tedium obscuring the essence of their proofs, as well as natural representations of set theory in the form of a second-order axiomatisation and of a fully type-theoretic account. The mechanisation concerning first-order logic is organised as a comprehensive Coq library open to usage and contribution by external users.
