
    Formal Verification throughout the Development of Robust Systems

    As transistors become smaller and smaller, they grow more susceptible to transient faults caused by radiation. A system can be modified to handle these faults and prevent errors that are visible from the outside. We present a formal equivalence-checking method to verify that such a modification does not change the nominal behavior of the system. In addition, we contribute an algorithm to formally verify that a circuit is robust against transient faults under all possible input assignments and variability. If equivalence or robustness cannot be shown, a counterexample is generated.
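The core idea of equivalence checking with counterexample generation can be illustrated with a minimal sketch. The circuits, names, and enumeration strategy below are my own assumptions, not the abstract's method: production tools use SAT solvers or BDDs, whereas exhaustive enumeration only scales to tiny combinational circuits.

```python
from itertools import product

def original(a0, a1, b0, b1):
    # hypothetical 2-bit ripple-carry adder; returns (s0, s1, carry_out)
    s0 = a0 ^ b0
    c0 = a0 & b0
    s1 = a1 ^ b1 ^ c0
    c1 = (a1 & b1) | (c0 & (a1 ^ b1))
    return s0, s1, c1

def hardened(a0, a1, b0, b1):
    # behaviourally identical variant, standing in for a circuit
    # after a fault-tolerance transformation
    total = (a0 + 2 * a1) + (b0 + 2 * b1)
    return total & 1, (total >> 1) & 1, (total >> 2) & 1

def check_equivalence(f, g, n_inputs):
    """Return None if f == g on all input assignments, else a counterexample."""
    for bits in product((0, 1), repeat=n_inputs):
        if f(*bits) != g(*bits):
            return bits  # concrete counterexample, as described in the abstract
    return None

print(check_equivalence(original, hardened, 4))  # None -> equivalent
```

If the two functions disagree anywhere, the first differing input assignment is returned, which is exactly the counterexample role the abstract describes.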

    SystemC Through the Looking Glass : Non-Intrusive Analysis of Electronic System Level Designs in SystemC

    Due to the ever increasing complexity of hardware and hardware/software co-designs, developers strive for higher levels of abstraction in the early stages of the design flow. To address these demands, design at the Electronic System Level (ESL) has been introduced. SystemC is currently the de-facto standard for ESL design. The extraction of data from system designs written in SystemC is thereby crucial, e.g. for the proper understanding of a given system. However, no satisfactory support for reflection/introspection of SystemC has been provided yet. Previously proposed methods for this purpose either focus on static aspects only, restrict the language means of SystemC, or rely on modifications of the compiler and/or parser. In this thesis, approaches that overcome these limitations are introduced, allowing the extraction of information from a given SystemC design without changing the SystemC library or the compiler. The proposed approaches retrieve both static and dynamic (i.e. run-time) information.

    Code Switching in Teaching English to Speakers of Other Languages

    One of the most controversial issues in foreign language teaching and learning over many years has been the role of the students’ L1 in L2 target language education. While a monolingual approach prohibited the use of the students’ L1 in the L2 classroom, researchers have reexamined issues related to the use of the students’ L1 through code switching in the L2 classroom since the 1990s. The results of these studies have shown that the L1, if used properly and judiciously, may serve important functions for the learning process and social environment of the classroom. The purpose of this study was a systematic literature review of this research for the preparation of a guidebook on the functions, manner, reasons, and contributions of code switching as a part of L2 English language teaching.

    A History and Contemporary Analysis of the Standard Adult High School Diploma Program in Minnesota

    This research project, which focused on a competency-based secondary credential option for adult learners known as the Minnesota Standard Adult High School Diploma, investigated the development of this credentialing option from its inception prior to 2014 legislation through its present implementation. Three research questions guided this multiple case study, centering on the circumstances that instigated the Minnesota Standard Adult High School Diploma (referred to as the quintain in this study), the process that developed its implementation, and its perceived impact upon Adult Basic Education programs. Group interviews and document analysis were employed to answer these questions. Documentary evidence suggested that scores of education professionals and stakeholders contributed to the Minnesota Standard Adult High School Diploma’s evolution. Despite this evolutionary complexity, several assertions associated with this quintain lend themselves to best practices within the discipline of adult education regarding andragogical factors, education policy development, and program flexibility.

    Simulation Experiments as a Causal Problem

    Simulation methods are among the most ubiquitous methodological tools in statistical science. In particular, statisticians often use simulation to explore properties of statistical functionals in models for which developed statistical theory is insufficient, or to assess finite sample properties of theoretical results. We show that the design of simulation experiments can be viewed from the perspective of causal intervention on a data generating mechanism. We then demonstrate the use of causal tools and frameworks in this context. Our perspective is agnostic to the particular domain of the simulation experiment, which increases the potential impact of our proposed approach. In this paper, we consider two illustrative examples. First, we re-examine a predictive machine learning example from a popular textbook designed to assess the relationship between mean function complexity and the mean-squared error. Second, we discuss a traditional causal inference problem, simulating the effect of unmeasured confounding on estimation, specifically to illustrate bias amplification. In both cases, applying causal principles and using graphical models with parameters and distributions as nodes, in the spirit of influence diagrams, can 1) make precise which estimand the simulation targets, 2) suggest modifications to better attain the simulation goals, and 3) provide scaffolding to discuss performance criteria for a particular simulation design.
    Comment: 19 pages, 17 figures. Under review at Statistical Science.
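The bias-amplification phenomenon mentioned in the second example can be sketched with a small Monte Carlo experiment. The structural model below is my own assumption, not the paper's exact setup: Z is an instrument-like variable affecting the outcome only through X, and U is an unmeasured confounder. Regressing Y on X alone is biased by U; additionally adjusting for Z amplifies that bias rather than reducing it.

```python
import numpy as np

# Assumed toy data generating mechanism:
#   Z ~ N(0,1) instrument-like, U ~ N(0,1) unmeasured confounder
#   X = Z + U + noise,  Y = tau*X + U + noise
rng = np.random.default_rng(0)
n, tau = 200_000, 2.0
Z = rng.standard_normal(n)
U = rng.standard_normal(n)
X = Z + U + rng.standard_normal(n)
Y = tau * X + U + rng.standard_normal(n)

def ols_coef_on_x(*covariates):
    # OLS of Y on (1, X, covariates...); return the coefficient on X
    design = np.column_stack((np.ones(n), X) + covariates)
    beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
    return beta[1]

bias_plain = ols_coef_on_x() - tau      # theory for this model: +1/3
bias_with_z = ols_coef_on_x(Z) - tau    # theory: +1/2, i.e. amplified
print(f"bias without Z: {bias_plain:.3f}, with Z: {bias_with_z:.3f}")
```

Under this model the confounding bias is Cov(X,U)/Var(X) = 1/3 without Z, but conditioning on Z removes exogenous variation in X, raising the bias to 1/2.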

    Control of Finite-State, Finite Memory Stochastic Systems

    A generalized problem of stochastic control is discussed in which multiple controllers with different data bases are present. The vehicle for the investigation is the finite-state, finite-memory (FSFM) stochastic control problem. Optimality conditions are obtained by deriving an equivalent deterministic optimal control problem. A FSFM minimum principle is obtained via the equivalent deterministic problem. The minimum principle suggests the development of a numerical optimization algorithm, the min-H algorithm. The relationship between the sufficiency of the minimum principle and the informational properties of the problem is investigated. A problem of hypothesis testing with 1-bit memory is investigated to illustrate the application of control theoretic techniques to information processing problems.
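The flavor of the 1-bit-memory hypothesis testing problem can be sketched by brute force. The setup below is my own toy illustration, not the paper's min-H algorithm: a single memory bit m is updated from a stream of Bernoulli observations, the final bit is the declared hypothesis, and we enumerate every deterministic update rule m' = f(m, y) to find the one with the lowest error probability, computed exactly by propagating the memory-state distribution.

```python
from itertools import product

P = {0: 0.3, 1: 0.7}   # assumed P(y=1 | hypothesis H0 or H1)
T = 8                   # number of observations

def error_prob(rule, p1, target):
    """P(final memory != target) given P(y=1)=p1; rule maps (m, y) -> m'."""
    dist = [1.0, 0.0]                       # memory starts in state 0
    for _ in range(T):
        nxt = [0.0, 0.0]
        for m in (0, 1):
            for y in (0, 1):
                pr = dist[m] * (p1 if y else 1 - p1)
                nxt[rule[(m, y)]] += pr     # exact forward propagation
        dist = nxt
    return 1.0 - dist[target]

best = None
for vals in product((0, 1), repeat=4):      # all 16 deterministic rules
    rule = dict(zip(list(product((0, 1), (0, 1))), vals))
    # average error over two equally likely hypotheses
    err = 0.5 * (error_prob(rule, P[0], 0) + error_prob(rule, P[1], 1))
    if best is None or err < best[0]:
        best = (err, rule)

print(f"best 1-bit rule error: {best[0]:.3f}")
```

With these parameters the enumeration finds that the best deterministic 1-bit rule simply remembers the last observation, illustrating how severely a 1-bit memory constrains the achievable error compared to an unconstrained likelihood-ratio test.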

    Form Explanation in Modification of Listening Input in L2 Vocabulary Learning

    The effectiveness of vocabulary explanations as modifications of listening input - explicit (EE) and implicit (IE) - was investigated in contrast to an unmodified (baseline, BL) condition. One hundred and nine university students from Japan listened to two texts, which included different vocabulary elaborations for 12 items. Students listened to each text three times. After each listening, they indicated the meanings of the items. Four weeks later, a delayed posttest was administered. Positive effects of multiple listenings on vocabulary learning from listening input were found. As hypothesized, the EE condition resulted in significant superiority over the other two on the immediate posttests. However, IE was not significantly better than the BL. The findings suggested that the IE mostly remained unnoticed during listening. On the delayed posttest, the scores for EE dropped and there was no significant difference among the three conditions, though all conditions resulted in a significant increase from the pretest.

    The impact of computer interface design on Saudi students’ performance on a L2 reading test

    A thesis submitted to the University of Bedfordshire in partial fulfillment of the requirements for the degree of Doctor of Philosophy.
    This study investigates the effect of testing mode on lower-level Saudi Arabian test-takers’ performance and cognitive processes when taking an L2 reading test on computer compared to its paper-based counterpart, from an interface design perspective. An interface was developed and implemented in the computer-based version of the L2 reading test in this study, which was administered to 102 Saudi Arabian university students for quantitative analyses and to an additional eighteen for qualitative analyses. All participants were assessed on the same L2 reading test in two modes on two separate occasions in a within-subject design. Statistical tests such as correlations, group comparisons, and item analyses were employed to investigate the test-mode effect on test-takers’ performance, whereas test-takers’ concurrent verbalizations were recorded while taking the reading test to investigate their cognitive processes. Strategies found in both modes were compared through their frequency of occurrence. In addition, a qualitative illustration of test-takers’ cognitive behavior was given to describe the processes involved in taking a lower-level L2 reading test. A mixed-method approach was adhered to when collecting data, with questionnaires, think-aloud protocols, and post-experimental interviews as the main data collection instruments. Results on test-takers’ performance showed no significant difference between the two modes of testing on overall reading performance; however, item-level analyses discovered significant differences on two of the test’s items. Further qualitative investigation into possible interface-design-related causes for these differences showed no identifiable relationship between test-takers’ performance and the computer-based testing mode.
Results of the cognitive process analyses showed significant differences in three of the cognitive processes employed by test-takers, indicating that test-takers had more difficulty processing text in the paper-based test than in the computer-based test. Both the product and process analyses further provided convincing supporting evidence for the cognitive validity, content validity, and context validity contributing to the construct validity of the computer-based test used in this study.

    DEVELOPMENT OF AN INSTRUMENT FOR ASSESSING CULTURALLY CONGRUENT SCIENCE TEACHING

    Demographers forecast that ethnic minority students will make up the majority of students in America’s K-12 schools sometime in the next few decades. Yet most ethnic minority students continue to experience a lower level of achievement compared to their White peers. Emerging research indicates that culturally congruent instruction (CCI) is correlated with improved ethnic minority student achievement and so may be one means to close the achievement differential. Calls for more research on CCI are increasing, yet measuring CCI is challenging due to its context-specific nature and abstract elements that are difficult to define and operationalize. This study responded to the need for improved assessment of CCI through the investigation of two research questions: What is a culturally congruent process for developing a valid instrument for assessing the use of CCI in teaching science with Montana American Indian students? And what is the technical quality of such an instrument? Investigating these questions resulted in (a) a culturally congruent instrument development model that utilized participatory methods and involved numerous and diverse stakeholders, (b) a model of CCI composed of three major elements (content, pedagogy, and environment), (c) a teacher self-report survey known as the Revised Culturally Congruent Instruction Survey, and (d) a substantive body of evidence for the use of the instrument to draw valid inferences regarding CCI. While the context-specific nature of CCI means that the Revised CCIS will likely require adaptation if used in contexts outside the one for which it was designed, it holds significance for the research and education community in providing a template for the operationalization of CCI and its assessment.
Likewise, the development process model, in demonstrating the use of culturally congruent practices to equitably engage stakeholders in instrument development, has potential value as a resource for guiding those looking to work with communities to develop a similar instrument. Both the instrument and the development model have the potential to move the research base forward regarding CCI, worthwhile goals that may assist in the attainment of equitable educational outcomes for all students.
