
    L2 Acquisition at the interfaces: Subject-verb inversion in L2 English and its pedagogical implications

    The present PhD thesis deals with two kinds of interfaces that have recently become key areas of interest in generative second language acquisition research (GenSLA): (i) linguistic interfaces – the syntax-discourse interface (our main focus of research) and the lexicon-syntax interface in adult second language (L2) acquisition –, and (ii) an interdisciplinary interface – the interface between the domains of GenSLA and L2 pedagogy. The thesis seeks to shed new light on four general questions which are still a matter of debate in GenSLA: (i) Are narrow syntactic and lexical-syntactic properties unproblematic at the end state of L2 acquisition, as the Interface Hypothesis (IH) (Sorace & Filiaci, 2006; Sorace, 2011b) predicts? (ii) Are properties at the syntax-discourse interface necessarily problematic at the end state of L2 acquisition, as the IH proposes? (iii) What are the roles of cross-linguistic influence, input and processing factors in L2 acquisition at the syntax-discourse interface? (iv) Can explicit instruction help L2 learners/speakers (L2ers) overcome persistent problems in the acquisition of syntactic and syntax-discourse properties? With a view to investigating these questions, the thesis focuses on a linguistic phenomenon that has been little researched in GenSLA: subject-verb inversion (SVI) in L2 English. Three types of SVI are considered here: (i) “free” inversion (and its correlation with null subjects), (ii) locative inversion and (iii) presentational there-constructions (i.e., there-constructions with verbs other than be). The first is ungrammatical in English due to a purely syntactic factor: English sets the null subject parameter to a negative value. The last two types of SVI, on the other hand, are possible in English under certain lexical, syntactic and discourse conditions.
The thesis comprises two experimental studies: (i) a study on the acquisition of the lexical, syntactic and discourse properties of SVI by advanced and near-native L2ers of English who are native speakers of French (a language similar to English in the relevant respects) and European Portuguese (a language different from English in the relevant respects), and (ii) a study on the impact of explicit grammar instruction on the acquisition of “narrow” syntactic and syntax-discourse properties of SVI by intermediate and low-advanced Portuguese L2ers of English. The former study tests participants by means of three types of tasks: untimed drag-and-drop tasks, syntactic priming tasks, and speeded acceptability judgement tasks. The results of these tasks confirm that, as predicted by the IH, the properties of SVI that are purely (lexical-)syntactic are unproblematic at the end state of L2 acquisition, but those which involve the interface between syntax and discourse are a locus of permanent optionality, even when the first language (L1) is similar to the L2. Results are, moreover, consistent with the prediction of the IH that the optionality found at the syntax-discourse interface is primarily caused by processing inefficiencies associated with bilingualism. In addition to presenting new experimental evidence in favour of the IH, this study reveals that the degree of optionality L2ers exhibit at the syntax-discourse interface is moderated by the following variables, which have not been (sufficiently) considered in previous work on the IH: (i) construction frequency (very rare construction → more optionality), (ii) the quantity and/or distance of the pieces of contextual information the speaker needs to process (many pieces of contextual information in an inter-sentential context → more optionality), (iii) the level of proficiency in the L2 (lower level of proficiency → more optionality), and (iv) the (dis)similarity between the L1 and the L2 (L1≠L2 → more optionality).
The study which concentrates on the impact of explicit grammar instruction on L2 acquisition follows a pre-test, treatment, post-test and delayed post-test design and tests participants by means of speeded acceptability judgement tasks. This study shows that explicit grammar instruction results in durable gains for L2ers, but its effectiveness is moderated by two factors: (i) the type of linguistic domain(s) involved in the target structure and (ii) whether or not L2ers are developmentally ready to acquire the target structure. Regarding factor (i), the findings indicate that the area that has been found to be a locus of permanent optionality in L2 acquisition – the syntax-discourse interface – is much less permeable to instructional effects than “narrow” syntax. Regarding factor (ii), the results suggest that explicit instruction only benefits acquisition when L2ers are developmentally ready to acquire the target property. As these findings are relevant not only to GenSLA theory, but also to L2 teaching, the thesis includes an analysis of the relevance and potential implications of its findings for L2 grammar teaching.

    Pinpointing Software Inefficiencies With Profiling

    Complex codebases with several layers of abstraction have abundant inefficiencies that affect performance. These inefficiencies arise from various causes, such as developers' inattention to performance, inappropriate choice of algorithms, and inefficient code generation, among others. Much work has been done at compile time to eliminate such redundancies. However, not all redundancies can be easily detected or eliminated by compiler optimization passes, because aliasing, limited optimization scopes, and insensitivity to input and execution contexts act as severe deterrents to static program analysis. There are also profiling tools that can reveal how resources are used. However, they can hardly distinguish whether a resource is being used productively. More profiling tools are needed to diagnose resource wastage and pinpoint inefficiencies. We have developed three tools to pinpoint different types of inefficiencies at different granularities. We built Runtime Value Numbering (RVN), a dynamic fine-grained profiler to pinpoint and quantify redundant computations in an execution. It is based on the classical value numbering technique but works at runtime instead of compile time. We developed RedSpy, a fine-grained profiler to pinpoint and quantify value redundancies in program executions. Value redundancy may happen over time at the same locations or in adjacent locations, and thus it has temporal and spatial locality. RedSpy identifies both temporal and spatial value locality. Furthermore, RedSpy is capable of identifying values that are approximately the same, enabling optimization opportunities in HPC codes that often use floating-point computations. RVN and RedSpy are both instrumentation-based tools. They provide comprehensive results while introducing high space and time overhead.
Our lightweight framework, Witch, samples consecutive accesses to the same memory location by exploiting two ubiquitous hardware features: performance monitoring units (PMUs) and debug registers. Witch performs no instrumentation. Hence, witchcraft - tools built atop Witch - can detect a variety of software inefficiencies while introducing negligible slowdown and insignificant memory consumption, yet maintaining accuracy comparable to exhaustive instrumentation tools. Witch allowed us to scale our analysis to a large number of codebases. All the tools work on fully optimized binary executables and provide insightful optimization guidance by apportioning redundancies to their provenance - source lines and full calling contexts. We applied RVN, RedSpy, and Witch to programs that had been optimization targets for decades and, guided by the tools, we were able to eliminate redundancies that resulted in significant speedups.
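The value-numbering idea behind RVN can be sketched in a few lines. The sketch below is illustrative only (the class, method, and site names are invented, and RVN itself operates on binary instructions at runtime, not on explicit calls): each (operator, operand value numbers) pair receives a value number, and a repeated pair marks a redundant computation at its reporting site.

```python
from collections import defaultdict

class ValueNumberer:
    """Toy runtime value numbering: flag recomputations of known values."""
    def __init__(self):
        self.table = {}                     # (op, operand VNs) -> value number
        self.next_vn = 0
        self.redundant = defaultdict(int)   # site -> count of redundant ops

    def number(self, site, op, *arg_vns):
        key = (op, arg_vns)
        if key in self.table:
            # Same operator applied to the same value numbers: the result
            # is already known, so this computation is redundant.
            self.redundant[site] += 1
            return self.table[key]
        self.table[key] = self.next_vn
        self.next_vn += 1
        return self.table[key]

vn = ValueNumberer()
a = vn.number("L1", "const", 3)
b = vn.number("L2", "const", 4)
vn.number("L3", "add", a, b)
vn.number("L4", "add", a, b)   # recomputes a+b: flagged as redundant
print(dict(vn.redundant))      # {'L4': 1}
```

A real profiler attributes these counts to source lines and calling contexts, as the abstract describes, rather than to string labels.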

    Optimization of environmental control to fit living space requirements

    This study examines the application of Environmental Control Systems (ECSs) for people with disabilities who live in dormitories. ECSs allow people with disabilities to control appliances in their homes and parameters in their environment. The purpose of this study is to analyze the effectiveness and efficiency of current ECSs when they are used in this setting, and to formulate ways of reducing the cost and complexity of these systems. ECSs are compared against the needs of a real client with disabilities who lives in a dormitory room on the Texas A&M University main campus. General conclusions about the strengths and weaknesses of the systems, when applied to the small living space, are formulated. Alternative ECS designs are conceived and discussed. Finally, an example ECS component subsystem design is produced in detail.

    The Motion Picture Industry: Critical Issues in Practice, Current Research, and New Research Directions

    The motion picture industry has provided a fruitful research domain for scholars in marketing and other disciplines. The industry has high economic importance and is appealing to researchers because it offers both rich data that cover the entire product lifecycle for many new products and because it provides many unsolved “puzzles.” Although the amount of scholarly research in this area is rapidly growing, its impact on practice has not been as significant as in other industries (e.g., consumer packaged goods). In this article, we discuss critical practical issues for the motion picture industry, review existing knowledge on those issues, and outline promising research directions. Our review is organized around the three key stages in the value chain for theatrical motion pictures: production, distribution, and exhibition. Focusing on what we believe are critical managerial issues, we propose various conjectures—framed either as research challenges or specific research hypotheses—related to each stage in the value chain and often involved in understanding consumer movie-going behavior.

    SlowCoach: Mutating Code to Simulate Performance Bugs

    Performance bugs are unnecessarily inefficient code chunks in software codebases that cause prolonged execution times and degraded computational resource utilization. For performance bug diagnostics, tools that aid in the identification of said bugs, such as benchmarks and profilers, are commonly employed. However, due to factors such as insufficient workloads or ineffective benchmarks, software defects related to code inefficiencies are inherently difficult to diagnose. Hence, the capabilities of performance bug diagnostic tools are limited and performance bug instances may be missed. Traditional mutation testing (MT) is a technique for quantifying a test suite’s ability to find functional bugs by mutating the code of the test subject. Similarly, we adopt performance mutation testing (PMT) to evaluate performance bug diagnostic tools and identify where improvements need to be made to a performance testing methodology. We carefully investigate the different performance bug fault models and how synthesized performance bugs based on these models can evaluate benchmarks and workload selection to help improve performance diagnostics. In this paper, we present the design of our PMT framework, SLOWCOACH, and evaluate it with over 1600 mutants from 4 real-world software projects.
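The core PMT loop can be illustrated with a toy mutant. Everything below (the mutation, the functions, the 2x kill threshold) is invented for illustration; SLOWCOACH derives mutation operators from real performance-bug fault models and applies them to real codebases. The point is the workflow: a mutation must preserve semantics while degrading performance, and a benchmark "kills" the mutant only if it notices the slowdown.

```python
import time

def original(n):
    return sum(range(n))           # efficient reference implementation

def mutant(n):
    # Performance mutation: same result, but with redundant inner work
    # injected (a toy stand-in for an inefficiency fault model).
    total = 0
    for i in range(n):
        for _ in range(50):        # wasted iterations
            pass
        total += i
    return total

def benchmark(fn, n=200_000):
    start = time.perf_counter()
    result = fn(n)
    return result, time.perf_counter() - start

ref_result, ref_time = benchmark(original)
mut_result, mut_time = benchmark(mutant)

assert ref_result == mut_result    # the mutation preserves semantics
killed = mut_time > 2 * ref_time   # does the benchmark detect the bug?
print("mutant killed:", killed)
```

A benchmark or workload that fails to kill such mutants is exactly what PMT is designed to expose.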

    Understanding Performance Inefficiencies In Native And Managed Languages

    Production software packages have become increasingly complex with millions of lines of code, sophisticated control and data flow, and references to a hierarchy of external libraries. This complexity often introduces performance inefficiencies across software stacks, making it practically impossible for users to pinpoint them manually. Performance profiling tools (a.k.a. profilers) abound in the tools community to aid software developers in understanding program behavior. Classical profiling techniques focus on identifying hotspots. Hotspot analysis is indispensable; however, it can hardly diagnose whether a resource is being used in a productive manner that contributes to the overall efficiency of a program. Consequently, a significant burden falls on developers to make a judgment call on whether there is scope to optimize a hotspot. Derived metrics, e.g., cache miss ratio, offer slightly better intuition into hotspots but are still not panaceas. Hence, there is a need for profilers that investigate resource wastage instead of usage. To overcome the critical missing pieces in prior work and complement existing profilers, we propose novel fine- and coarse-grained profilers to pinpoint varieties of performance inefficiencies and provide optimization guidance for a wide range of software covering benchmarks, enterprise applications, and large-scale parallel applications running on supercomputers and data centers. Fine-grained profilers are indispensable to understand performance inefficiencies comprehensively. We propose a whole-program profiler called LoadSpy, which works on binary executables to detect and quantify wasteful memory operations in their context and scope.
Our observation, which is justified by myriad case studies, is that wasteful memory operations are often an indicator of various forms of performance inefficiencies, such as suboptimal choices of algorithms or data structures, missed compiler optimizations, and developers’ inattention to performance. Guided by LoadSpy, we are able to optimize a large number of well-known benchmarks and real-world applications, yielding significant speedups. Despite the deep performance insights offered by fine-grained profilers, their high overhead keeps them from widespread adoption, particularly in production. By contrast, coarse-grained profilers introduce low overhead at the cost of poor performance insights. Hence, another research question is how to combine the deep insights of fine-grained profilers with the low overhead of coarse-grained ones. Our first effort in this direction is JXPerf, a lightweight profiler that abandons heavyweight instrumentation by combining hardware performance monitoring units and debug registers available in commodity CPUs to detect wasteful memory operations. Compared with LoadSpy, JXPerf reduces the runtime overhead from 10x to 7% on average. Its lightweight nature makes it useful in production. Another effort is FVSampler, the first nonintrusive profiler to study function execution variance.
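The wasteful memory operations these profilers target can be modeled in miniature. The sketch below is a toy with invented names that simulates memory as a dictionary rather than monitoring real loads and stores: a "silent store" writes the value already present at an address, and a redundant load re-reads a value that has not changed since the last load.

```python
memory = {}           # simulated address space
last_load = {}        # addr -> value seen at the most recent load
silent_stores = 0
redundant_loads = 0

def store(addr, value):
    global silent_stores
    if memory.get(addr) == value:
        silent_stores += 1          # write changes nothing: wasteful
    memory[addr] = value
    last_load.pop(addr, None)       # a store invalidates load history

def load(addr):
    global redundant_loads
    value = memory[addr]
    if last_load.get(addr) == value:
        redundant_loads += 1        # re-reading an unchanged value
    last_load[addr] = value
    return value

store(0x10, 7)
store(0x10, 7)        # silent store: 7 was already there
load(0x10)
load(0x10)            # redundant load: value unchanged since last read
print(silent_stores, redundant_loads)   # 1 1
```

A binary-level profiler performs this bookkeeping for actual load/store instructions and attributes each wasteful pair to source lines and calling contexts, which is where the overhead, and the value of PMU/debug-register sampling, comes from.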