
    Are Rules a Thing of the Past? The Acquisition of Verbal Morphology by an Attractor Network

    This paper investigates the ability of a connectionist attractor network to learn a system analogous to part of the system of English verbal morphology. The model learned to produce phonological representations of stems and inflected forms in response to semantic inputs, and it was able to resolve several outstanding problems. It displayed all three stages of the characteristic U-shaped pattern of acquisition of the English past tense (early correct performance, a period of overgeneralizations and other errors, and eventual mastery). The network was also able to simulate direct access (the ability to create an inflected form directly from a semantic representation without first accessing an intermediate base form). The model easily resolved homophonic verbs (such as ring and wring). In addition, the network was able to apply the past tense, third-person -s, and progressive -ing suffixes productively to novel forms and to display sensitivity to the subregularities that mark families of irregular past-tense forms. The network also simulates the frequency-by-regularity interaction that has been found in reaction-time studies of human subjects and provides a possible explanation for some hypothesized universal constraints upon morphological operations.
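
    The core computational idea can be illustrated with a toy settling loop. The sketch below shows a generic attractor update of the kind this model family uses (a semantic input drives a recurrent phonological layer toward a stable state); the layer sizes, weights, and settling rule are invented for illustration and are not the paper's actual architecture.

```python
# Minimal sketch of attractor-network settling (semantic -> phonological).
# All names, dimensions, and the settling rule are illustrative assumptions,
# not the paper's reported model.
import numpy as np

rng = np.random.default_rng(0)
N_SEM, N_PHON = 50, 30            # hypothetical layer sizes

W_in = rng.normal(scale=0.1, size=(N_PHON, N_SEM))    # semantic -> phonological
W_rec = rng.normal(scale=0.1, size=(N_PHON, N_PHON))  # recurrent (attractor) weights
np.fill_diagonal(W_rec, 0.0)                          # no self-connections

def settle(semantics, steps=50, tau=0.2):
    """Let the phonological layer settle toward an attractor state."""
    p = np.zeros(N_PHON)
    external = W_in @ semantics           # constant external drive
    for _ in range(steps):
        net = external + W_rec @ p
        target = 1.0 / (1.0 + np.exp(-net))  # logistic activation
        p += tau * (target - p)              # gradual settling
    return p

semantics = rng.integers(0, 2, size=N_SEM).astype(float)  # toy semantic code
phonology = settle(semantics)
print(np.round(phonology, 2))
```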

    A Computational Study: The Effect of Hypersonic Plasma Sheaths on Radar Cross Section for Over the Horizon Radar

    In this study, radar cross sections were calculated for an axially symmetric 6-degree half-angle blunted cone with a nose radius of 2.5 cm and a length of 3.5 m, both including and excluding the effects of an atmospheric hypersonic plasma sheath, for altitudes of 40 km, 60 km, and 80 km and speeds of 5 km/s, 6 km/s, and 7 km/s. LAURA was used to determine the plasma characteristics for the hypersonic flight conditions using an 11-species, 2-temperature chemical model. Runs were first performed with a super-catalytic surface boundary condition without a turbulence model, and then, for some cases, with a non-reactive surface boundary condition using a Menter SST turbulence model. The resulting plasma sheath properties were used in a Finite Difference Time Domain (FDTD) code to calculate the cone's radar cross section both with and without the effects of the plasma sheath. The largest increase in radar cross section (RCS) was found for the 60 km, 7 km/s case, with an increase of 3.84%. A possible small decrease in RCS, on the order of 0.1%, was found for the 40 km, 5 km/s and 80 km, 7 km/s cases.
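
    The physical mechanism behind the RCS change is the dispersive permittivity that the plasma sheath presents to the incident wave. Below is a minimal sketch of the cold-plasma (Drude) permittivity a dispersive FDTD code would assign to sheath cells; the electron density and collision frequency are placeholder values, not outputs of the study's LAURA solutions.

```python
# Hedged sketch: complex Drude permittivity of a collisional cold plasma.
# The sheath parameters below are placeholders chosen only to illustrate
# the sub-plasma-frequency regime relevant to over-the-horizon radar.
import numpy as np

E_CHARGE = 1.602e-19   # electron charge [C]
E_MASS = 9.109e-31     # electron mass [kg]
EPS0 = 8.854e-12       # vacuum permittivity [F/m]

def plasma_permittivity(n_e, nu, f):
    """Relative permittivity of a cold collisional plasma (Drude model).

    n_e : electron number density [m^-3]
    nu  : electron collision frequency [rad/s]
    f   : radar frequency [Hz]
    """
    omega = 2.0 * np.pi * f
    omega_p2 = n_e * E_CHARGE**2 / (EPS0 * E_MASS)  # plasma frequency squared
    return 1.0 - omega_p2 / (omega * (omega - 1j * nu))

# Placeholder sheath parameters at an HF (OTH-radar-band) frequency
eps_r = plasma_permittivity(n_e=1e17, nu=1e9, f=15e6)
print(eps_r)  # Re < 0 when the wave is below the plasma frequency
```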

    Rules Are Made to Be Broken: How the Process of Expedited Removal Fails Asylum Seekers

    Immigration inspectors are authorized to deport persons who arrive at U.S. ports without valid travel documents. This process, which usually occurs within 48 hours and does not allow for judicial review, is called expedited removal. This article begins by summarizing the findings of the few studies allowed access to the process. The authors extrapolate from the studies to demonstrate that thousands of genuine asylum seekers have been erroneously deported via expedited removal. The greatest cause of erroneous deportation is a failure by the agency responsible for the process, Customs and Border Protection (CBP), to follow its own rules. The heart of the article is a simple inquiry: given the stakes involved, why doesn’t CBP follow its own rules? One report found CBP’s failure “simply inexplicable.” Drawing on the work of Jerry Mashaw, among others, the article attempts to “explain the inexplicable.” It demonstrates that a mix of bureaucratic and personal realities, including CBP’s dominant enforcement culture, combines to promote noncompliance with many of the rules intended to protect asylees. This showing has important implications for those who would repair the system. CBP’s enforcement culture is likely to defeat any attempt to ensure compliance with the rules simply by reiterating them. A method is needed for moderating the culture, so that deporting the wrong person becomes as unacceptable in the future as admitting the wrong person is right now. The article closes with numerous suggestions to help achieve the required change, as well as a few recommendations for specific rule changes.

    Visualizing Domain Coherence: Social Informatics as a Case Study

    Using bibliometric methods of inquiry, one of the eleven approaches to domain-analytic research offered by Hjørland (2002), we were able to visualize the emerging field of social informatics (SI). Past research has demonstrated the breadth and depth of bibliometric tools by investigating a variety of research communities (White and Griffith 1981; Tsay 1989; McCain 1991; Borgman and Rice 1992; White and McCain 1998; Smiraglia 2006, 2009; Moore 2007; Jank 2010). Analysis of the published literature produced in SI from 1997 through 2009 allowed visualization of domain coherence in SI. Concepts used to measure domain coherence include the number of ideas espoused (Collins 1998, 42), scholarly productivity (Crane 1972), and the number of scholars participating (Price 1986; Collins 1998). In this lightning paper, based on a recent dissertation (Hoeffner 2012), we present a visualization based on the analysis of social informatics’ literature, showing growth in publication productivity, evidence of intellectually and socially connected scholars, reliance on scholars from within the fields of information science and computer science, and two or three topical areas of interest pertaining to communication and aspects of computer mediation, as well as policy and access. Discourse among scholars was evident, and although the Journal of the American Society for Information Science & Technology overwhelmingly published the largest number of SI works, there was representation in a few other key journals. Author co-citation patterns revealed a core group of scholars commonly cited together, and an investigation of self-citation practices revealed slightly less evidence of self-citation among SI’s most prolific authors than in science overall. Recently, Smiraglia (2012) defined a domain as “a group with an ontological base that reveals an underlying teleology, a set of common hypotheses, epistemological consensus on methodological approaches, and social semantics.” Our visualization demonstrates continued coherence of SI as a domain.
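
    For readers unfamiliar with the mechanics, author co-citation counting of the kind used here reduces to tallying how often pairs of authors appear together in the same reference list. The sketch below uses toy reference lists, not the study's SI corpus.

```python
# Illustrative author co-citation count. The reference lists are toy data
# chosen only to show the counting step behind a co-citation map.
from itertools import combinations
from collections import Counter

# Each inner set: cited authors appearing in one paper's reference list
reference_lists = [
    {"White", "McCain", "Borgman"},
    {"White", "McCain", "Smiraglia"},
    {"Smiraglia", "Borgman", "White"},
]

cocitation = Counter()
for refs in reference_lists:
    for pair in combinations(sorted(refs), 2):
        cocitation[pair] += 1  # the pair was cited together in one paper

for (a, b), n in cocitation.most_common(3):
    print(f"{a} -- {b}: co-cited in {n} papers")
```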

    Review of Graph Comprehension Research: Implications for Instruction

    Graphs are commonly used in textbooks and educational software, and can help students understand science and social science data. However, students sometimes have difficulty comprehending information depicted in graphs. What makes a graph better or worse at communicating relevant quantitative information? How can students learn to interpret graphs more effectively? This article reviews the cognitive literature on how viewers comprehend graphs and the factors that influence viewers' interpretations. Three major factors are considered: the visual characteristics of a graph (e.g., format, animation, color, use of legend, size, etc.), a viewer's knowledge about graphs, and a viewer's knowledge and expectations about the content of the data in a graph. This article provides a set of guidelines for the presentation of graphs to students and considers the implications of graph comprehension research for the teaching of graphical literacy skills. Finally, this article discusses unresolved questions and directions for future research relevant to data presentation and the teaching of graphical literacy skills.

    Perfusion computed tomography relative threshold values in definition of acute stroke lesions

    BACKGROUND: Perfusion computed tomography (CT) is a relatively new technique that allows fast evaluation of cerebral hemodynamics by providing perfusion maps and confirmation of perfusion deficits in ischemic areas. Some controversy exists regarding the accuracy of quantitative detection of tissue viability: penumbra (tissue at risk) or core (necrosis). PURPOSE: To define brain tissue viability grade on the basis of perfusion CT parameters in acute stroke patients. MATERIAL AND METHODS: A multimodal CT imaging protocol was performed: unenhanced CT of the brain and CT angiography of head and neck blood vessels, followed by brain perfusion CT and 24-h follow-up brain CT. Perfusion deficits were detected first visually, with subsequent manual quantitative and relative measurements in the affected and contralateral hemispheres, in 87 acute stroke patients. RESULTS: A visual perfusion deficit on perfusion CT images was found in 78 cases (38 women, 40 men; age range 30-84 years). Penumbra lesions (n = 49) and core lesions (n = 42) were detected by increased mean transit time (MTT) on perfusion CT maps in comparison with the contralateral hemisphere. Mean cerebral blood volume (CBV) values were increased in the penumbra group and decreased in the core group. Cerebral blood flow (CBF) values were decreased in penumbra lesions and markedly decreased in core lesions. CONCLUSION: Perfusion CT measurements are reliable for estimation of penumbra and core lesions in acute stroke patients if relative threshold values are used. The most accurate parameter of hypoperfusion is MTT increased above 190%. Relative threshold values for irreversible lesions are CBF
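
    The decision logic implied by the conclusion can be sketched as a voxelwise classifier: relative MTT above 190% flags hypoperfusion, and relative CBV then separates penumbra (preserved or raised blood volume) from core (reduced). In the sketch below, the CBV cutoff of 1.0 is a placeholder, since the abstract's actual threshold value is truncated in the source text.

```python
# Sketch of voxelwise penumbra/core classification from relative perfusion
# ratios. The MTT cutoff (190%) is the abstract's figure; the CBV cutoff
# is a placeholder assumption, not the study's reported threshold.
import numpy as np

def classify(rel_mtt, rel_cbv, mtt_cut=1.90, cbv_cut=1.00):
    """Return 0 = normal, 1 = penumbra, 2 = core for each voxel.

    rel_mtt, rel_cbv: ratios of affected to contralateral hemisphere values.
    """
    hypo = rel_mtt > mtt_cut              # hypoperfused tissue
    core = hypo & (rel_cbv < cbv_cut)     # blood volume has collapsed
    penumbra = hypo & ~core               # blood volume still maintained
    return np.select([core, penumbra], [2, 1], default=0)

rel_mtt = np.array([1.2, 2.5, 3.0])
rel_cbv = np.array([1.0, 1.3, 0.4])
print(classify(rel_mtt, rel_cbv))  # [0 1 2]
```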

    Spatiotemporal modeling of microbial metabolism

    Background: Microbial systems in which the extracellular environment varies both spatially and temporally are very common in nature and in engineering applications. While the use of genome-scale metabolic reconstructions for steady-state flux balance analysis (FBA), and extensions for dynamic FBA, is common, the development of spatiotemporal metabolic models has received little attention. Results: We present a general methodology for spatiotemporal metabolic modeling based on combining genome-scale reconstructions with fundamental transport equations that govern the relevant convective and/or diffusional processes in time- and spatially varying environments. Our solution procedure involves spatial discretization of the partial differential equation model, followed by numerical integration of the resulting system of ordinary differential equations with embedded linear programs using DFBAlab, a MATLAB code that performs reliable and efficient dynamic FBA simulations. We demonstrate our methodology by solving spatiotemporal metabolic models for two systems of considerable practical interest: (1) a bubble column reactor with the syngas-fermenting bacterium Clostridium ljungdahlii; and (2) a chronic wound biofilm with the human pathogen Pseudomonas aeruginosa. Despite the complexity of the discretized models, which consist of 900 ODEs/600 LPs and 250 ODEs/250 LPs, respectively, we show that the proposed computational framework allows efficient and robust model solution. Conclusions: Our study establishes a new paradigm for formulating and solving genome-scale metabolic models with both time and spatial variations and has wide applicability to natural and engineered microbial systems.
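
    A stripped-down version of this solution procedure is sketched below: two spatial nodes stand in for the discretized transport equations (method of lines), and a tiny linear program is solved inside every right-hand-side evaluation, as in dynamic FBA. The two-variable toy "network" and all rate constants are invented for illustration; the paper itself couples genome-scale reconstructions through DFBAlab in MATLAB.

```python
# Toy spatiotemporal dFBA: ODE integration with an embedded LP per node.
# The kinetics, yield, and two-node "column" are assumptions for
# illustration, not the paper's genome-scale setup.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import linprog

N = 2                  # spatial nodes after discretization
D = 0.1                # substrate exchange rate between nodes (1/h)
VMAX, KM = 10.0, 0.5   # Monod uptake kinetics (toy values)
Y = 0.1                # biomass yield per unit substrate (toy value)

def fba_growth(s):
    """Tiny embedded LP: maximize growth rate given the uptake bound."""
    ub = VMAX * s / (KM + s)            # substrate-limited uptake bound
    # variables: [v_uptake, mu]; constraint mu - Y*v_uptake = 0
    res = linprog(c=[0.0, -1.0],
                  A_eq=[[-Y, 1.0]], b_eq=[0.0],
                  bounds=[(0.0, ub), (0.0, None)])
    return res.x[1], res.x[0]           # (growth rate, uptake flux)

def rhs(t, y):
    X, S = y[:N], y[N:]                 # biomass and substrate per node
    dX, dS = np.zeros(N), np.zeros(N)
    for i in range(N):
        mu, v = fba_growth(max(S[i], 0.0))
        dX[i] = mu * X[i]
        dS[i] = -v * X[i]
    dS[0] += D * (S[1] - S[0])          # discretized transport term
    dS[1] += D * (S[0] - S[1])
    return np.concatenate([dX, dS])

y0 = np.array([0.01, 0.01, 10.0, 2.0])  # biomass, then substrate per node
sol = solve_ivp(rhs, (0.0, 10.0), y0, max_step=0.1)
print(sol.y[:, -1])
```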

    Characterization, renovation, and utilization of water from slurry transport systems

    The transportation of a number of commodities as water slurries in pipelines offers several advantages that will make this method of transport more popular in coming years. Foremost among these advantages are high reliability, low operating costs, minimum environmental disruption, and the ability to operate with nonpetroleum energy resources. Although coal is the most frequently mentioned candidate for slurry transport, other commodities, including minerals, wood chips, and even solid refuse, may be moved in this manner. Water used as a slurry transport medium must be properly characterized, renovated, and used in order to make slurry transport environmentally and economically acceptable.
    Project # B-145-MO, Agreement # 14-34-0001-121

    CHARACTERIZATION OF PLUTONIUM CONTAMINATED SOILS FROM THE NEVADA TEST SITE IN SUPPORT OF EVALUATION OF REMEDIATION TECHNOLOGIES

    The removal of plutonium from Nevada Test Site (NTS) area soils has previously been attempted using various combinations of attrition scrubbing, size classification, gravity-based separation, flotation, air flotation, segmented gate, bioremediation, magnetic separation, and vitrification. Results were less than encouraging, but the processes were not fully optimized. To support additional vendor treatability studies, soils from the Clean Slate II site (located on the Tonopah Test Range, north of the NTS) were characterized and tested. These particular soils are contaminated primarily with plutonium-239/240 and Am-241. Soils were characterized for Pu-239/240, Am-241, and gross alpha. In addition, wet sieving and subsequent characterization were performed on soils before and after attrition scrubbing to determine the particle size distribution and the distribution of Pu-239/240 and gross alpha as a function of particle size. Sequential extraction was performed on untreated soil to provide information about how tightly bound the plutonium was to the soil. Magnetic separation was performed to determine whether it could be useful as part of a treatment approach. The results indicate that a volume reduction of contaminated soil of about 40% should be achievable by removing the >300 µm size fraction of the soil. Attrition scrubbing does not affect the particle size distribution, but it does result in a slight shift of the plutonium distribution toward the fines. As such, attrition scrubbing may slightly increase the ability to separate plutonium-contaminated particles from clean soil; this could add another 5-10% to the mass of the clean soil, bringing the total clean soil to 45-50%. Additional testing would be needed to determine the value of using attrition scrubbing as well as screening the soil through a sieve size slightly smaller than 300 µm; since only attrition scrubbing and wet sieving would be required, this investigation would be worthwhile. Magnetic separation did not work well. The sequential extraction studies indicated that a significant amount of plutonium was soluble in the "organic" and "resistant" extracts; chemical extraction based on these or similar extractants should therefore also be considered as a possible treatment approach.
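
    The clean-soil arithmetic in the abstract is a simple mass balance, sketched below using the abstract's own figures (about 40% of the soil in the >300 µm fraction, plus a 5-10% gain from attrition scrubbing).

```python
# Back-of-the-envelope mass balance behind the abstract's clean-soil
# estimate. The coarse-fraction share follows the abstract's ~40% figure;
# the scrubbing gain is the abstract's stated 5-10% range.
coarse_fraction = 0.40                         # >300 um fraction (clean)
scrub_gain_low, scrub_gain_high = 0.05, 0.10   # extra clean soil from scrubbing

clean_low = coarse_fraction + scrub_gain_low
clean_high = coarse_fraction + scrub_gain_high
print(f"Clean soil after sieving alone: {coarse_fraction:.0%}")
print(f"With attrition scrubbing: {clean_low:.0%} - {clean_high:.0%}")
```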