Sperry Univac speech communications technology
Technology and systems for effective verbal communication with computers were developed. A continuous speech recognition system for verbal input, a word spotting system to locate key words in conversational speech, prosodic tools to aid speech analysis, and a prerecorded voice response system for speech output are described.
Semantic Complexity In Treatment Of Naming Deficits In Aphasia: Evidence From Well-Defined Categories
Purpose: Our previous work on manipulating typicality of category exemplars during treatment of naming deficits has shown that training atypical examples generalizes to untrained typical examples but not vice versa. In contrast to natural categories, which have fuzzy boundaries, well-defined categories (e.g., shapes) have rigid category boundaries. Whether these categories exhibit typicality effects similar to natural categories is under debate. The present study addressed this question in the context of treatment for naming deficits in aphasia. Methods: Using a single-subject experimental design, 3 participants with aphasia received a semantic feature treatment to improve naming of either typical or atypical items of shapes, while generalization was tested to untrained items of the category. Results: For 2 of the 3 participants, training naming of atypical examples of shapes resulted in improved naming of untrained typical examples. Training typical examples in 1 participant did not improve naming of atypical examples. All 3 participants, however, showed weak acquisition trends. Conclusions: Results of the present study show equivocal support for manipulating typicality as a treatment variable within well-defined categories. Instead, these results indicate that acquisition and generalization effects within well-defined categories such as shapes are overshadowed by their inherent abstractness.
Improving DoD Energy Efficiency: Combining MMOWGLI Social-Media Brainstorming With Lexical Link Analysis (LLA) to Strengthen the Defense Acquisition Process
Disclaimer: The views represented in this report are those of the authors and do not reflect the official policy or position of the Navy, the Department of Defense, or the federal government.
Excerpt from the Proceedings of the Tenth Annual Acquisition Research Symposium, Logistics Management.
The research presented in this report was supported by the Acquisition Research Program of the Graduate School of Business & Public Policy at the Naval Postgraduate School. To request defense acquisition research, to become a research sponsor, or to print additional copies of reports, please contact any of the staff listed on the Acquisition Research Program website (www.acquisitionresearch.net).
Prepared for the Naval Postgraduate School, Monterey, CA 93943. Approved for public release; distribution is unlimited.
An Efficient Implementation of the Head-Corner Parser
This paper describes an efficient and robust implementation of a
bi-directional, head-driven parser for constraint-based grammars. This parser
is developed for the OVIS system: a Dutch spoken dialogue system in which
information about public transport can be obtained by telephone.
After a review of the motivation for head-driven parsing strategies, and
head-corner parsing in particular, a non-deterministic version of the
head-corner parser is presented. A memoization technique is applied to obtain a
fast parser. A goal-weakening technique is introduced which greatly improves
average case efficiency, both in terms of speed and space requirements.
I argue in favor of such a memoization strategy with goal-weakening in
comparison with ordinary chart-parsers because such a strategy can be applied
selectively and therefore enormously reduces the space requirements of the
parser, while no practical loss in time-efficiency is observed. On the
contrary, experiments are described in which head-corner and left-corner
parsers implemented with selective memoization and goal weakening outperform
`standard' chart parsers. The experiments include the grammar of the OVIS
system and the Alvey NL Tools grammar.
Head-corner parsing is a mix of bottom-up and top-down processing. Certain
approaches towards robust parsing require purely bottom-up processing.
Therefore, it seems that head-corner parsing is unsuitable for such robust
parsing techniques. However, it is shown how underspecification (which arises
very naturally in a logic programming environment) can be used in the
head-corner parser to allow such robust parsing techniques. A particular robust
parsing model is described which is implemented in OVIS.
Comment: 31 pages, uses cl.st
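The selective memoization with goal weakening described in this abstract can be sketched in miniature. The sketch below is an illustrative assumption, not the OVIS or head-corner implementation: the toy grammar, lexicon, `weaken` rule, and `subsumes` check are all invented for the example. The key idea it demonstrates is that memo entries are keyed on a *weakened* goal (feature annotations dropped), so differently specified goals share a single table entry, and cached results are filtered against the full goal on retrieval.

```python
def weaken(goal):
    """Drop feature annotations so more goals share one memo entry,
    e.g. 'np(sg)' and 'np(pl)' both weaken to 'np'. (Assumed weakening rule.)"""
    return goal.split('(')[0]

def subsumes(goal, cat):
    """A fully specified goal must match exactly; an already-weakened
    goal subsumes any category that weakens to it."""
    return goal == cat or weaken(goal) == goal == weaken(cat)

class Parser:
    def __init__(self, grammar, lexicon):
        self.grammar = grammar    # category -> list of right-hand sides
        self.lexicon = lexicon    # word -> set of lexical categories
        self.memo = {}            # (weakened goal, position) -> {(category, end)}

    def parse(self, goal, words, i):
        """Return end positions j such that words[i:j] derives a
        category subsuming `goal`. Memoized under the weakened goal only."""
        key = (weaken(goal), i)
        if key not in self.memo:
            self.memo[key] = results = set()
            # lexical step
            if i < len(words):
                for cat in self.lexicon.get(words[i], ()):
                    if weaken(cat) == key[0]:
                        results.add((cat, i + 1))
            # rule step: expand only rules whose LHS weakens to the key
            for lhs, rhss in self.grammar.items():
                if weaken(lhs) != key[0]:
                    continue
                for rhs in rhss:
                    for end in self._seq(rhs, words, i):
                        results.add((lhs, end))
        # filter the shared cache entry against the *full* goal
        return {j for cat, j in self.memo[key] if subsumes(goal, cat)}

    def _seq(self, cats, words, i):
        """Parse a sequence of daughter categories left to right."""
        if not cats:
            yield i
            return
        for j in self.parse(cats[0], words, i):
            yield from self._seq(cats[1:], words, j)

# toy grammar and lexicon (illustrative assumptions, not the OVIS grammar)
grammar = {
    's':      [['np(sg)', 'vp(sg)']],
    'np(sg)': [['det', 'n(sg)']],
    'vp(sg)': [['v(sg)']],
}
lexicon = {'the': {'det'}, 'train': {'n(sg)'}, 'departs': {'v(sg)'}}

p = Parser(grammar, lexicon)
spans = p.parse('s', ['the', 'train', 'departs'], 0)  # full parse iff len(words) is returned
```

A later query for the weakened goal `'np'` at position 0 hits the entry already computed for `'np(sg)'` rather than creating a new one; that sharing of table entries under weakened goals is the space saving the abstract attributes to goal weakening.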
Proceedings of the 3rd Workshop on Domain-Specific Language Design and Implementation (DSLDI 2015)
The goal of the DSLDI workshop is to bring together researchers and
practitioners interested in sharing ideas on how DSLs should be designed,
implemented, supported by tools, and applied in realistic application contexts.
We are interested both in discovering how already established domains such as graph
processing or machine learning can best be supported by DSLs, and in
exploring new domains that could be targeted by DSLs. More generally, we are
interested in building a community that can drive forward the development of
modern DSLs. These informal post-proceedings contain the submitted talk
abstracts to the 3rd DSLDI workshop (DSLDI'15), and a summary of the panel
discussion on language composition.