Perseverers, recencies and deferrers: new experimental evidence for multiple inference strategies in understanding
In the course of understanding a text, a succession of decision points arises at which readers face the task of choosing among alternative possible interpretations of that text. We present new experimental evidence that different readers use different inference strategies to guide their inference behavior during understanding. The choices available to an understander range from various alternative inferential paths to the option of making no inference at a particular point, leaving a 'loose end'. Different inference strategies result in observably different behaviors during understanding, including consistent differences in reading times and different interpretations of a text. The preliminary experimental results presented here consistently support a previously published set of hypotheses about the inference process that we have called Judgmental Inference theory.
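The three reader types named in the title can be sketched as decision rules at an ambiguous point in a text. This is a minimal illustration of the idea only; the function and strategy names are invented here and are not taken from the authors' work.

```python
# Hypothetical sketch of the three reader strategies named in the title:
# perseverers keep their first interpretation, recency readers switch to
# the newest alternative, and deferrers leave a "loose end" (no choice).
# All names and structures here are illustrative, not the authors' code.

def choose(strategy, current, alternatives):
    """Resolve one decision point during reading."""
    if strategy == "perseverer":
        return current                  # stick with the running interpretation
    if strategy == "recency":
        return alternatives[-1]         # adopt the most recently offered reading
    if strategy == "deferrer":
        return None                     # make no inference: a loose end
    raise ValueError(strategy)

# One ambiguous decision point, two candidate interpretations.
current = "literal"
alternatives = ["literal", "ironic"]
print(choose("perseverer", current, alternatives))  # literal
print(choose("recency", current, alternatives))     # ironic
print(choose("deferrer", current, alternatives))    # None
```

The observable differences the abstract mentions (reading times, final interpretations) would then follow from which rule a given reader applies at each decision point.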
Having Your Cake and Eating It Too: Autonomy and Interaction in a Model of Sentence Processing
Is the human language understander a collection of modular processes
operating with relative autonomy, or is it a single integrated process? This
ongoing debate has polarized the language processing community, with two
fundamentally different types of model posited, and with each camp concluding
that the other is wrong. One camp puts forth a model with separate processors
and distinct knowledge sources to explain one body of data, and the other
proposes a model with a single processor and a homogeneous, monolithic
knowledge source to explain the other body of data. In this paper we argue that
a hybrid approach which combines a unified processor with separate knowledge
sources provides an explanation of both bodies of data, and we demonstrate the
feasibility of this approach with the computational model called COMPERE. We
believe that this approach brings the language processing community
significantly closer to offering human-like language processing systems. Comment: 7 pages, uses aaai.sty macros
Parsing with parallelism: a spreading-activation model of inference processing during text understanding
The past decade of research in Natural Language Processing has universally recognized that, since natural language input is almost always ambiguous with respect to its pragmatic implications, its syntactic parse, and even its lexical analysis (i.e., the choice of the correct word sense for an ambiguous word), processing natural language input requires decisions about word meanings, syntactic structure, and pragmatic inferences. The lexical, syntactic, and pragmatic levels of inferencing are not as disparate as they have often been treated in both psychological and artificial intelligence research. In fact, these three levels of analysis interact to form a joint interpretation of text. ATLAST (A Three-level Language Analysis SysTem) is an implemented integration of human language understanding at the lexical, syntactic, and pragmatic levels. For psychological validity, ATLAST is based on the results of experiments with human subjects. The ATLAST model uses a new architecture developed to incorporate three features: spreading-activation memory, two-stage syntax, and parallel processing of syntax and semantics. It is also a new framework within which to interpret and tackle unsolved problems through implementation and experimentation.
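Spreading activation, the first of ATLAST's three architectural features, can be illustrated with a toy weighted network in which context words prime one sense of an ambiguous word. The network, link weights, and node names below are invented for illustration; they are not taken from the ATLAST implementation.

```python
# Minimal spreading-activation sketch in the spirit of ATLAST's memory
# component. A financial context word ("loan") raises activation on the
# money sense of the ambiguous word "bank" relative to the river sense.

network = {
    "bank":  {"money": 0.8, "river": 0.4},  # ambiguous word, two senses
    "money": {"loan": 0.7},
    "river": {"water": 0.7},
    "loan":  {"money": 0.6},
    "water": {},
}

def spread(seeds, steps=2, decay=0.5):
    """Propagate activation from seed nodes along weighted links."""
    activation = dict.fromkeys(network, 0.0)
    for s in seeds:
        activation[s] = 1.0
    for _ in range(steps):
        nxt = dict(activation)
        for node, links in network.items():
            for target, weight in links.items():
                nxt[target] += activation[node] * weight * decay
        activation = nxt
    return activation

# Context word "loan" primes the financial sense of "bank".
act = spread(["bank", "loan"])
print(act["money"] > act["river"])  # True: context selects the sense
```

In the full model this sense selection would run in parallel with the two-stage syntactic analysis, which is what the architecture's third feature refers to.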
STRATEGIST: a program that models strategy-driven and content-driven inference behavior
In the course of understanding a text, different readers use different inference strategies to guide their choice of interpretations of the events in the text. This is in contrast to previous computer models of understanding, which all use content-driven inference. The separate strategies are theorized to be composed of the same component inference processes, but with different rules for applying those processes. The use of different strategies occasionally results in different interpretations of the same text. We present new experimental data and a working computer program, called STRATEGIST, that models both strategy-driven and content-driven inference behavior. The rules that make up two of these strategies are presented.
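The abstract's central claim, that strategies share component inference processes but differ in the rules for applying them, can be sketched as shared functions driven by interchangeable rules. The component functions, strategy names, and scoring below are all invented for this sketch and are not STRATEGIST's actual rules.

```python
# Hypothetical illustration: content-driven inference always applies the
# shared components and picks the best-scoring reading, while a
# strategy-driven reader's rule decides whether and how to apply them.

def generate_alternatives(cue):          # shared component process
    return [f"{cue}/short", f"{cue}/much-longer"]

def score(reading):                      # shared component process
    return len(reading)                  # stand-in for a content-based score

def content_driven(cue):
    # Content-driven behavior: always infer, choosing the best reading.
    return max(generate_alternatives(cue), key=score)

def strategy_driven(cue, rule):
    # Strategy-driven behavior: the rule, not the content, governs
    # how the shared component processes are used.
    return rule(generate_alternatives(cue))

defer_rule = lambda alts: None           # leave a loose end
first_rule = lambda alts: alts[0]        # commit to the first alternative

print(content_driven("cue"))               # cue/much-longer
print(strategy_driven("cue", defer_rule))  # None
print(strategy_driven("cue", first_rule))  # cue/short
```

Swapping the rule while keeping the components fixed is what lets one program model several strategies, which is the modeling claim the abstract makes for STRATEGIST.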
The Impact of Differential Cost Sharing of Non-Steroidal Anti-Inflammatory Agents on the Use and Costs of Analgesic Drugs
OBJECTIVE: To estimate the effect of differential cost sharing (DCS) schemes for non-steroidal anti-inflammatory drugs (NSAIDs) on drug subsidy program and beneficiary expenditures. DATA SOURCES/STUDY SETTING: Monthly aggregate claims data from Pharmacare, the public drug subsidy program for seniors in British Columbia, Canada over the period 1989-11 to 2001-06. STUDY DESIGN: DCS limits insurance reimbursement of a group of therapeutically similar drugs to the cost of the lowest priced drugs, with beneficiaries responsible for costs above the reimbursement limit. Pharmacare introduced two different forms of DCS, generic substitution (GS) and reference pricing (RP), in April 1994 and November 1995, respectively, to the NSAIDs. Under GS, generic and brand versions of the same NSAID are considered interchangeable, whereas under RP different NSAIDs are. We extrapolated average reimbursement per day of NSAID therapy over the months before GS and RP to estimate what expenditures would have been without the policies. These counterfactual predictions were compared to actual values to estimate the impact of the policies; the estimated impacts on reimbursement rates were multiplied by the post-policy volume of NSAIDs dispensed, which appeared unaffected by the policies, to estimate expenditure changes. DATA COLLECTION: The cleaned NSAID claims data, obtained from Pharmacare’s databases, were aggregated by month and by their reimbursement status under the GS and RP policies. PRINCIPAL FINDINGS: After RP, program expenditures declined by 4 million annually, cutting expenditure by half. Most savings accrued from the substitution of low cost NSAIDs for more costly alternatives. About 20% of savings represented expenditures by seniors who elected to pay for partially-reimbursed drugs. GS produced one quarter the savings of RP. CONCLUSIONS: RP of NSAIDs achieved its goal of reducing drug expenditures and was more effective than GS.
The effects of RP on patient health and associated health care costs remain to be investigated. Keywords: Reference pricing; generic substitution; prescription drugs; drug cost containment; NSAIDs.
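The study design described above, extrapolating the pre-policy reimbursement rate, comparing it with the observed post-policy rate, and multiplying the gap by the (unaffected) dispensing volume, can be shown with a toy calculation. All numbers below are invented for illustration and are not the study's data.

```python
# Worked toy example of the counterfactual estimation approach:
# extrapolate the pre-policy reimbursement rate per day of NSAID therapy,
# compare it with the observed post-policy rate, and multiply the gap by
# the post-policy volume of therapy dispensed.

pre_policy_rates = [0.95, 0.97, 0.99, 1.01]  # $/day, trending upward (toy data)
post_policy_rate = 0.60                       # observed $/day after the policy
post_policy_days = 10_000_000                 # days of therapy dispensed

# Simple linear extrapolation of the pre-policy trend one period ahead.
step = (pre_policy_rates[-1] - pre_policy_rates[0]) / (len(pre_policy_rates) - 1)
counterfactual_rate = pre_policy_rates[-1] + step

savings = (counterfactual_rate - post_policy_rate) * post_policy_days
print(round(savings))  # estimated annual program savings (toy numbers)
```

The key identifying assumption, stated in the abstract, is that the policy left dispensing volume unchanged, so the entire expenditure change can be attributed to the change in the reimbursement rate.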
The structure of preserved information in quantum processes
We introduce a general operational characterization of information-preserving
structures (IPS) -- encompassing noiseless subsystems, decoherence-free
subspaces, pointer bases, and error-correcting codes -- by demonstrating that
they are isometric to fixed points of unital quantum processes. Using this, we
show that every IPS is a matrix algebra. We further establish a structure
theorem for the fixed states and observables of an arbitrary process, which
unifies the Schrödinger and Heisenberg pictures, places restrictions on
physically allowed kinds of information, and provides an efficient algorithm
for finding all noiseless and unitarily noiseless subsystems of the process.
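The fixed-point characterization can be checked numerically for the simplest unital process. The example below uses the single-qubit dephasing channel, chosen here for illustration (it is not an example from the paper): its fixed points are exactly the diagonal matrices, a commutative matrix algebra, consistent with the theorem that every information-preserving structure is a matrix algebra.

```python
import numpy as np

# Dephasing channel E(rho) = (rho + Z rho Z) / 2, a unital quantum process.
Z = np.diag([1.0, -1.0])

def dephase(rho):
    return 0.5 * (rho + Z @ rho @ Z)

# Unital: the channel preserves the identity.
I = np.eye(2)
print(np.allclose(dephase(I), I))                # True

# A diagonal (classical) state is a fixed point; off-diagonal
# coherences are not preserved.
diag = np.diag([0.7, 0.3])
coherent = np.array([[0.5, 0.5], [0.5, 0.5]])
print(np.allclose(dephase(diag), diag))          # True
print(np.allclose(dephase(coherent), coherent))  # False
```

Here the preserved information is the classical bit encoded in the diagonal, which is the pointer-basis case among the structures the abstract lists.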
Interaction Effects Between Word-Level and Text-Level Inferences: On-Line Processing of Ambiguous Words in Context
Opportunities and challenges of new product development and testing for longevity in clothing
Many types of clothing are now seen as disposable by consumers in the UK even though durability is among the top criteria that consumers claim to use when buying garments (WRAP, 2012). Routine tests for clothing performance carried out by retailers are generally designed to ensure garments are 'fit for purpose', not to establish durability or longevity. Designing clothing that lasts longer is, however, key to reducing waste and has become a government policy objective (Defra, 2011).
This paper discusses the findings from a recent research project, carried out for WRAP (Waste and Resources Action Programme), that investigated the opportunities for measuring, specifying and communicating aspects of clothing longevity within a Longevity Protocol. The Protocol is intended to enable retailers to obtain a reliable indication of garment life expectancy and was piloted in conjunction with clothing industry practitioners. It incorporates recommendations for best practice in product development and a testing regime that provides an indication of garment life expectancy (WRAP, 2014).
Overall, the findings from the pilot suggest that it is possible to test for garment longevity; however, this process can be drawn out and may not fit easily into the normal product development process. Furthermore, variations in consumer wearing patterns and laundering make it difficult for retailers to guarantee and communicate product lifetimes in absolute terms.
The research adds to a growing body of evidence that supports the concept of design for clothing longevity. The findings will help to inform strategies for the implementation of government policy on sustainable clothing, but point to the need for refined testing processes to support this agenda.