20,272 research outputs found
Modeling interactive memex-like applications based on self-modifiable petri nets
This paper introduces an interactive Memex-like application using a self-modifiable Petri net model, the Self-modifiable Color Petri Net (SCPN). The Memex (“memory extender”) device proposed by Vannevar Bush in 1945 addressed the problems of “locating relevant information in the published records and recording how that information is intellectually connected.” Its key features include associative indexing and retrieval. In this paper, the self-modification capabilities of SCPN are used to achieve trail recording and retrieval: a place in an SCPN represents a website, and an arc indicates the trail direction. Each time a new website is visited, a place corresponding to that website is added. Once a trail is built, users can use it to retrieve the websites they have visited. In addition, SCPN supports the user interactions needed for Memex functions, including forward, backward, history, and search. A simulator has been built to demonstrate that the SCPN model can realize Memex functions; Petri net instances modeling trail recording, back, and forward operations can be designed in this simulator. Furthermore, a client-server application system has been built, with which a user can surf online, record surfing histories on the server by topic, and share them with other users.
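As an illustration only (this is not the authors' SCPN simulator; all names below are hypothetical), the trail-recording idea, with a place added for each visited site, arcs giving the trail direction, and back/forward navigation over the trail, can be sketched in Python:

```python
class Trail:
    """Toy model of a self-modifying browsing trail (places + arcs)."""

    def __init__(self):
        self.places = []      # one "place" per visited site
        self.arcs = []        # (from_site, to_site) pairs: trail direction
        self.position = -1    # index of the current place on the trail

    def visit(self, site):
        """Self-modification step: add a place (and an arc) for a new site."""
        if self.position >= 0:
            self.arcs.append((self.places[self.position], site))
        self.places.append(site)
        self.position = len(self.places) - 1

    def back(self):
        """Move one step back along the trail, if possible."""
        if self.position > 0:
            self.position -= 1
        return self.places[self.position]

    def forward(self):
        """Move one step forward along the trail, if possible."""
        if self.position < len(self.places) - 1:
            self.position += 1
        return self.places[self.position]
```

A visit grows the net (a new place and arc), while back/forward only move the current position, mirroring how the SCPN trail is built once and then navigated.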
Genetic Markers as Instrumental Variables
The use of genetic markers as instrumental variables (IV) is receiving increasing attention from epidemiologists, economists, statisticians and social scientists. This paper examines the conditions that need to be met for genetic variants to be used as instruments. Although these have been discussed in the epidemiological, medical and statistical literature, they have not been well defined in the economics and social science literature. The increasing availability of biomedical data, however, makes understanding these conditions crucial to the successful use of genotypes as instruments for modifiable risk factors. We combine the econometric IV literature with that from genetic epidemiology using a potential outcomes framework, and review the IV conditions in the context of a social science application: examining the effect of child fat mass on academic performance.

Keywords: ALSPAC; Fat mass; Genetic Variants; Instrumental Variables; Mendelian Randomization; Potential Outcomes
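As a minimal illustration of the estimator underlying this kind of analysis (not the paper's own method or data), the ratio (Wald) IV estimate with a single instrument is cov(z, y) / cov(z, x). The variable names below, genotype z, exposure x, outcome y, are hypothetical:

```python
def wald_iv(z, x, y):
    """Ratio (Wald) IV estimate of the effect of x on y, instrumented by z:
    cov(z, y) / cov(z, x)."""
    n = len(z)
    mz, mx, my = sum(z) / n, sum(x) / n, sum(y) / n
    cov_zy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y)) / n
    cov_zx = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x)) / n
    return cov_zy / cov_zx
```

The point of the construction is that a confounder correlated with both x and y biases ordinary regression but cancels in this ratio, provided the instrument is independent of the confounder, which is one of the IV conditions the paper reviews.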
The development of social class-sensitive proxies for infant mortality at the PCT level: An appraisal of candidate indicators for the Commission for Health Improvement
The main aim of this work is to identify social class-sensitive proxies for infant mortality at Primary Care Trust (PCT) level that could be used in the CHI performance ratings process for PCTs in 2003/4.
Matching Dependencies with Arbitrary Attribute Values: Semantics, Query Answering and Integrity Constraints
Matching dependencies (MDs) were introduced to specify the identification or matching of certain attribute values in pairs of database tuples when some similarity conditions are satisfied. Their enforcement can be seen as a natural generalization of entity resolution. In what we call the "pure case" of MDs, any value from the underlying data domain can be used for the value in common that does the matching. We investigate the semantics and properties of data cleaning through the enforcement of matching dependencies for the pure case. We characterize the intended clean instances, and also the "clean answers" to queries as those that are invariant under the cleaning process. The complexity of computing clean instances and clean answers to queries is investigated. Tractable and intractable cases depending on the MDs and queries are identified. Finally, we establish connections with database "repairs" under integrity constraints.

Comment: 13 pages, double column, 2 figures
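A toy sketch of enforcing a single MD may help fix ideas (the relation, attributes, and similarity predicate below are hypothetical, not from the paper): if two tuples have similar Name values, their Phone values must be identified. In the pure case the common value may be any domain value; here it is simply taken from one of the two tuples.

```python
def similar(a, b):
    """Toy similarity predicate: equal after lowercasing and
    stripping dots and spaces."""
    norm = lambda s: s.lower().replace(".", "").replace(" ", "")
    return norm(a) == norm(b)

def enforce_md(tuples):
    """Chase-style enforcement of one MD:
    similar(Name, Name) => matching Phone values.
    Repeats until no pair violates the dependency."""
    tuples = [dict(t) for t in tuples]
    changed = True
    while changed:
        changed = False
        for t1 in tuples:
            for t2 in tuples:
                if similar(t1["Name"], t2["Name"]) and t1["Phone"] != t2["Phone"]:
                    t2["Phone"] = t1["Phone"]  # pick t1's value as the common value
                    changed = True
    return tuples
```

The fixpoint reached is one clean instance; the paper's "clean answers" to a query are those answers invariant under all such cleaning sequences.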
Putting Iterative Proportional Fitting on the researcher’s desk
‘Iterative Proportional Fitting’ (IPF) is a mathematical procedure originally developed to combine the information from two or more datasets. IPF is a well-established technique with the theoretical and practical considerations behind the method thoroughly explored and reported.
In this paper the theory of IPF is investigated, with a mathematical definition of the procedure and a review of the relevant literature given. To make IPF readily accessible to researchers, the procedure has been automated in Visual Basic, and a description of the program and a ‘User Guide’ are provided.
IPF is employed in various disciplines but has been particularly useful in census-related analysis to provide updated population statistics and to estimate individual-level attribute characteristics. To illustrate the practical application of IPF, various case studies are described. In the future, demand for individual-level data is thought likely to increase, and it is believed that the IPF procedure and Visual Basic program have the potential to facilitate research in geography and other disciplines.
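Although the paper's implementation is in Visual Basic, the core procedure, alternately rescaling rows and columns of a seed table until both margins match their targets, fits in a few lines. The sketch below is a minimal two-dimensional version, assuming the target row and column totals are consistent (equal grand totals):

```python
def ipf(seed, row_targets, col_targets, tol=1e-8, max_iter=1000):
    """Iterative Proportional Fitting: rescale `seed` so its row and
    column sums match the given targets."""
    table = [row[:] for row in seed]
    for _ in range(max_iter):
        # Row step: scale each row to its target total.
        for i, target in enumerate(row_targets):
            s = sum(table[i])
            if s > 0:
                table[i] = [v * target / s for v in table[i]]
        # Column step: scale each column to its target total.
        for j, target in enumerate(col_targets):
            s = sum(row[j] for row in table)
            if s > 0:
                for i in range(len(table)):
                    table[i][j] *= target / s
        # Converged when row margins still match after the column step.
        if all(abs(sum(table[i]) - t) < tol
               for i, t in enumerate(row_targets)):
            return table
    return table
```

Each step preserves the margins fixed by the previous step better and better; for a strictly positive seed and consistent targets the iteration converges, preserving the seed's interaction structure (odds ratios) while matching the new marginals.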
Targeting brain, body and heart for cognitive health and dementia prevention
This report looks into current research on dementia and Alzheimer's disease prevention and offers ideas for possible future solutions. Prevention of dementia is the ultimate aim of a large, albeit under-resourced, international research effort. The success of this effort would have enormous benefits for millions of people and save billions of dollars in health care costs. Conversely, the status quo will see the number of Australians living with dementia soar in coming years. Many more people will experience and seek help for mild cognitive impairment. There are many different forms of dementia, a syndrome caused by brain disease and characterised by declining cognitive function that impairs daily activities.
Dementia can affect memory, language, attention, judgement, planning, behaviour, mood and personality. Mild cognitive impairment does not significantly impair daily activities, but often represents an earlier stage of cognitive decline. There is no cure for the common forms of cognitive decline and dementia, including the most common, Alzheimer’s disease. A cure may only be achieved by prevention, because the diseases that cause dementia begin many years before symptoms become apparent and gradually damage the brain until it can no longer function normally. Intervening early to stop or slow disease progression, before cognitive impairment emerges, offers the best hope of preventing dementia.
Is this achievable? It requires breakthroughs in early detection and intervention. New diagnostic technologies have been developed that can detect the presence of abnormal protein accumulations in the brain that characterise Alzheimer’s disease. The disease can now be detected by brain scans or cerebrospinal fluid tests in the preclinical stage, before any cognitive changes occur.
Programmable Trigger Logic Unit Based on FPGA Technology
A programmable trigger logic module (TRILOMO) was implemented successfully in an FPGA, using its internal look-up tables to store Boolean functions. Up to 16 trigger input signals can be combined logically for a fast trigger decision. The novel feature is that the trigger decision is defined through VME registers, so changes can be made without modifying the FPGA code. Additionally, the module provides excellent signal delay adjustment.

Comment: 4 pages, 4 figures
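A small software model can illustrate the register-programmable look-up-table idea; this is a hypothetical sketch, not the TRILOMO firmware: entries written at run time stand in for VME register writes, and a trigger decision is a single table lookup on the 16-bit input pattern.

```python
class TriggerLUT:
    """Software model of a look-up-table trigger: the decision for every
    input pattern is stored in a table that can be rewritten at run time,
    with no change to the logic that evaluates it."""

    def __init__(self, n_inputs=16):
        self.n_inputs = n_inputs
        self.mask = (1 << n_inputs) - 1
        self.table = bytearray(1 << n_inputs)  # one decision bit per pattern

    def write_register(self, pattern, decision):
        """Program one LUT entry (stand-in for a VME register write)."""
        self.table[pattern & self.mask] = 1 if decision else 0

    def trigger(self, inputs):
        """Fast decision: a single table lookup on the input pattern."""
        return bool(self.table[inputs & self.mask])
```

The point the abstract makes carries over: reprogramming the trigger condition is a data write (here, `write_register`), not a change to the evaluating logic itself.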