Toward the next generation of research into small area effects on health: a synthesis of multilevel investigations published since July 1998.
To map out research on area effects on health, this study had the following aims: (1) to inventory multilevel investigations of area effects on self-rated health, cardiovascular diseases and risk factors, and mortality among adults; (2) to describe and critically discuss the methodological approaches employed and results observed; and (3) to formulate selected recommendations for advancing the study of area effects on health. Overall, 86 studies were inventoried. Although several innovative methodological approaches and analytical designs were found, small areas are most often operationalised using administrative and statistical spatial units. Most studies used indicators of area socioeconomic status derived from censuses, and few provided information on the validity and reliability of measures of exposures. A consistent finding was that a significant portion of the variation in health is associated with area context independently of individual characteristics. Area effects on health, although significant in most studies, often depend on the health outcome studied, the measure of area exposure used, and the spatial scale at which associations are examined.
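Most of the inventoried studies rely on multilevel (random-intercept) regression to separate individual-level from area-level variation. As a rough illustration of that analytical setup (not drawn from any of the reviewed studies), the sketch below fits a random-intercept model to synthetic data; all variable names, covariates, and effect sizes are invented for the example.

```python
# A minimal sketch of a two-level random-intercept model: individuals nested
# within small areas, with an area-level variance component. All data are
# synthetic and all names (health, age, area_ses, area) are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_areas, n_per_area = 50, 40
area_id = np.repeat(np.arange(n_areas), n_per_area)
area_ses = np.repeat(rng.normal(size=n_areas), n_per_area)        # area-level SES score
area_effect = np.repeat(rng.normal(scale=0.3, size=n_areas), n_per_area)
age = rng.normal(45, 12, size=n_areas * n_per_area)

# Continuous proxy for self-rated health, driven by individual and area factors
health = 3.0 - 0.01 * age + 0.2 * area_ses + area_effect \
    + rng.normal(scale=1.0, size=n_areas * n_per_area)

df = pd.DataFrame(dict(health=health, age=age, area_ses=area_ses, area=area_id))

# Random intercept per area: the estimated group variance indicates how much
# variation is associated with area context net of individual covariates.
model = smf.mixedlm("health ~ age + area_ses", df, groups=df["area"])
print(model.fit().summary())
```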
Sensitivity Analysis of List Scheduling Heuristics
When jobs have to be processed on a set of identical parallel machines so as to minimize the makespan of the schedule, list scheduling rules form a popular class of heuristics. The order in which jobs appear on the list is assumed here to be determined by the relative size of their processing times; well-known special cases are the LPT rule and the SPT rule, in which the jobs are ordered according to non-increasing and non-decreasing processing times, respectively. When one of the job processing times is gradually increased, the schedule produced by a list scheduling rule will be affected in a manner reflecting its sensitivity to data perturbations. We analyze this phenomenon and obtain analytical support for the intuitively plausible notion that the sensitivity of a list scheduling rule increases with the quality of the schedule it produces.
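A minimal sketch of the list scheduling procedure the abstract refers to: jobs are sorted by processing time (non-increasing for LPT, non-decreasing for SPT) and assigned in list order to the currently least-loaded machine. The perturbation at the bottom mimics the sensitivity experiment; the job data are illustrative only.

```python
# List scheduling on identical parallel machines (LPT / SPT variants).
import heapq

def list_schedule(processing_times, n_machines, rule="LPT"):
    """Assign each job, in list order, to the currently least-loaded machine."""
    order = sorted(processing_times, reverse=(rule == "LPT"))
    loads = [(0.0, m) for m in range(n_machines)]      # heap of (current load, machine)
    heapq.heapify(loads)
    assignment = [[] for _ in range(n_machines)]
    for p in order:
        load, m = heapq.heappop(loads)
        assignment[m].append(p)
        heapq.heappush(loads, (load + p, m))
    makespan = max(load for load, _ in loads)
    return makespan, assignment

# Example: gradually increase one processing time and observe the schedule change
jobs = [7, 5, 4, 3, 3, 2]
print(list_schedule(jobs, 2, "LPT"))        # baseline schedule
jobs[2] += 1.5                              # perturb a single job's processing time
print(list_schedule(jobs, 2, "LPT"))        # perturbed schedule
```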
Ultrasound transducer positioning aid for fetal heart rate monitoring
Fetal heart rate (fHR) monitoring is usually performed by Doppler ultrasound (US) techniques. For reliable fHR measurements, the fetal heart must be located within the US beam. In clinical practice, clinicians palpate the maternal abdomen to identify the fetal presentation, and the US transducer is then fixed to the maternal abdomen where the best fHR signal can be obtained. The optimal transducer position is found by listening to the strength of the Doppler audio output and relying on the signal quality indicator of the cardiotocographic (CTG) measurement system. Due to displacement of the US transducer or displacement of the fetal heart out of the US beam, the fHR signal may be lost. The obstetrician therefore often has to repeat the tedious procedure of US transducer positioning to avoid long periods of fHR signal loss. An intuitive US transducer positioning aid would be highly desirable to improve the workflow for the clinical staff. This paper shows that the fetal heart location with respect to the transducer can be determined by exploiting the received signal power in the transducer elements. A commercially available US transducer used for fHR monitoring is connected to a US open platform, which allows the elements to be driven individually and raw US data to be acquired. Based on the power of the received Doppler signals in the transducer elements, the fetal heart location can be estimated. A beating fetal heart setup was designed and realized for validation. The experimental results show the feasibility of estimating the fetal heart location with the proposed method. This can be used to support clinicians in finding the optimal transducer position for fHR monitoring more easily.
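The abstract does not spell out the estimator, but one simple way to turn per-element Doppler power into a location estimate is a power-weighted centroid over the element positions. The sketch below illustrates that idea on synthetic data and should be read as an assumption for illustration, not the authors' implementation.

```python
# Power-weighted centroid over transducer element positions (illustrative only).
import numpy as np

def estimate_heart_position(element_signals, element_positions):
    """element_signals: (n_elements, n_samples) Doppler signals per element.
    element_positions: (n_elements, 2) element coordinates in the transducer plane."""
    power = np.mean(np.abs(element_signals) ** 2, axis=1)   # mean power per element
    weights = power / power.sum()
    return weights @ element_positions                       # power-weighted centroid

# Toy example: 7 elements on a line, strongest signal near element index 4
rng = np.random.default_rng(1)
positions = np.stack([np.linspace(-15, 15, 7), np.zeros(7)], axis=1)   # positions in mm
gain = 1.0 + 4.0 * np.exp(-0.5 * (np.arange(7) - 4) ** 2)              # synthetic power profile
signals = rng.normal(size=(7, 1000)) * gain[:, None]
print(estimate_heart_position(signals, positions))   # estimate should lie near x ≈ +5 mm
```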
Discrete Convex Functions on Graphs and Their Algorithmic Applications
The present article is an exposition of a theory of discrete convex functions on certain graph structures, developed by the author in recent years. This theory is a spin-off of discrete convex analysis by Murota, and is motivated by combinatorial dualities in multiflow problems and the complexity classification of facility location problems on graphs. We outline the theory and algorithmic applications in combinatorial optimization problems.
Progressive tau aggregation does not alter functional brain network connectivity in seeded hTau.P301L mice
Progressive accumulation of hyperphosphorylated tau is a hallmark of various neurodegenerative disorders including Alzheimer's disease. However, to date, the functional effects of tau pathology on brain network connectivity remain poorly understood. To directly interrogate the impact of tau pathology on functional brain connectivity, we conducted a longitudinal experiment in which we monitored a fibril-seeded hTau.P301L mouse model using correlative whole-brain microscopy and resting-state functional MRI. Despite a progressive aggravation of tau pathology across the brain, the major resting-state networks appeared unaffected up to 15 weeks after seeding. Targeted analyses also showed that the connectivity of regions with high levels of hyperphosphorylated tau was comparable to that observed in controls. In line with the apparent retention of connectivity, no behavioural changes were detected between seeded and control hTau.P301L mice as determined by three different paradigms. Our data indicate that seeded tau pathology, with accumulation of tau aggregates throughout different regions of the brain, does not alter functional connectivity or behaviour in this mouse model. Additional correlative functional studies on different mouse models should help determine whether this is a generalizable trait of tauopathies.
An application of the Rasch model to reading comprehension measurement
An effective reading comprehension measurement demands robust psychometric tools that allow teachers and researchers to evaluate educational practices and track changes in students' performance. In this study, we illustrate how the Rasch model can be used to address such demands and improve reading comprehension measurement. We discuss the construction of two reading comprehension tests: TRC-n, with narrative texts, and TRC-e, with expository texts. Three vertically scaled forms were generated for each test (TRC-n-2, TRC-n-3, TRC-n-4; TRC-e-2, TRC-e-3, and TRC-e-4), each meant to assess Portuguese students in the second, third, and fourth grades of elementary school. The tests were constructed according to a nonequivalent groups with anchor test design, and the data were analyzed using the Rasch model. The results provided evidence of good psychometric qualities for each test form, including unidimensionality, local independence, and adequate reliability. A critical view of this study and directions for future research are discussed.

CIEC – Research Centre on Child Studies, IE, UMinho (FCT R&D unit 317), Portugal. This research was supported by Grant FCOMP-01-0124-FEDER-010733 from Fundação para a Ciência e Tecnologia (FCT) and the European Regional Development Fund (FEDER) through the European program COMPETE (Operational Program for Competitiveness Factors) under the National Strategic Reference Framework (QREN).
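For reference, the Rasch model underlying the analysis expresses the probability of a correct response as a logistic function of the difference between student ability and item difficulty. The sketch below evaluates that item response function; the numbers are illustrative, not parameter estimates from the TRC-n or TRC-e forms.

```python
# Rasch item response function: P(correct) = exp(theta - b) / (1 + exp(theta - b)).
import numpy as np

def rasch_probability(theta, b):
    """Probability that a student of ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

abilities = np.array([-1.0, 0.0, 1.0])   # student abilities in logits (illustrative)
difficulty = 0.5                          # item difficulty in logits (illustrative)
print(rasch_probability(abilities, difficulty))
```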
On the importance of sluggish state memory for learning long term dependency
The vanishing gradients problem inherent in Simple Recurrent Networks (SRN) trained with back-propagation has led to a significant shift towards the use of Long Short-Term Memory (LSTM) and Echo State Networks (ESN), which overcome this problem through second-order error-carousel schemes or different learning algorithms, respectively. This paper re-opens the case for SRN-based approaches by considering a variant, the Multi-recurrent Network (MRN). We show that memory units embedded within its architecture can mitigate the vanishing gradient problem by providing variable sensitivity to recent and more historic information through layer- and self-recurrent links with varied weights, forming a so-called sluggish state-based memory. We demonstrate that an MRN, optimised with noise injection, is able to learn the long-term dependency within a complex grammar induction task, significantly outperforming the SRN, NARX and ESN. Analysis of the internal representations of the networks reveals that the sluggish state-based representations of the MRN are best able to latch on to critical temporal dependencies spanning variable time delays, and to maintain distinct and stable representations of all underlying grammar states. Surprisingly, the ESN was unable to fully learn the dependency problem, suggesting that the major shift towards this class of models may be premature.
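The "sluggish state" idea can be pictured as a bank of context units, each blending its previous value with the current hidden state at a different rate, so that some units track recent input while others retain a longer history. The sketch below is an assumption about that mechanism for illustration, not the authors' exact MRN; the retention rates and dimensions are invented.

```python
# Bank of context (memory) units with varied self-recurrence strengths
# (illustrative sketch of "sluggish" state memory, not the published MRN).
import numpy as np

def update_memory_bank(memory, hidden, alphas):
    """memory, hidden: (n_banks, hidden_dim); alphas: per-bank retention rates in [0, 1)."""
    return alphas[:, None] * memory + (1.0 - alphas)[:, None] * hidden

hidden_dim, n_banks = 8, 4
alphas = np.array([0.0, 0.5, 0.75, 0.9])   # 0.0 = ordinary context layer, 0.9 = very sluggish
memory = np.zeros((n_banks, hidden_dim))

rng = np.random.default_rng(0)
for t in range(20):
    hidden = rng.normal(size=(1, hidden_dim))   # stand-in for the SRN hidden state at step t
    memory = update_memory_bank(memory, np.repeat(hidden, n_banks, axis=0), alphas)
print(memory.round(2))
```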
A genetic algorithm for the one-dimensional cutting stock problem with setups
This paper investigates the one-dimensional cutting stock problem considering two conflicting objective functions: minimization of both the number of objects and the number of different cutting patterns used. A new heuristic method based on the concepts of genetic algorithms is proposed to solve the problem. This heuristic is empirically analyzed by solving randomly generated instances and also practical instances from a chemical-fiber company. The computational results show that the method is efficient and obtains positive results when compared to other methods from the literature. © 2014 Brazilian Operations Research Society
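The paper's encoding and genetic operators are not detailed in the abstract, but the trade-off it optimises can be illustrated with a fitness function that penalises both the number of stock objects used and the number of distinct cutting patterns (setups). The sketch below is illustrative only, with hypothetical weights and a toy cutting plan.

```python
# Illustrative fitness for the two objectives: objects used and distinct patterns (setups).
def fitness(cutting_plan, w_objects=1.0, w_setups=1.0):
    """cutting_plan: list of patterns, one per stock object, each pattern a tuple of
    piece lengths cut from that object. Lower fitness is better."""
    n_objects = len(cutting_plan)
    n_patterns = len(set(cutting_plan))      # distinct patterns -> number of setups
    return w_objects * n_objects + w_setups * n_patterns

# Toy example: cut pieces of lengths 3, 4 and 5 from stock objects of length 10
plan_a = [(5, 4), (5, 4), (3, 3, 3)]         # 3 objects, 2 distinct patterns
plan_b = [(5, 3), (4, 3), (5, 4), (3,)]      # 4 objects, 4 distinct patterns
print(fitness(plan_a), fitness(plan_b))
```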