Data reduction in the ITMS system through a data acquisition model with self-adaptive sampling rate
Long pulse or steady state operation of fusion experiments requires data acquisition and processing systems that reduce the volume of data involved. Self-adaptive sampling rate systems and real-time lossless data compression techniques can help solve this problem. The former allows the sampling frequency to be continuously adapted to experimental requirements. The latter allows continuous digitization to be maintained under limited memory conditions; this is achieved by permanently transmitting compressed data to other systems, and the compacted transfer ensures minimum bandwidth usage. This paper presents an implementation based on the Intelligent Test and Measurement System (ITMS), a data acquisition system architecture with multiprocessing capabilities that allows the system's sampling frequency to be adapted throughout the experiment. The sampling rate can be controlled according to the experiment's specific requirements, either by an external DC voltage signal or by defining user events through software. The system takes advantage of the high processing capabilities of the ITMS platform to implement a data reduction mechanism based on lossless data compression algorithms, which are themselves based on periodic deltas.
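The abstract describes compression "based on periodic deltas". As a minimal illustrative sketch (not the authors' ITMS implementation), delta encoding stores the differences between consecutive samples; for slowly varying signals these differences are small integers that compress well with a generic lossless coder:

```python
def delta_encode(samples):
    """Return the first sample followed by successive differences."""
    if not samples:
        return []
    encoded = [samples[0]]
    encoded.extend(b - a for a, b in zip(samples, samples[1:]))
    return encoded

def delta_decode(encoded):
    """Reconstruct the original samples from a delta-encoded list (lossless)."""
    if not encoded:
        return []
    samples = [encoded[0]]
    for d in encoded[1:]:
        samples.append(samples[-1] + d)
    return samples
```

For a smooth signal such as `[1000, 1002, 1003, 1003, 1001]`, the encoded form `[1000, 2, 1, 0, -2]` consists mostly of near-zero values, which is what makes a subsequent entropy-coding stage effective.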
Web based system architecture for long pulse remote experimentation
Remote experimentation (RE) methods will be essential in next-generation fusion devices. Requirements for long pulse RE will include on-line data visualization, on-line monitoring of data acquisition processes, and on-line interaction with data acquisition systems (start, stop, or set-up modifications). Note that these methods are not oriented to real-time control of fusion plant devices.
INDRA Sistemas S.A., CIEMAT (Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas) and UPM (Universidad Politécnica de Madrid) have designed a specific software architecture for these purposes. The architecture can be supported on the BeansNet platform, whose integration with an application server provides an adequate solution to these requirements. BeansNet is a Jini-based framework developed by INDRA that simplifies the implementation of a remote experimentation model based on a Service Oriented Architecture. The new software architecture has been designed on the basis of the experience acquired in developing an upgrade of the TJ-II remote experimentation system.
A versatile trigger and synchronization module with IEEE1588 capabilities and EPICS support.
Event timing and synchronization are two key aspects to improve in the implementation of distributed data acquisition (dDAQ) systems such as those used in fusion experiments. The integration of dDAQ systems into control and measurement networks is also of great importance. This paper analyzes the applicability of the IEEE1588 and EPICS standards to these problems, and presents a hardware module implementation, based on both, that adds these functionalities to any DAQ element. The IEEE1588 standard facilitates the integration of event timing and synchronization mechanisms into distributed data acquisition systems based on IEEE 802.3 (Ethernet). An optimal implementation of such a system requires network interface devices that include specific hardware resources devoted to the IEEE1588 functionalities. Unfortunately, this is not the approach followed in most applications available today: most solutions are software-based and use standard hardware network interfaces. This paper presents the development of a hardware module (GI2E) with IEEE1588 capabilities that includes USB, RS232, RS485 and CAN interfaces, permitting any DAQ element that uses these interfaces to be integrated into dDAQ systems in an efficient and simple way. The module has been developed with Motorola's ColdFire MCF5234 processor and National Semiconductor's PHY DP83640T, which allow the PTP protocol of IEEE1588 to be implemented in hardware, increasing its performance over software-based implementations. To facilitate the integration of the dDAQ system into control and measurement networks, the module includes basic Input/Output Controller (IOC) functionality from the Experimental Physics and Industrial Control System (EPICS) architecture. The paper discusses the implementation details of this module and presents its use in advanced dDAQ applications in the fusion community.
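The core of the PTP synchronization that IEEE1588 standardizes is a two-way timestamp exchange: the master sends a Sync message at t1, the slave receives it at t2, the slave sends a Delay_Req at t3, and the master receives it at t4. A minimal sketch of the resulting offset/delay arithmetic (the standard's message handling, hardware timestamping and servo loop are omitted):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """IEEE 1588 two-step exchange:
    t1: master sends Sync        t2: slave receives Sync
    t3: slave sends Delay_Req    t4: master receives Delay_Req
    Assumes a symmetric network path.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way path delay
    return offset, delay
```

For example, with a slave clock running 5 units ahead of the master and a path delay of 3 units, timestamps (t1=100, t2=108, t3=110, t4=108) recover offset 5 and delay 3. Hardware timestamping (as with the DP83640T PHY mentioned above) improves accuracy by capturing t1..t4 at the wire rather than in the protocol stack.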
Event Recognition Using Signal Spectrograms in Long Pulse Experiments
As discharge duration increases, real-time complex analysis of signals becomes more important. In this context, data acquisition and processing systems must provide models for designing experiments that use event-oriented plasma control. One example of advanced data analysis is signal classification. The off-line statistical analysis of a large number of discharges provides information for developing algorithms that determine plasma parameters from measurements of magnetohydrodynamic waves, for example, to detect density fluctuations induced by Alfvén cascades using morphological patterns. The need to apply different algorithms to the signals, and to select subsequent processing algorithms based on previous results, necessitates an event-based experiment. The Intelligent Test and Measurement System platform is an example of an architecture designed to implement distributed data acquisition and real-time processing systems. The processing algorithm sequence is modeled using an event-based paradigm, and the adaptive capacity of this model is based on logic defined by state machines in SCXML. The Intelligent Test and Measurement System platform combines a local multiprocessing model with a distributed deployment of services based on Jini.
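SCXML expresses processing logic as states and event-driven transitions. A minimal sketch of that idea as a transition table (the state and event names here are hypothetical, chosen only to echo the Alfvén-cascade example; the actual ITMS logic is not reproduced in the abstract):

```python
# Hypothetical event-driven state machine in the style SCXML describes:
# each (state, event) pair maps to the next processing state.
TRANSITIONS = {
    ("monitoring", "cascade_pattern_detected"): "detailed_analysis",
    ("detailed_analysis", "pattern_lost"): "monitoring",
    ("monitoring", "discharge_end"): "idle",
    ("detailed_analysis", "discharge_end"): "idle",
}

def step(state, event):
    """Return the next state; unhandled events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Driving the machine with a stream of recognized signal events then selects which processing algorithm runs next, which is the adaptive sequencing the abstract describes.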
The unity and diversity of executive functions: A systematic review and re-analysis of latent variable studies.
Confirmatory factor analysis (CFA) has been frequently applied to executive function measurement since first used to identify a three-factor model of inhibition, updating, and shifting; however, subsequent CFAs have supported inconsistent models across the life span, ranging from unidimensional to nested-factor models (i.e., bifactor without inhibition). This systematic review summarized CFAs on performance-based tests of executive functions and reanalyzed summary data to identify best-fitting models. Eligible CFAs involved 46 samples (N = 9,756). The most frequently accepted models varied by age (i.e., preschool = one/two-factor; school-age = three-factor; adolescent/adult = three/nested-factor; older adult = two/three-factor), and most often included updating/working memory, inhibition, and shifting factors. A bootstrap reanalysis simulated 5,000 samples from 21 correlation matrices (11 child/adolescent; 10 adult) from studies including the three most common factors, fitting seven competing models. Model results were summarized as the mean percent accepted (i.e., average rate at which models converged and met fit thresholds: CFI ≥ .90/RMSEA ≤ .08) and mean percent selected (i.e., average rate at which a model showed superior fit to other models: ΔCFI ≥ .005/.010/ΔRMSEA ≤ -.010/-.015). No model consistently converged and met fit criteria in all samples. Among adult samples, the nested-factor was accepted (41-42%) and selected (8-30%) most often. Among child/adolescent samples, the unidimensional model was accepted (32-36%) and selected (21-53%) most often, with some support for two-factor models without a differentiated shifting factor. Results show some evidence for greater unidimensionality of executive function among child/adolescent samples and both unity and diversity among adult samples. However, low rates of model acceptance/selection suggest possible bias toward the publication of well-fitting but potentially nonreplicable models with underpowered samples. 
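The acceptance rule used in the reanalysis is stated explicitly (CFI ≥ .90 and RMSEA ≤ .08). As a small sketch of how the "mean percent accepted" statistic works over bootstrap replications (the CFA fitting itself, which requires an SEM package, is omitted):

```python
def accepted(cfi, rmsea):
    """Fit-threshold rule from the review: CFI >= .90 and RMSEA <= .08."""
    return cfi >= 0.90 and rmsea <= 0.08

def percent_accepted(fits):
    """fits: list of (CFI, RMSEA) pairs, one per converged bootstrap sample.
    Returns the percentage of samples in which the model met both thresholds."""
    if not fits:
        return 0.0
    return 100.0 * sum(accepted(cfi, rmsea) for cfi, rmsea in fits) / len(fits)
```

In the study, this percentage is averaged across the 5,000 simulated samples per correlation matrix for each of the seven competing models; the "percent selected" statistic additionally compares models via ΔCFI/ΔRMSEA.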
Problem solving strategy in the teaching and learning processes of quantitative reasoning
The study presents an analysis of Polya's problem-solving strategy as used in training the quantitative reasoning competence of students at the Universidad Simón Bolívar, San José de Cúcuta, Colombia. The research was based on a descriptive design and used an intentional sample of 58 students enrolled in the sciences and general competencies elective. For data collection, a diagnostic test (pre-test) and a final test (post-test) were applied in order to check the impact of the applied strategy. The results showed a significant improvement in the students' final results in each of the processes trained: interpretation, representation and modeling, and argumentation.