M-1 engine test complex data acquisition systems
Instrumentation and data acquisition system for development testing of liquid hydrogen/liquid oxygen M-1 rocket engine
South Pole Telescope Software Systems: Control, Monitoring, and Data Acquisition
We present the software system used to control and operate the South Pole
Telescope. The South Pole Telescope is a 10-meter millimeter-wavelength
telescope designed to measure anisotropies in the cosmic microwave background
(CMB) at arcminute angular resolution. In the austral summer of 2011/12, the
SPT was equipped with a new polarization-sensitive camera, which consists of
1536 transition-edge sensor bolometers. The bolometers are read out using 36
independent digital frequency multiplexing (DfMux) readout boards, each with
its own embedded processors. These autonomous boards control and read out data
from the focal plane with on-board software and firmware. An overall control
software system running on a separate control computer controls the DfMux
boards, the cryostat, and all other aspects of telescope operation. This
control software collects and monitors data in real time, and stores the data
to disk for transfer to the United States for analysis.
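The layered pattern described above, a central control process polling autonomous readout boards and buffering their frames for disk, can be sketched in miniature. This is a rough illustration only; the board class, method names, and frame fields are assumptions, not the SPT software's actual API:

```python
class DfmuxBoard:
    """Stand-in for one autonomous readout board (hypothetical interface)."""

    def __init__(self, board_id, n_channels=4):
        self.board_id = board_id
        self.n_channels = n_channels

    def read_frame(self):
        # A real board would return demodulated bolometer samples;
        # zeros stand in for illustration.
        return {"board": self.board_id, "samples": [0.0] * self.n_channels}


class ControlSystem:
    """Central control process: polls every board and buffers frames
    so a separate writer can flush them to disk."""

    def __init__(self, boards):
        self.boards = boards
        self.disk_buffer = []

    def poll_once(self):
        for board in self.boards:
            self.disk_buffer.append(board.read_frame())


daq = ControlSystem([DfmuxBoard(i) for i in range(36)])
daq.poll_once()
```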
Flexible data input layer architecture (FDILA) for quick-response decision making tools in volatile manufacturing systems
This paper proposes the foundation for a flexible data input management system as a vital part of a generic solution for quick-response decision making. It addresses the lack of a comprehensive data input layer between data acquisition and processing systems. The proposed FDILA is applicable to a wide variety of volatile manufacturing environments. It provides a generic platform that enables system designers to define any number of data entry points and types, regardless of their make and specifications, in a standard fashion. This is achieved by providing a variable definition layer immediately on top of the data acquisition layer and before the data pre-processing layer. For proof of concept, National Instruments' LabVIEW data acquisition software is used to simulate a typical shop-floor data acquisition system. The extracted data can then be fed into a data mining module that builds cost modeling functions involving the plant's key performance factors.
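The core idea of a variable definition layer, registering arbitrary data entry points behind a uniform interface, can be sketched as follows. This is a minimal illustration; the class and point names are invented for the example, not taken from the FDILA paper:

```python
class DataInputLayer:
    """Variable definition layer: each data entry point is registered by
    name with a unit and a reader callable, so the layers above see a
    uniform interface regardless of the underlying device."""

    def __init__(self):
        self._points = {}

    def define_point(self, name, unit, reader):
        self._points[name] = (unit, reader)

    def snapshot(self):
        """Read every registered point once, in a standard record form."""
        return {name: {"unit": unit, "value": reader()}
                for name, (unit, reader) in self._points.items()}


layer = DataInputLayer()
layer.define_point("spindle_temp", "degC", lambda: 41.7)  # e.g. a PLC tag
layer.define_point("parts_done", "count", lambda: 128)    # e.g. a counter
reading = layer.snapshot()
```

A pre-processing or data-mining stage would then consume `snapshot()` records without knowing which device produced each value.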
The ngdp framework for data acquisition systems
The ngdp framework is intended to provide a base for data acquisition
(DAQ) system software. The key features of the ngdp design are: high
modularity and scalability; use of the operating system's kernel context
(particularly kernel threads), which avoids preemptive scheduling and
unnecessary memory-to-memory copying between contexts; and elimination of
intermediate data storage on media slower than main memory, such as hard
disks. With these properties, ngdp is well suited to organizing and managing
data transport and processing for essentially distributed DAQ systems. The
investigation has been performed at the Veksler and Baldin Laboratory of High
Energy Physics, JINR.
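The modular, copy-avoiding pipeline idea, data flowing node to node in memory with no intermediate disk storage, can be sketched in miniature (in Python generators rather than kernel code; node and field names are illustrative, not ngdp's):

```python
def source(n_events):
    """Front-end node: produces raw packets."""
    for i in range(n_events):
        yield {"event": i, "adc": i * 10}


def threshold_filter(stream, threshold):
    """Processing node: packets stream through in memory, never via disk."""
    for packet in stream:
        if packet["adc"] >= threshold:
            yield packet


def sink(stream):
    """Terminal node: here it just collects accepted event numbers."""
    return [packet["event"] for packet in stream]


accepted = sink(threshold_filter(source(5), threshold=20))
# accepted == [2, 3, 4]
```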
SWE bridge: software interface for plug & work instrument integration into marine observation platforms
The integration of sensor systems into marine observation platforms such as
gliders, cabled observatories and smart buoys requires a great deal of effort
due to the diversity of architectures present in marine acquisition systems.
In the past years important steps have been taken to improve both
standardization and interoperability, e.g. the Open Geospatial Consortium's
Sensor Web Enablement. This set of standards and protocols provides a
well-defined framework to achieve standardized data chains. However, a
significant gap remains at the lower end of the data chain, between the
sensor systems and the acquisition platforms. In this work a standards-based
architecture to bridge this gap is proposed in order to achieve plug & work,
standardized and interoperable acquisition systems.
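The bridging role described above, translating each instrument's native output into standardized observation records, can be sketched roughly. The wire format (`temp=...;sal=...`), field catalog, and record shape below are assumptions for illustration, loosely inspired by SWE-style observations rather than the actual SWE Bridge interface:

```python
from dataclasses import dataclass


@dataclass
class Observation:
    """Standardized record, loosely modeled on SWE observation fields."""
    sensor_id: str
    observed_property: str
    value: float
    unit: str


# Per-instrument catalog mapping native field names to standard terms
# (names and the wire format are assumptions, not a real CTD protocol).
CTD_CATALOG = {"temp": ("sea_water_temperature", "degC"),
               "sal": ("sea_water_salinity", "PSU")}


def bridge_line(sensor_id, raw_line, catalog):
    """Translate one native instrument line into standardized observations."""
    observations = []
    for field in raw_line.split(";"):
        key, text = field.split("=")
        prop, unit = catalog[key]
        observations.append(Observation(sensor_id, prop, float(text), unit))
    return observations


obs = bridge_line("ctd-01", "temp=12.3;sal=35.1", CTD_CATALOG)
```

Adding a new instrument then means supplying a new catalog, not rewriting the acquisition platform.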
Compressed Sensing for Tactile Skins
Whole body tactile perception via tactile skins offers large benefits for
robots in unstructured environments. To fully realize this benefit, tactile
systems must support real-time data acquisition over a massive number of
tactile sensor elements. We present a novel approach for scalable tactile data
acquisition using compressed sensing. We first demonstrate that the tactile
data is amenable to compressed sensing techniques. We then develop a solution
for fast data sampling, compression, and reconstruction that is suited for
tactile system hardware and has potential for reducing the wiring complexity.
Finally, we evaluate the performance of our technique on simulated tactile
sensor networks. Our evaluations show that compressed sensing, with a
compression ratio of 3 to 1, can achieve higher signal acquisition accuracy
than full data acquisition of noisy sensor data.
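The core idea, that a sparse tactile frame can be recovered from far fewer measurements than taxels, can be sketched in a toy numpy example. This is not the paper's algorithm: the sensing matrix is plain Gaussian and the reconstructor is orthogonal matching pursuit, chosen only because it is short; sizes are picked to match a 3:1 compression ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 60, 20, 3   # taxels, measurements (3:1 compression), contacts

# Sparse tactile frame: only k taxels are pressed.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.uniform(1.0, 2.0, size=k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
y = Phi @ x                                     # m readout lines instead of n

# Orthogonal matching pursuit: greedily pick the taxel most correlated
# with the residual, then refit all picked taxels by least squares.
support, residual = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(Phi.T @ residual))))
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    residual = y - Phi[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
```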
Thermal (Silicon Diode) Data Acquisition Systems
Marshall Space Flight Center's X-ray Cryogenic Facility (XRCF) has been performing cryogenic testing to 20 Kelvin since 1999. Two configurations for acquiring data from silicon diode temperature sensors have been implemented at the facility. The facility's environment is recorded via a data acquisition system capable of reading up to 60 silicon diodes. Test article temperature is recorded by a second data acquisition system capable of reading 150+ silicon diodes. The specifications and architecture of both systems will be presented.
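Reading a silicon diode sensor ultimately means mapping a measured forward voltage back to temperature through a calibration curve (the voltage rises as temperature falls). A minimal sketch using linear interpolation; the calibration points below are illustrative placeholders, not a real diode's table:

```python
import numpy as np

# Illustrative calibration points only (a real diode ships with a much
# denser curve). Voltage increases monotonically as temperature drops.
temps_K = np.array([20.0, 50.0, 100.0, 200.0, 300.0])
volts   = np.array([1.20, 1.07, 0.98, 0.76, 0.52])

def diode_temperature(v_measured):
    """Map a measured diode voltage to temperature by linear interpolation.
    np.interp requires ascending x, so both arrays are reversed."""
    return float(np.interp(v_measured, volts[::-1], temps_K[::-1]))
```

A DAQ system like those described would apply this conversion per channel, with each diode's own calibration table.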
The revolution in data gathering systems
Data acquisition systems used in NASA's wind tunnels from the 1950s through the present are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology, from the central computer system to, finally, the distributed computer system. Other developments discussed include medium-scale integration, large-scale integration, combining the functions of data acquisition and control, and micro- and minicomputers.
Exploiting graphic processing units parallelism to improve intelligent data acquisition system performance in JET's correlation reflectometer
The performance of intelligent data acquisition systems relies heavily on their processing capabilities and local bus bandwidth, especially in applications with high sample rates or a large number of channels. This is the case for the self-adaptive sampling rate data acquisition system installed as a pilot experiment in the KG8B correlation reflectometer at JET. The system, which is based on the ITMS platform, continuously adapts the sample rate during the acquisition depending on the signal bandwidth. In order to do so it must transfer acquired data to a memory buffer in the host processor and run computationally heavy algorithms for each data block. The processing capabilities of the host CPU and the bandwidth of the PXI bus limit the maximum sample rate that can be achieved, and therefore the maximum bandwidth of the phenomena that can be studied. Graphics processing units (GPUs) are becoming an alternative for speeding up compute-intensive kernels of scientific, imaging and simulation applications. However, integrating this technology into data acquisition systems is not a straightforward step, not to mention exploiting their parallelism efficiently. This paper discusses the use of GPUs with new high-speed data bus interfaces to improve the performance of the self-adaptive sampling rate data acquisition system installed at JET. Integration issues are discussed and performance evaluations are presented.
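The per-block decision at the heart of such a system, estimate the signal bandwidth, then pick a matching sample rate, can be sketched as follows. This is a simple FFT-based stand-in for illustration, not the ITMS algorithm; the 99% energy criterion and the rate table are assumptions:

```python
import numpy as np

def choose_sample_rate(block, fs, available_rates, energy_frac=0.99):
    """Estimate the frequency below which `energy_frac` of the block's
    spectral energy lies, then pick the lowest rate whose Nyquist
    frequency still covers that bandwidth."""
    power = np.abs(np.fft.rfft(block)) ** 2
    freqs = np.fft.rfftfreq(len(block), d=1.0 / fs)
    cumulative = np.cumsum(power) / np.sum(power)
    bandwidth = freqs[np.searchsorted(cumulative, energy_frac)]
    for rate in sorted(available_rates):
        if rate / 2.0 >= bandwidth:
            return rate
    return max(available_rates)

fs = 1000.0
t = np.arange(2000) / fs              # 2 s block, integer cycle counts
slow = np.sin(2 * np.pi * 30.0 * t)   # 30 Hz signal: a low rate suffices
fast = np.sin(2 * np.pi * 180.0 * t)  # 180 Hz signal: needs a higher rate
```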
