
    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability for system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    Stochastic Database Cracking: Towards Robust Adaptive Indexing in Main-Memory Column-Stores

    Modern business applications and scientific databases call for inherently dynamic data storage environments. Such environments are characterized by two challenging features: (a) they have little idle system time to devote to physical design; and (b) there is little, if any, a priori workload knowledge, while the query and data workload keeps changing dynamically. In such environments, traditional approaches to index building and maintenance cannot apply. Database cracking has been proposed as a solution that allows on-the-fly physical data reorganization as a collateral effect of query processing. Cracking aims to continuously and automatically adapt indexes to the workload at hand, without human intervention. Indexes are built incrementally, adaptively, and on demand. Nevertheless, as we show, existing adaptive indexing methods fail to deliver workload robustness; they perform much better with random workloads than with others. This frailty derives from the inelasticity with which these approaches interpret each query as a hint on how data should be stored. Current cracking schemes blindly reorganize the data within each query's range, even if that results in successive expensive operations with minimal indexing benefit. In this paper, we introduce stochastic cracking, a significantly more resilient approach to adaptive indexing. Stochastic cracking also uses each query as a hint on how to reorganize data, but not blindly so; it gains resilience and avoids performance bottlenecks by deliberately applying certain arbitrary choices in its decision-making. Thereby, we bring adaptive indexing forward to a mature formulation that confers the workload robustness previous approaches lacked. Our extensive experimental study verifies that stochastic cracking maintains the desired properties of original database cracking while at the same time performing well with diverse realistic workloads. (Comment: VLDB201)
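    To make the cracking idea concrete, the sketch below is a minimal, self-contained Python model of query-driven cracking, not the paper's implementation: each range query partitions the stored column around its bounds, so the array becomes progressively more ordered as queries arrive. Roughly speaking, the stochastic variant described in the abstract additionally cracks at randomly chosen pivots so that a skewed query sequence still spreads the reorganization work. The class name `CrackedColumn` and its methods are illustrative only.

```python
import bisect
import random


class CrackedColumn:
    """Toy model of database cracking: each range query partitions
    ("cracks") the column around its bounds, so later queries scan
    progressively smaller pieces. Illustrative sketch only."""

    def __init__(self, values):
        self.data = list(values)
        self.cracks = []     # sorted crack values
        self.positions = []  # positions[i] = first index holding values >= cracks[i]

    def _piece(self, value):
        """Index bounds (lo, hi) of the piece that may contain `value`."""
        i = bisect.bisect_right(self.cracks, value)
        lo = self.positions[i - 1] if i > 0 else 0
        hi = self.positions[i] if i < len(self.positions) else len(self.data)
        return lo, hi

    def _crack(self, value):
        """Partition the piece containing `value` around `value`."""
        if value in self.cracks:
            return
        lo, hi = self._piece(value)
        piece = self.data[lo:hi]
        left = [x for x in piece if x < value]
        right = [x for x in piece if x >= value]
        self.data[lo:hi] = left + right
        i = bisect.bisect_left(self.cracks, value)
        self.cracks.insert(i, value)
        self.positions.insert(i, lo + len(left))

    def range_query(self, low, high):
        """Answer SELECT * WHERE low <= x < high, cracking as a side effect."""
        self._crack(low)
        self._crack(high)
        start = self.positions[bisect.bisect_left(self.cracks, low)]
        end = self.positions[bisect.bisect_left(self.cracks, high)]
        return self.data[start:end]


if __name__ == "__main__":
    col = CrackedColumn(random.sample(range(1000), 200))
    print(len(col.range_query(100, 300)))  # first query pays the partitioning cost
    print(len(col.range_query(150, 250)))  # later queries touch smaller pieces
```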

    Full-Stokes polarimetry with circularly polarized feeds - Sources with stable linear and circular polarization in the GHz regime

    We present a pipeline that allows recovering reliable information for all four Stokes parameters with high accuracy. Its novelty lies in treating the instrumental effects prior to the computation of the Stokes parameters, contrary to conventional methods such as the Müller matrix approach. The instrumental linear polarization is corrected across the whole telescope beam, and significant Stokes Q and U can be recovered even when the recorded signals are severely corrupted. The accuracy we reach in terms of polarization degree is of the order of 0.1-0.2%. The polarization angles are determined with an accuracy of almost 1°. The presented methodology was applied to recover the linear and circular polarization of around 150 Active Galactic Nuclei. The sources were monitored from July 2010 to April 2016 with the Effelsberg 100-m telescope at 4.85 GHz and 8.35 GHz with a cadence of around 1.2 months. The polarized emission of the Moon was used to calibrate the polarization angle. Our analysis showed a small system-induced rotation of about 1° at both observing frequencies. Finally, we identify five sources with significant and stable linear polarization; three sources remain constantly linearly unpolarized over the period we examined; a total of 11 sources have a stable circular polarization degree m_c, four of them with non-zero m_c. We also identify eight sources that maintain a stable polarization angle over the examined period. All this is provided to the community as a reference for polarization observations. We finally show that our analysis method is conceptually different from the traditionally used ones and performs better than the Müller matrix method. Although it was developed for a system equipped with circularly polarized feeds, it can easily be modified for systems with linearly polarized feeds as well. (Comment: 19 pages, 17 figures, accepted for publication in Astronomy & Astrophysics on May 30, 201)
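    As background on how the four Stokes parameters relate to circularly polarized feeds, the sketch below (not the paper's pipeline) estimates I, Q, U, V from simulated right- and left-hand circular voltage streams. The function name, the simulated data, and the particular sign conventions (e.g. V = RR − LL, Q + iU = 2⟨R L*⟩) are assumptions for illustration; conventions differ between observatories, and all instrumental effects (gains, leakage, parallactic rotation) are ignored.

```python
import numpy as np


def stokes_from_circular_feeds(r, l):
    """Estimate Stokes I, Q, U, V from complex voltage time series of the
    right- (r) and left-hand (l) circularly polarized feeds.
    One common sign convention is assumed; instrumental effects are ignored."""
    rr = np.mean(np.abs(r) ** 2)
    ll = np.mean(np.abs(l) ** 2)
    rl = np.mean(r * np.conj(l))   # cross-correlation of the two feeds
    I = rr + ll
    V = rr - ll
    Q = 2.0 * rl.real
    U = 2.0 * rl.imag
    return I, Q, U, V


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100_000
    # Simulate a partially linearly polarized signal plus receiver noise.
    e = rng.normal(size=n) + 1j * rng.normal(size=n)          # correlated field
    r = e + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))
    l = e * np.exp(-1j * 0.6) + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))
    I, Q, U, V = stokes_from_circular_feeds(r, l)
    m_l = np.hypot(Q, U) / I                      # linear polarization degree
    evpa = 0.5 * np.degrees(np.arctan2(U, Q))     # polarization angle in degrees
    print(f"m_l = {m_l:.3f}, EVPA = {evpa:.1f} deg, m_c = {V / I:.3f}")
```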

    Survey on encode biometric data for transmission in wireless communication networks

    The aim of this research survey is to review an enhanced model, supported by artificial intelligence, for encoding biometric data for transmission in wireless communication networks. Such transmission can be tricky, as performance decreases with increasing network size due to interference, especially if channels and network topology are not selected carefully beforehand. Additionally, network dissociations may occur easily if crucial links fail, since redundancy is neglected for signal transmission. Therefore, we present several algorithms and their implementations, which address this problem by finding a network topology and channel assignment that minimize interference, allowing a deployment to increase its throughput by utilizing more bandwidth in the local spectrum and by reducing coverage and connectivity issues through multiple AI-based techniques; a simple illustration of interference-aware channel assignment is sketched below. Our evaluation survey shows an increase in throughput of up to several times compared to a baseline scenario in which no optimization has taken place and only one channel is used for the whole network. Furthermore, our solution also provides robust signal transmission, tackling network partition both for coverage and for single link failures by using an airborne wireless network. The highest end-to-end connectivity stands at a 10 Mbps data rate with a maximum propagation distance of several kilometers. Wireless network coverage is depicted for several transmission data rates; the 10 Mbps rate shows the fewest coverage issues at a moderate propagation distance when the enhanced model is used to encode biometric data for transmission in wireless communication networks.
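    The survey does not fix a specific algorithm, so the sketch below is only an assumed illustration of the interference-aware channel assignment it alludes to, not the surveyed AI-based model: interfering links are modelled as an undirected conflict graph, and each link greedily picks the channel least used among its already-assigned neighbours. The link names, channel list, and greedy heuristic are hypothetical.

```python
from collections import defaultdict


def assign_channels(conflicts, channels):
    """Greedy channel assignment over a conflict graph.

    conflicts: dict mapping each link id to the set of links it interferes with.
    channels:  list of available channel ids.
    Returns a dict link -> channel, chosen to minimize reuse among conflicting
    links. Illustrative only; not the surveyed model.
    """
    assignment = {}
    # Visit the most constrained links first (classic greedy colouring heuristic).
    for link in sorted(conflicts, key=lambda l: len(conflicts[l]), reverse=True):
        used = defaultdict(int)
        for neighbour in conflicts[link]:
            if neighbour in assignment:
                used[assignment[neighbour]] += 1
        # Pick the channel least used by already-assigned conflicting links.
        assignment[link] = min(channels, key=lambda c: used[c])
    return assignment


if __name__ == "__main__":
    # Hypothetical five-link topology where adjacent links interfere.
    conflicts = {
        "A-B": {"B-C", "A-D"},
        "B-C": {"A-B", "C-D"},
        "C-D": {"B-C", "D-E"},
        "A-D": {"A-B", "D-E"},
        "D-E": {"C-D", "A-D"},
    }
    print(assign_channels(conflicts, channels=[1, 6, 11]))
```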