
    Bit error rate measurement above and below bit rate tracking threshold

    Bit error rate is measured by sending a pseudo-random noise (PRN) code test signal, simulating digital data, through the digital equipment to be tested. An incoming signal representing the response of the equipment under test, together with any added noise, is received and tracked by comparing it with a locally generated PRN code. Once the locally generated PRN code matches the incoming signal, a tracking lock is obtained. The incoming signal is then integrated and compared bit-by-bit against the locally generated PRN code, and differences between the compared bits are counted as bit errors.
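
    As a rough illustration of the measurement loop described above, the following TypeScript sketch generates a PRN code, searches for the tracking lock, and then counts bit-by-bit disagreements as errors. The 7-stage LFSR, the exhaustive lock search, and all identifiers are assumptions made for the sketch, not details of the apparatus described in the abstract.

```typescript
// Minimal sketch of PRN-based bit error rate measurement, assuming a
// 7-stage LFSR as the PRN generator and hard-decision (0/1) samples.

function prnPeriod(seed = 0b1010101): number[] {
  // Generate one 127-bit period of a maximal-length 7-stage LFSR code.
  const bits: number[] = [];
  let state = seed;
  for (let i = 0; i < 127; i++) {
    bits.push(state & 1);
    const feedback = ((state >> 6) ^ (state >> 5)) & 1;
    state = ((state << 1) | feedback) & 0x7f;
  }
  return bits;
}

function findLockOffset(received: number[], local: number[]): number {
  // Slide the locally generated code across the incoming signal and pick
  // the alignment with the most bit-for-bit agreement: the tracking lock.
  let bestOffset = 0;
  let bestMatches = -1;
  for (let offset = 0; offset < local.length; offset++) {
    let matches = 0;
    for (let i = 0; i < received.length; i++) {
      if (received[i] === local[(i + offset) % local.length]) matches++;
    }
    if (matches > bestMatches) {
      bestMatches = matches;
      bestOffset = offset;
    }
  }
  return bestOffset;
}

function bitErrorRate(received: number[], local: number[]): number {
  // Once locked, compare the incoming bits against the local code
  // bit-by-bit and count every disagreement as a bit error.
  const offset = findLockOffset(received, local);
  let errors = 0;
  for (let i = 0; i < received.length; i++) {
    if (received[i] !== local[(i + offset) % local.length]) errors++;
  }
  return errors / received.length;
}

// Example: a shifted copy of the PRN code with three bits flipped by "noise".
const local = prnPeriod();
const received = local.slice(13).concat(local.slice(0, 13));
[5, 42, 99].forEach((i) => (received[i] ^= 1));
console.log("measured BER:", bitErrorRate(received, local)); // ~3/127
```

    The sketch only covers the case above the tracking threshold, where the correlation search finds the correct alignment; with enough noise the lock itself fails, which is the regime the title distinguishes.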

    High resolution radiometric measurements of convective storms during the GATE experiment

    Using passive microwave data from the NASA CV-990 aircraft and radar data collected during the Global Atmospheric Research Program Atlantic Tropical Experiment (GATE), an empirical model was developed relating brightness temperatures sensed at 19.35 GHz to surface rainfall rates. This model agreed well with theoretical computations of the relationship between microwave radiation and precipitation in the tropics. The GATE aircraft microwave data were then used to determine the detailed structure of convective systems. The high spatial resolution of the data permitted identification of individual cells that retained unique identities throughout their lifetimes within larger cloud masses, and allowed analysis of the effects of cloud merger.

    Quantum computing with nearest neighbor interactions and error rates over 1%

    Large-scale quantum computation will only be achieved if experimentally implementable quantum error correction procedures are devised that can tolerate experimentally achievable error rates. We describe a quantum error correction procedure that requires only a 2-D square lattice of qubits that can interact with their nearest neighbors, yet can tolerate quantum gate error rates over 1%. The precise maximum tolerable error rate depends on the error model, and we calculate values in the range 1.1-1.4% for various physically reasonable models. Even the lowest value represents the highest threshold error rate calculated to date in a geometrically constrained setting, and a 50% improvement over the previous record.
    Comment: 4 pages, 8 figures

    A joint model for vehicle type and fuel type choice: evidence from a cross-nested logit study

    In the face of growing concerns about greenhouse gas emissions, there is increasing interest in forecasting the likely demand for alternative fuel vehicles. This paper presents an analysis carried out on stated preference survey data on California consumer responses to a joint vehicle type choice and fuel type choice experiment. Our study recognises the fact that this choice process potentially involves high correlations that an analyst may not be able to adequately represent in the modelled utility components. We further hypothesise that a cross-nested logit structure can capture more of the correlation patterns than the standard nested logit model structure in such a multi-dimensional choice process. Our empirical analysis and a brief forecasting exercise produce evidence to support these assertions. The implications of these findings extend beyond the context of the demand for alternative fuel vehicles to the analysis of multi-dimensional choice processes in general. Finally, an extension verifies that further gains can be made by using mixed GEV structures, allowing for random heterogeneity in addition to the flexible correlation structures.
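
    For readers unfamiliar with the model family, the cross-nested logit choice probability, in one common parameterisation (not reproduced from the paper; the allocation parameters and nest parameters shown here are generic notation, with systematic utilities V_j, nests m, allocation weights alpha_jm, and nest scale parameters lambda_m), takes the form:

```latex
P(i) = \sum_{m} P(m)\, P(i \mid m),
\qquad
P(i \mid m) = \frac{\left(\alpha_{im}\, e^{V_i}\right)^{1/\lambda_m}}
                   {\sum_{j \in C_m} \left(\alpha_{jm}\, e^{V_j}\right)^{1/\lambda_m}},
\qquad
P(m) = \frac{\Bigl[\sum_{j \in C_m} \left(\alpha_{jm}\, e^{V_j}\right)^{1/\lambda_m}\Bigr]^{\lambda_m}}
            {\sum_{m'} \Bigl[\sum_{j \in C_{m'}} \left(\alpha_{jm'}\, e^{V_j}\right)^{1/\lambda_{m'}}\Bigr]^{\lambda_{m'}}}
```

    Because each alternative (here, a vehicle type and fuel type combination) can belong to several nests with allocation weights alpha_{im}, the structure can express correlation along both choice dimensions at once, whereas in a standard nested logit each alternative sits in exactly one nest.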

    Improving Table Compression with Combinatorial Optimization

    We study the problem of compressing massive tables within the partition-training paradigm introduced by Buchsbaum et al. [SODA'00], in which a table is partitioned by an off-line training procedure into disjoint intervals of columns, each of which is compressed separately by a standard, on-line compressor like gzip. We provide a new theory that unifies previous experimental observations on partitioning and heuristic observations on column permutation, all of which are used to improve compression rates. Based on the theory, we devise the first on-line training algorithms for table compression, which can be applied to individual files, not just continuously operating sources; and also a new, off-line training algorithm, based on a link to the asymmetric traveling salesman problem, which improves on prior work by rearranging columns prior to partitioning. We demonstrate these results experimentally. On various test files, the on-line algorithms provide 35-55% improvement over gzip with negligible slowdown; the off-line reordering provides up to 20% further improvement over partitioning alone. We also show that a variation of the table compression problem is MAX-SNP hard.
    Comment: 22 pages, 2 figures, 5 tables, 23 references. Extended abstract appears in Proc. 13th ACM-SIAM SODA, pp. 213-222, 2002.
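
    As a rough sketch of the partition-training idea, compressing disjoint intervals of columns separately with gzip, the following TypeScript fragment is illustrative only; the column partition passed in is a hypothetical input, not the output of the training algorithms described above.

```typescript
// Minimal sketch: compress a table as disjoint intervals of columns,
// each interval gzipped on its own, and report the total size.
import { gzipSync } from "node:zlib";

type Table = string[][]; // rows x columns

// Serialize a contiguous interval of columns [start, end) and gzip it alone.
function compressInterval(table: Table, start: number, end: number): Buffer {
  const slice = table.map((row) => row.slice(start, end).join("\t")).join("\n");
  return gzipSync(slice);
}

// Compress the table under a given partition (interval boundaries) and
// return the total compressed size in bytes.
function compressedSize(table: Table, boundaries: number[]): number {
  let total = 0;
  for (let i = 0; i + 1 < boundaries.length; i++) {
    total += compressInterval(table, boundaries[i], boundaries[i + 1]).length;
  }
  return total;
}

// Example: whole table as one interval vs. a hypothetical two-interval partition.
const table: Table = [
  ["2023", "NY", "42", "A"],
  ["2023", "NY", "17", "B"],
  ["2024", "CA", "42", "A"],
];
console.log("one interval:", compressedSize(table, [0, 4]), "bytes");
console.log("partitioned: ", compressedSize(table, [0, 2, 4]), "bytes");
```

    On a toy table like this the per-interval gzip header overhead dominates, so partitioning does not pay off; the gains reported in the abstract are for massive tables, where grouping columns with similar content lets the compressor exploit local redundancy within each interval.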

    Topological code Autotune

    Many quantum systems are being investigated in the hope of building a large-scale quantum computer. All of these systems suffer from decoherence, resulting in errors during the execution of quantum gates. Quantum error correction enables reliable quantum computation given unreliable hardware. Unoptimized topological quantum error correction (TQEC), while still effective, performs very suboptimally, especially at low error rates. Hand-optimizing the classical processing associated with a TQEC scheme for a specific system to achieve better error tolerance can be extremely laborious. We describe a tool, Autotune, capable of performing this optimization automatically, and give two highly distinct examples of its use in which it greatly outperforms unoptimized TQEC. Autotune is designed to facilitate the precise study of real hardware running TQEC with every quantum gate having a realistic, physics-based error model.
    Comment: 13 pages, 17 figures, version accepted for publication

    Modelling Organic Dairy Production Systems

    In this study, a large number of organic dairy production strategies were compared in terms of physical and financial performance through the integrated use of computer simulation models and organic case study farm data. Production and financial data from three organic case study farms were used as a basis for the modelling process, to ensure that the modelled systems were based on real sets of resources that might be available to a farmer. The case study farms were selected to represent a range of farming systems in terms of farm size, concentrate use and location. This paper describes the process used to model the farm systems: the integration of the three models used, and the use of indicators to assess the modelled farm systems in terms of physical sustainability and financial performance.

    Refactoring Legacy JavaScript Code to Use Classes: The Good, The Bad and The Ugly

    JavaScript systems are becoming increasingly complex and large. To tackle the challenges involved in implementing these systems, the language is evolving to include several constructions for programming-in-the-large. For example, although the language is prototype-based, the latest JavaScript standard, named ECMAScript 6 (ES6), provides native support for implementing classes. Even though most modern web browsers support ES6, only a few applications use the class syntax. In this paper, we analyze the process of migrating structures that emulate classes in legacy JavaScript code to adopt the new syntax for classes introduced by ES6. We apply a set of migration rules on eight legacy JavaScript systems. In our study, we document: (a) cases that are straightforward to migrate (the good parts); (b) cases that require manual and ad-hoc migration (the bad parts); and (c) cases that cannot be migrated due to limitations and restrictions of ES6 (the ugly parts). Six out of eight systems (75%) contain instances of bad and/or ugly cases. We also collect the perceptions of JavaScript developers about migrating their code to use the new syntax for classes.
    Comment: Paper accepted at the 16th International Conference on Software Reuse (ICSR), 2017; 16 pages
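
    The following TypeScript sketch illustrates the kind of rewrite the study examines: a class emulated with a constructor function and prototype assignments, migrated to the ES6 class syntax. The Queue example is invented for illustration and is not drawn from the eight systems analyzed.

```typescript
// Legacy, prototype-based emulation of a class (pre-ES6 style):
function LegacyQueue(this: any) {
  this.items = [];
}
LegacyQueue.prototype.enqueue = function (this: any, item: unknown) {
  this.items.push(item);
};
LegacyQueue.prototype.dequeue = function (this: any) {
  return this.items.shift();
};

// The same abstraction after migration to the ES6 class syntax:
class Queue {
  private items: unknown[] = [];
  enqueue(item: unknown): void {
    this.items.push(item);
  }
  dequeue(): unknown {
    return this.items.shift();
  }
}
```

    A one-to-one rewrite like this corresponds to the "good" cases the paper documents; the "bad" and "ugly" cases involve legacy idioms that do not map as cleanly onto the class syntax.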

    Extension of four-dimensional atmospheric models

    The cloud data bank, the 4-D atmospheric model, and a set of computer programs designed to simulate meteorological conditions for any location above the earth are described in terms of space vehicle design and simulation of vehicle reentry trajectories. Topics discussed include: the relationship between satellite-observed and surface-observed cloud cover, using LANDSAT 1 photographs and including the effects of cloud shadows; extension of the 4-D model to an altitude of 52 km; and addition of the u and v wind components to the 4-D model of means and variances at 1 km levels from the surface to 25 km. Results of the cloud cover analysis are presented along with the stratospheric model and the tropospheric wind profiles.