
    Visibility in space - Target description subroutine

    Computer subroutine for use in calculating visibility of the Lunar Excursion Module /LEM/ ascent stage during moon orbit rendezvous with the Command Service Module /CSM/.

    An Efficient Algorithm For Chinese Postman Walk on Bi-directed de Bruijn Graphs

    Sequence assembly from short reads is an important problem in biology. It is known that solving the sequence assembly problem exactly on a bi-directed de Bruijn graph or a string graph is intractable. However, finding a Shortest Double-stranded DNA string (SDDNA) containing all the k-long words in the reads seems to be a good heuristic for getting close to the original genome. This problem is equivalent to finding a cyclic Chinese Postman (CP) walk on the underlying un-weighted bi-directed de Bruijn graph built from the reads. The Chinese Postman walk Problem (CPP) is solved by reducing it to a general bi-directed flow on this graph, which runs in O(|E|^2 log^2(|V|)) time. In this paper we show that the cyclic CPP on bi-directed graphs can be solved without reducing it to bi-directed flow. We present an O(p(|V| + |E|) log(|V|) + (d_max p)^3) time algorithm to solve the cyclic CPP on a weighted bi-directed de Bruijn graph, where p = max{|{v : d_in(v) - d_out(v) > 0}|, |{v : d_in(v) - d_out(v) < 0}|} and d_max = max{|d_in(v) - d_out(v)|}. Our algorithm performs asymptotically better than the bi-directed flow algorithm when the number of imbalanced nodes p is much smaller than the number of nodes in the bi-directed graph. From our experimental results on various datasets, we have noticed that the value of p/|V| lies between 0.08% and 0.13% with 95% probability.
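    The parameters p and d_max in the abstract count and bound the degree-imbalanced nodes of the graph. A minimal sketch (not from the paper; the graph representation here is hypothetical) of how they could be computed from per-node in/out degrees:

    ```python
    def imbalance_params(din, dout):
        """din, dout: dicts mapping node -> in-degree / out-degree.

        Returns (p, d_max) as defined in the abstract:
        p     = max count of surplus vs. deficit nodes,
        d_max = largest absolute degree imbalance.
        """
        surplus = [v for v in din if din[v] - dout[v] > 0]  # more incoming edges
        deficit = [v for v in din if din[v] - dout[v] < 0]  # more outgoing edges
        p = max(len(surplus), len(deficit))
        d_max = max(abs(din[v] - dout[v]) for v in din)
        return p, d_max

    # Toy graph: node "a" has 2 surplus incoming edges, "d" has 2 surplus outgoing.
    din  = {"a": 2, "b": 1, "c": 1, "d": 0}
    dout = {"a": 0, "b": 1, "c": 1, "d": 2}
    print(imbalance_params(din, dout))  # -> (1, 2)
    ```

    When most nodes are balanced (din == dout), p stays tiny, which is the regime where the paper's algorithm beats the bi-directed flow reduction.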

    EZ: A Tool for Automatic Redshift Measurement

    We present EZ (Easy redshift), a tool we have developed within the VVDS project to help in redshift measurement from optical spectra. EZ has been designed with large spectroscopic surveys in mind, and in its development particular care has been given to the reliability of the results obtained in an automatic and unsupervised mode. Nevertheless, the possibility of running it interactively has been preserved, and a graphical user interface for inspecting results has been designed. EZ has been successfully used within the VVDS project, as well as the zCosmos one. In this paper we describe its architecture and the algorithms used, and evaluate its performance on both simulated and real data. EZ is an open source program, freely downloadable from http://cosmos.iasf-milano.inaf.it/pandora. Comment: accepted for publication in Publications of the Astronomical Society of the Pacific.
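    EZ's actual algorithms are described in the paper; as illustration only, the core idea of template-based redshift measurement is that a feature emitted at rest wavelength w appears at w * (1 + z). A toy grid search over trial redshifts, with hypothetical line lists:

    ```python
    def estimate_z(observed_lines, template_lines, z_grid):
        """Pick the trial redshift whose shifted template best matches observed lines.

        observed_lines: wavelengths (Angstrom) of detected spectral features.
        template_lines: rest-frame wavelengths of the same features.
        """
        def cost(z):
            shifted = [w * (1 + z) for w in template_lines]
            # Sum of distances from each observed line to its nearest shifted line.
            return sum(min(abs(o - s) for s in shifted) for o in observed_lines)
        return min(z_grid, key=cost)

    # Toy example: [OII] 3727 and H-beta 4861 observed at redshift z = 0.5.
    template = [3727.0, 4861.0]
    observed = [w * 1.5 for w in template]
    z_grid = [i / 100 for i in range(101)]  # trial redshifts 0.00 .. 1.00
    print(estimate_z(observed, template, z_grid))  # -> 0.5
    ```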

    WP 76 - Comparing different weighting procedures for volunteer web surveys

    The strengths and weaknesses of web surveys have been widely described in the literature. Of particular interest is the question of to what degree the obtained results can be generalised to the whole population. To deal with this problem, weighting adjustments such as post-stratification and propensity score adjustment (PSA) have been seen as a possible solution. In the scientific community, however, PSA in particular has traditionally not been applied in the field of surveys; there is minimal evidence for its applicability and performance, and the implications are not conclusive. Against this background, the paper explores the two statistical weighting procedures for the German and Dutch WageIndicator Survey 2006. To evaluate the effectiveness of the weighting techniques in adjusting biases arising from non-randomised sample selection, the existing selection bias has been explored, and the efficiency of the weights has been tested by comparing un-weighted and weighted results with those obtained from the German SOEP and the Dutch OSA Panel for the same year. The results reveal that the impact of the applied weights is very limited and that the different weighting methods using balancing variables do not make web survey data more comparable to the general population. This holds for the German as well as for the Dutch sample.
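    Of the two weighting procedures the paper compares, post-stratification is the simpler to illustrate: each respondent receives a weight equal to their stratum's population share divided by its sample share, so the weighted sample reproduces known population margins. A minimal sketch (the strata and shares below are invented for illustration):

    ```python
    from collections import Counter

    def poststrat_weights(sample_strata, population_shares):
        """One weight per respondent: population share / sample share of their stratum."""
        n = len(sample_strata)
        sample_shares = {s: c / n for s, c in Counter(sample_strata).items()}
        return [population_shares[s] / sample_shares[s] for s in sample_strata]

    # Toy data: a volunteer web survey over-represents the "young" stratum.
    sample = ["young"] * 8 + ["old"] * 2        # 80% young in the sample
    pop = {"young": 0.5, "old": 0.5}            # 50/50 in the population
    w = poststrat_weights(sample, pop)
    # Young respondents are down-weighted (0.5/0.8 = 0.625),
    # old respondents are up-weighted (0.5/0.2 = 2.5); weights sum to n.
    ```

    PSA works differently: it models the probability of being in the volunteer sample (e.g. with a logistic regression on the balancing variables) and weights by the inverse of that propensity, but both approaches can only correct for variables actually included in the model, which is consistent with the paper's limited-impact finding.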

    Error-corrected quantum sensing

    Quantum metrology has many important applications in science and technology, ranging from frequency spectroscopy to gravitational wave detection. Quantum mechanics imposes a fundamental limit on measurement precision, called the Heisenberg limit, which can be achieved for noiseless quantum systems but is not achievable in general for systems subject to noise. Here we study how measurement precision can be enhanced through quantum error correction, a general method for protecting a quantum system from the damaging effects of noise. We find a necessary and sufficient condition for achieving the Heisenberg limit using quantum probes subject to Markovian noise, assuming that noiseless ancilla systems are available and that fast, accurate quantum processing can be performed. When the sufficient condition is satisfied, the quantum error-correcting code achieving the best possible precision can be found by solving a semidefinite program. We also show that noiseless ancillas are not needed when the signal Hamiltonian and the error operators commute. Finally, we provide two explicit, archetypal examples of quantum sensors: qubits undergoing dephasing and a lossy bosonic mode.

    Using interaction-based readouts to approach the ultimate limit of detection noise robustness for quantum-enhanced metrology in collective spin systems

    We consider the role of detection noise in quantum-enhanced metrology in collective spin systems, and derive a fundamental bound for the maximum obtainable sensitivity for a given level of added detection noise. We then present an interaction-based readout utilising the commonly used one-axis twisting scheme that approaches this bound for states generated via several commonly considered methods of generating quantum enhancement, such as one-axis twisting, two-axis counter-twisting, twist-and-turn squeezing, quantum non-demolition measurements, and adiabatically scanning through a quantum phase transition. We demonstrate that this method performs significantly better than other recently proposed interaction-based readouts. These results may help provide improved sensitivity for quantum sensing devices in the presence of unavoidable detection noise. This work was supported by Australian Research Council Discovery Project DP190101709.

    Relativistic model of hidden bottom tetraquarks

    The relativistic model of ground-state and excited heavy tetraquarks with hidden bottom is formulated within the diquark-antidiquark picture. The diquark structure is taken into account by calculating the diquark-gluon vertex in terms of the diquark wave functions. Predictions for the masses of bottom counterparts to the charm tetraquark candidates are given. Comment: 6 pages.

    Implementation of a RESTful API to Synchronize Data in a Web-Based Library Automation System, Evaluated with RMSE and White-Box Testing

    To solve the data redundancy problem that occurs between the two STT-PLN library automation systems, Senayan and Cendana, a web service has been designed to combine the two systems so that redundant data no longer arises. The system was also created to prepare a database for the implementation of the library's new automation system, Akasia. The web service application can migrate and merge data from the Senayan and Cendana databases into the Akasia database, and it supports creating additional fields and tables that were not yet present in the initial Akasia schema. The REST architecture is applied here to test its ability to handle migration and merging of large volumes of data. Testing was carried out both on localhost and over the existing LAN in the IT-PLN building, with two types of tests: speed and validation. The speed test measured the estimated transfer time for datasets of 4,000, 6,000, 8,000 and 10,000 records. The results show that migrating data through the REST architecture shortens migration time, because the data transferred from the source database is in JSON format, which is smaller than the same data in SQL format; the RESTful API then converts the JSON into arrays and inserts them into the Akasia database, so the transfer from the source database server to the RESTful API server remains fast. The validation test using RMSE yielded a very small value, namely 0. From these results it can be stated that the Akasia database produced by the localhost migration has sufficient data accuracy to serve as the new system's database.
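    The validation metric in the abstract is straightforward: an RMSE of 0 between source and migrated values means every migrated value equals its source exactly. A minimal sketch (the compared values below are invented for illustration):

    ```python
    import math

    def rmse(source_values, migrated_values):
        """Root-mean-square error between paired source and migrated values."""
        assert len(source_values) == len(migrated_values)
        sq = [(s - m) ** 2 for s, m in zip(source_values, migrated_values)]
        return math.sqrt(sum(sq) / len(sq))

    # E.g. comparing per-batch record counts before and after migration.
    source   = [4000, 6000, 8000, 10000]
    migrated = [4000, 6000, 8000, 10000]
    print(rmse(source, migrated))  # -> 0.0, i.e. a lossless migration
    ```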