Partial reflections of radio waves from the lower ionosphere
The addition of phase-difference measurements to partial-reflection experiments is discussed, and some advantages of measuring electron density this way are pointed out. The additional information obtained reduces the need for an accurate predetermination of collision frequency. Calculations are also made to estimate the errors expected in partial-reflection experiments due to the assumption of Fresnel reflection and to the neglect of coupling between modes. In both cases, the errors are found to be of the same order as known measurement errors due to current instrumental limitations.
Secure data replication over untrusted hosts
In the Internet age, data replication is a popular technique for achieving fault tolerance and improved performance. With the advent of content delivery networks, it is increasingly common for data content to be placed on hosts that are not directly controlled by the content owner, and because of this, security mechanisms to protect data integrity are necessary. In this paper we present a system architecture that allows arbitrary queries to be supported on data content replicated on untrusted servers. To prevent these servers from returning erroneous answers to client queries, we make use of a small number of trusted hosts that randomly check these answers and take corrective action whenever necessary. Additionally, our system employs an audit mechanism that guarantees that any untrusted server acting maliciously will eventually be detected and excluded from the system.
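The random spot-checking idea above can be sketched in a few lines. The following is a minimal toy illustration, not the paper's actual protocol: the `Replica`, `audit`, and sampling parameters are all hypothetical names introduced here, and a real system would compare cryptographic digests rather than raw values.

```python
import random

# Toy sketch (not the paper's protocol): a trusted auditor spot-checks
# answers from untrusted replicas against the owner's authoritative copy.

class Replica:
    def __init__(self, name, data, malicious=False):
        self.name = name
        self.data = dict(data)
        self.malicious = malicious

    def query(self, key):
        value = self.data.get(key)
        # A malicious replica returns a corrupted answer.
        return "corrupted" if self.malicious else value

def audit(replicas, authoritative, samples=5, rng=random):
    """Randomly re-check replica answers; return the servers caught lying."""
    caught = set()
    keys = list(authoritative)
    for replica in replicas:
        for key in rng.sample(keys, min(samples, len(keys))):
            if replica.query(key) != authoritative[key]:
                caught.add(replica.name)   # flag for exclusion from the system
                break
    return caught

authoritative = {f"k{i}": f"v{i}" for i in range(10)}
replicas = [
    Replica("r1", authoritative),
    Replica("r2", authoritative, malicious=True),
]
print(audit(replicas, authoritative))  # {'r2'}
```

Because each check samples keys at random, a server that answers incorrectly with any non-negligible frequency is eventually caught, which matches the eventual-detection guarantee the abstract describes.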
Safe and Private Data Sharing with Turtle: Friends Team-Up and Beat the System
In this paper we describe Turtle, a peer-to-peer architecture for safe sharing of sensitive data. The truly revolutionary aspect of Turtle rests in its novel way of dealing with trust issues: while existing peer-to-peer architectures with similar aims attempt to build trust relationships on top of the basic, trust-agnostic, peer-to-peer overlay, Turtle takes the opposite approach and builds its overlay on top of pre-existing trust relationships among its users. This allows both data-sender and data-receiver anonymity, while also protecting each and every intermediate relay in the data query path. Furthermore, its unique trust model allows Turtle to withstand most of the denial-of-service attacks that plague other peer-to-peer data-sharing networks.
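The core structural idea, queries travelling only along pre-existing friendship edges so each node talks solely to peers it already trusts, can be sketched as a hop-by-hop flood over a friend graph. This is an illustrative assumption-laden sketch, not Turtle's code: the graph, item names, and `flood_query` function are invented here for demonstration.

```python
from collections import deque

# Illustrative sketch: queries travel only along pre-existing friendship
# edges, hop by hop, so every node communicates solely with trusted peers.

friends = {  # hypothetical trust graph
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice"},
    "dave": {"bob"},
}
holdings = {"dave": {"song.ogg"}}

def flood_query(origin, item, graph, holdings, ttl=3):
    """Forward the query friend-to-friend; return the paths that hit."""
    hits, seen = [], {origin}
    queue = deque([(origin, [origin], ttl)])
    while queue:
        node, path, hops_left = queue.popleft()
        if item in holdings.get(node, ()):
            hits.append(path)
        if hops_left == 0:
            continue
        for peer in graph.get(node, ()):
            if peer not in seen:
                seen.add(peer)
                queue.append((peer, path + [peer], hops_left - 1))
    return hits

print(flood_query("alice", "song.ogg", friends, holdings))
# [['alice', 'bob', 'dave']]
```

Note that alice learns a result without ever contacting dave directly; every link in the relay path is a trusted friendship, which is the property the abstract highlights.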
The students at the Tréfilerie media library in Saint-Étienne
The concept of an Ephemerizer system has been introduced in earlier works as a mechanism to ensure that a file deleted from persistent storage remains unrecoverable. The principle involves storing the data in encrypted form on the user's machine and the key to decrypt the data on a physically separate machine. However, the schemes proposed so far neither support fine-grained user settings on the lifetime of the data nor provide any mechanism to check the integrity of the system that is using the secret data. In addition, we report a vulnerability in one version of the proposed scheme that can be exploited by an attacker to nullify the ephemeral nature of the keys. We propose and discuss in detail an alternate scheme, powered by an Identity-Based cryptosystem, that overcomes the identified limitations of the original system.
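The Ephemerizer principle of separating ciphertext from key custody can be illustrated with a toy sketch. Everything here is a deliberately insecure stand-in, assuming a SHA-256-based XOR keystream in place of real encryption and a `KeyServer` class invented for this example; it is not the paper's cryptosystem.

```python
import hashlib
import secrets

# Toy illustration of the Ephemerizer idea (NOT secure, not the paper's
# scheme): ciphertext stays with the user, the key lives on a separate
# server, and once the server expires the key the data is unrecoverable.

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher built from SHA-256 counter blocks.
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

class KeyServer:
    """Holds decryption keys with an expiry time; expired keys are erased."""
    def __init__(self):
        self._keys = {}  # key_id -> (key, expires_at)

    def register(self, expires_at: int) -> str:
        key_id = secrets.token_hex(8)
        self._keys[key_id] = (secrets.token_bytes(32), expires_at)
        return key_id

    def get_key(self, key_id: str, now: int) -> bytes:
        key, expires_at = self._keys[key_id]
        if now >= expires_at:
            del self._keys[key_id]        # the key is gone for good
            raise KeyError("key expired")
        return key

server = KeyServer()
key_id = server.register(expires_at=100)
secret = keystream_xor(server.get_key(key_id, now=10), b"tax records")
# Before expiry the user can still decrypt:
print(keystream_xor(server.get_key(key_id, now=50), secret))  # b'tax records'
# After expiry the server refuses, and only undecryptable ciphertext remains.
```

The fine-grained lifetime settings the paper calls for would correspond, in this sketch, to the per-key `expires_at` value chosen by the user at registration time.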
METAREP: JCVI metagenomics reports—an open source tool for high-performance comparative metagenomics
Summary: JCVI Metagenomics Reports (METAREP) is a Web 2.0 application designed to help scientists analyze and compare annotated metagenomics datasets. It utilizes Solr/Lucene, a high-performance scalable search engine, to quickly query large data collections. Furthermore, users can use its SQL-like query syntax to filter and refine datasets. METAREP provides graphical summaries for top taxonomic and functional classifications, as well as GO, NCBI Taxonomy and KEGG Pathway browsers. Users can compare absolute and relative counts of multiple datasets at various functional and taxonomic levels. Advanced comparative features comprise statistical tests as well as multidimensional scaling, heatmap and hierarchical clustering plots. Summaries can be exported as tab-delimited files or as publication-quality plots in PDF format. A data management layer allows collaborative data analysis and result sharing.
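The relative-count comparison METAREP offers can be pictured with a small sketch. This is not METAREP code; the dataset names and KEGG-style category labels are invented for illustration.

```python
from collections import Counter

# Hedged sketch of the kind of comparison METAREP performs: absolute and
# relative counts of functional annotations across two datasets.

def relative_counts(annotations):
    """Map each functional category to its fraction of all annotations."""
    counts = Counter(annotations)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

soil = ["KEGG:glycolysis"] * 6 + ["KEGG:photosynthesis"] * 4
marine = ["KEGG:glycolysis"] * 3 + ["KEGG:photosynthesis"] * 7

for name, data in [("soil", soil), ("marine", marine)]:
    print(name, relative_counts(data))
# soil   -> glycolysis 0.6, photosynthesis 0.4
# marine -> glycolysis 0.3, photosynthesis 0.7
```

Comparing relative rather than absolute counts normalizes away differences in sequencing depth between datasets, which is why tools in this space report both.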
Classification in sparse, high dimensional environments applied to distributed systems failure prediction
Network failures are still one of the main causes of distributed systems' lack of reliability. To overcome this problem we present an improvement over a failure-prediction system, based on Elastic Net logistic regression and the application of rare-event prediction techniques, able to work with sparse, high-dimensional datasets. Specifically, we prove its stability, fine-tune its hyperparameters and improve its industrial utility by showing that, with a slight change in dataset creation, it can also predict the location of a failure, a key asset when taking a proactive approach to failure management.
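The core model, logistic regression with an elastic-net penalty, can be sketched from scratch. The following is a minimal pure-Python gradient-descent illustration on invented toy data; the paper's pipeline (sparse features, rare-event corrections, location prediction) is far richer.

```python
import math

# Minimal sketch: logistic regression with an elastic-net penalty
# (alpha scales the penalty; l1_ratio mixes L1 and L2 terms).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(X, y, alpha=0.01, l1_ratio=0.5, lr=0.1, epochs=500):
    """Gradient descent on logistic loss + alpha * elastic-net penalty."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            gb += err / n
            for j in range(d):
                gw[j] += err * xi[j] / n
        for j in range(d):
            # subgradient of alpha*(l1_ratio*|w| + (1-l1_ratio)/2 * w^2)
            gw[j] += alpha * (l1_ratio * (1 if w[j] > 0 else -1 if w[j] < 0 else 0)
                              + (1 - l1_ratio) * w[j])
            w[j] -= lr * gw[j]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5

# Toy separable data: a high feature value indicates an impending failure.
X = [[0.1], [0.2], [0.3], [0.9], [1.0], [1.1]]
y = [0, 0, 0, 1, 1, 1]
w, b = fit(X, y)
print(predict(w, b, [1.0]), predict(w, b, [0.1]))  # True False
```

The L1 component of the penalty drives uninformative weights to zero, which is what makes elastic-net models attractive in the sparse, high-dimensional regime the abstract describes.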
Spacings of Quarkonium Levels with the Same Principal Quantum Number
The spacings between bound-state levels of the Schrödinger equation with the same principal quantum number but orbital angular momenta differing by unity are found to be nearly equal for a wide range of power potentials. Semiclassical approximations are in accord with this behavior. The result is applied to estimates of masses for quarkonium levels which have not yet been observed, including the 2P states and the 1D states.
Priority diffusion model in lattices and complex networks
We introduce a model for diffusion of two classes of particles (A and B) with priority: where both species are present at the same site, the motion of A's takes precedence over that of B's. This describes realistic situations in wireless and communication networks. In regular lattices the diffusion of the two species is normal, but the B particles are significantly slower due to the presence of the A particles. From the fraction of sites where the B particles can move freely, which we compute analytically, we derive the diffusion coefficients of the two species. In heterogeneous networks the fraction of sites where B is free decreases exponentially with the degree of the sites. This, coupled with the accumulation of A particles in high-degree nodes, leads to trapping of the low-priority B particles in scale-free networks.
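The priority rule can be simulated directly. Below is an illustrative sketch on a 1-D ring lattice, labeling the high-priority species A and the low-priority species B; the paper treats general lattices and scale-free networks, and the update rule here (a B may only hop from a site currently hosting no A) is a simplified reading of the model.

```python
import random

# Sketch of priority diffusion on a 1-D ring: every A hops each step,
# while a B hops only if its site holds no A particle.

def step(positions_a, positions_b, size, rng):
    occupied_a = set(positions_a)
    # A particles always move to a random neighbour.
    positions_a[:] = [(p + rng.choice((-1, 1))) % size for p in positions_a]
    # B particles move only from A-free sites; otherwise they wait.
    positions_b[:] = [(p + rng.choice((-1, 1))) % size if p not in occupied_a
                      else p
                      for p in positions_b]

def fraction_b_free(positions_a, positions_b):
    """Fraction of B particles currently at a site with no A present."""
    occupied_a = set(positions_a)
    free = sum(1 for p in positions_b if p not in occupied_a)
    return free / len(positions_b)

rng = random.Random(1)
size = 100
a = [rng.randrange(size) for _ in range(50)]
b = [rng.randrange(size) for _ in range(50)]
for _ in range(100):
    step(a, b, size, rng)
print(round(fraction_b_free(a, b), 2))  # roughly 1 - (fraction of A-occupied sites)
```

On a homogeneous ring this free fraction stays well away from zero, so B's diffusion is merely slowed; the abstract's point is that on scale-free networks the same quantity collapses at high-degree hubs, trapping the B particles.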