Partial reflections of radio waves from the lower ionosphere
The addition of phase-difference measurements to partial-reflection experiments is discussed, and some advantages of measuring electron density this way are pointed out. The additional information obtained reduces the requirement for an accurate predetermination of collision frequency. Calculations are also made to estimate the errors expected in partial-reflection experiments due to the assumption of Fresnel reflection and to the neglect of coupling between modes. In both cases, the errors are found to be of the same order as known errors in the measurements due to current instrumental limitations.
Safe and Private Data Sharing with Turtle: Friends Team-Up and Beat the System
In this paper we describe Turtle, a peer-to-peer architecture for the safe sharing of sensitive data. The truly revolutionary aspect of Turtle rests in its novel way of dealing with trust issues: while existing peer-to-peer architectures with similar aims attempt to build trust relationships on top of a basic, trust-agnostic, peer-to-peer overlay, Turtle takes the opposite approach and builds its overlay on top of pre-existing trust relationships among its users. This allows both data-sender and data-receiver anonymity, while also protecting every intermediate relay in the data query path. Furthermore, its unique trust model allows Turtle to withstand most of the denial-of-service attacks that plague other peer-to-peer data-sharing networks.
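The core idea — routing a query hop by hop along pre-existing friendship edges only, so no node beyond an immediate neighbour learns who asked — can be illustrated with a short sketch. The graph representation, TTL mechanics, and function names below are our assumptions for illustration, not Turtle's actual protocol.

```python
from collections import deque

def flood_query(friend_graph, origin, keyword, data_store, ttl=3):
    """Flood a query along pre-existing friendship edges only.

    Each node forwards the query solely to its direct friends, so every
    node sees only its immediate neighbour, never the true originator.
    Returns the set of nodes holding the requested item within `ttl` hops.
    """
    hits = set()
    visited = {origin}
    frontier = deque([(origin, ttl)])
    while frontier:
        node, hops_left = frontier.popleft()
        if keyword in data_store.get(node, set()):
            hits.add(node)
        if hops_left == 0:
            continue
        for friend in friend_graph.get(node, []):
            if friend not in visited:
                visited.add(friend)
                frontier.append((friend, hops_left - 1))
    return hits

# Toy friend graph: alice-bob-carol chain; dave holds the item too but has
# no trust edge to anyone, so he is never queried.
graph = {"alice": ["bob"], "bob": ["alice", "carol"],
         "carol": ["bob"], "dave": []}
store = {"carol": {"song.ogg"}, "dave": {"song.ogg"}}
print(flood_query(graph, "alice", "song.ogg", store))  # {'carol'}
```

Note how the unreachable node illustrates the trade-off: content outside the trust graph is simply invisible, which is the price paid for building the overlay on social trust.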
Secure data replication over untrusted hosts
In the Internet age, data replication is a popular technique for achieving fault tolerance and improved performance. With the advent of content delivery networks, data content is increasingly placed on hosts that are not directly controlled by the content owner, so security mechanisms to protect data integrity are necessary. In this paper we present a system architecture that allows arbitrary queries to be supported on data content replicated on untrusted servers. To prevent these servers from returning erroneous answers to client queries, we make use of a small number of trusted hosts that randomly check these answers and take corrective action whenever necessary. Additionally, our system employs an audit mechanism that guarantees that any untrusted server acting maliciously will eventually be detected and excluded from the system.
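The random spot-checking idea can be sketched as follows; `replica` and `trusted` are hypothetical callables standing in for the untrusted server and a trusted checker, and the interface is our invention, not the paper's.

```python
import random

def audit_replica(replica, trusted, queries, check_prob=0.25, rng=None):
    """Re-execute a random fraction of answered queries on a trusted host.

    Returns the first query on which the untrusted replica is caught
    returning a wrong answer, or None if no mismatch was observed.
    """
    rng = rng or random.Random(0)
    for q in queries:
        if rng.random() < check_prob and replica(q) != trusted(q):
            return q  # corrective action (e.g. exclusion) would start here
    return None

# A replica that lies on query 7 is caught once that query is sampled;
# checking every query (check_prob=1.0) guarantees detection.
good = lambda q: 2 * q
bad = lambda q: 2 * q if q != 7 else -1
print(audit_replica(bad, good, range(10), check_prob=1.0))  # 7
```

With independent checks at probability p, a replica that cheats on every answer escapes n audited queries with probability (1 - p)^n, which is one way to see why persistent misbehaviour is eventually detected.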
Students at the Tréfilerie media library in Saint-Étienne
The concept of an Ephemerizer system has been introduced in earlier work as a mechanism to ensure that a file deleted from persistent storage remains unrecoverable. The principle involves storing the data in encrypted form on the user's machine and the key to decrypt the data on a physically separate machine. However, the schemes proposed so far neither support fine-grained user settings on the lifetime of the data nor offer any mechanism to check the integrity of the system that is using the secret data. In addition, we report a vulnerability in one version of the proposed scheme that can be exploited by an attacker to nullify the ephemeral nature of the keys. We propose and discuss in detail an alternative scheme, powered by an Identity-Based cryptosystem, that overcomes the identified limitations of the original system.
Classification in sparse, high dimensional environments applied to distributed systems failure prediction
Network failures remain one of the main causes of distributed systems' lack of reliability. To overcome this problem we present an improvement over a failure prediction system, based on Elastic Net Logistic Regression and the application of rare-event prediction techniques, able to work with sparse, high-dimensional datasets. Specifically, we prove its stability, fine-tune its hyperparameters, and improve its industrial utility by showing that, with a slight change in dataset creation, it can also predict the location of a failure, a key asset when taking a proactive approach to failure management.
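A minimal sketch of the underlying classifier — logistic regression with an elastic-net penalty and an up-weighted rare (failure) class — is shown below, assuming plain gradient descent and made-up parameter names; the paper's actual pipeline and solver are not reproduced here.

```python
import math

def train_elastic_net_logreg(X, y, l1=0.01, l2=0.01, lr=0.1,
                             epochs=200, pos_weight=1.0):
    """Logistic regression with an elastic-net penalty (L1 + L2) and an
    up-weighted positive class to compensate for rare failure events.
    Plain batch gradient descent on the weighted log-loss."""
    n_feat = len(X[0])
    w, b = [0.0] * n_feat, 0.0
    m = len(X)
    for _ in range(epochs):
        gw, gb = [0.0] * n_feat, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = (pos_weight if yi == 1 else 1.0) * (p - yi)
            gb += err
            for j, xj in enumerate(xi):
                gw[j] += err * xj
        b -= lr * gb / m
        for j in range(n_feat):
            # elastic net: L2 shrinkage plus L1 sub-gradient pull to zero
            sign = 1 if w[j] > 0 else -1 if w[j] < 0 else 0
            w[j] -= lr * (gw[j] / m + l2 * w[j] + l1 * sign)
    return w, b

def predict(w, b, x):
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))
```

The L1 term drives irrelevant weights to exactly zero, which is what makes the elastic net workable on the sparse, high-dimensional feature spaces the abstract describes; the L2 term keeps correlated features from being arbitrarily dropped.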
Priority diffusion model in lattices and complex networks
We introduce a model for the diffusion of two classes of particles (high priority and low priority): where both species are present on the same site, the motion of the high-priority particles takes precedence over that of the low-priority ones. This describes realistic situations in wireless and communication networks. In regular lattices the diffusion of the two species is normal, but the low-priority particles are significantly slower due to the presence of the high-priority particles. From the fraction of sites where the low-priority particles can move freely, which we compute analytically, we derive the diffusion coefficients of the two species. In heterogeneous networks the fraction of sites where the low-priority species is free decreases exponentially with the degree of the sites. This, coupled with the accumulation of high-priority particles in high-degree nodes, leads to trapping of the low-priority particles in scale-free networks. Comment: 5 pages, 3 figures
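The blocking rule can be illustrated with a toy simulation on a one-dimensional ring; the function names, parallel update order, and the `blocked_fraction` diagnostic are our assumptions for illustration, not the paper's exact dynamics.

```python
import random

def step(high, low, size, rng):
    """One parallel sweep on a ring of `size` sites: every particle
    attempts a hop to a random neighbour, but a low-priority particle
    may move only if no high-priority particle occupies its site."""
    high_occ = set(high)
    new_high = [(p + rng.choice((-1, 1))) % size for p in high]
    new_low = [p if p in high_occ  # blocked: high-priority present
               else (p + rng.choice((-1, 1))) % size
               for p in low]
    return new_high, new_low

def blocked_fraction(high, low):
    """Fraction of low-priority particles currently unable to move --
    the complement of the free-site fraction from which the abstract
    derives the low-priority diffusion coefficient."""
    high_occ = set(high)
    return sum(1 for p in low if p in high_occ) / len(low)
```

Averaging `blocked_fraction` over many sweeps gives an effective slowdown factor for the low-priority species; on a heterogeneous graph the same rule concentrates blocking at high-degree nodes, which is the trapping mechanism the abstract describes.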
Spacings of Quarkonium Levels with the Same Principal Quantum Number
The spacings between bound-state levels of the Schr\"odinger equation with the same principal quantum number but orbital angular momenta differing by unity are found to be nearly equal for a wide range of power-law potentials. Semiclassical approximations are in accord with this behavior. The result is applied to estimates of masses for quarkonium levels which have not yet been observed, including the 2P and 1D states. Comment: 20 pages, latex, 3 uuencoded figures submitted separately (process using psfig.sty)
Right Handed Weak Currents in Sum Rules for Axialvector Constant Renormalization
The recent experimental results on deep inelastic polarized lepton scattering off the proton, deuteron and He, together with polarized neutron \beta-decay data, are analyzed. It is shown that the problem of the Ellis-Jaffe and Bjorken sum rule deficiency and the neutron paradox could be solved simultaneously by assuming a small right-handed current (RHC) admixture in the weak-interaction Lagrangian. The possible RHC impact on the pion-nucleon \sigma-term and the Gamow-Teller sum rule for nuclear reactions is pointed out. Comment: to be published in Phys. Rev. Lett. LaTeX, 8 pages, 21 k
Process evaluation for complex interventions in primary care: understanding trials using the normalization process model
Background: The Normalization Process Model is a conceptual tool intended to assist in understanding the factors that affect implementation processes in clinical trials and other evaluations of complex interventions. It focuses on the ways in which the implementation of complex interventions is shaped by problems of workability and integration. Method: In this paper the model is applied to two different complex trials: (i) the delivery of problem-solving therapies for psychosocial distress, and (ii) the delivery of nurse-led clinics for heart failure treatment in primary care. Results: Application of the model shows how process evaluations need to focus on more than the immediate contexts in which trial outcomes are generated; problems relating to intervention workability and integration also need to be understood. The model may be used effectively to explain the implementation process in trials of complex interventions. Conclusion: The model invites evaluators to attend equally to how a complex intervention interacts with existing patterns of service organization, professional practice, and professional-patient interaction. The justification for this may be found in the abundance of reports of clinical effectiveness for interventions that have little hope of being implemented in real healthcare settings.
Electromagnetic form factors in the J/\psi mass region: The case in favor of additional resonances
Using the results of our recent analysis of e^+e^- annihilation, we plot the curves for the diagonal and transition form factors of light hadrons in the time-like region up to the open-charm production threshold. The comparison with existing data on the decays of J/\psi into such hadrons shows that some new resonance structures may be present in the mass range between 2 GeV and the J/\psi mass. Searching for them may help in a better understanding of the mass spectrum in both the simple and more sophisticated quark models, and in revealing the details of the three-gluon mechanism of OZI-rule breaking in the K\bar K channel. Comment: Formulas are added, typo is corrected, the text is rearranged. Replaced to match the version accepted in Phys. Rev.
