A Survey on Homomorphic Encryption Schemes: Theory and Implementation
Legacy encryption systems depend on sharing a key (public or private) among
the peers involved in exchanging an encrypted message. However, this approach
poses privacy concerns. Especially with popular cloud services, the control
over the privacy of the sensitive data is lost. Even when the keys are not
shared, the encrypted material is shared with a third party that does not
necessarily need to access the content. Moreover, untrusted servers, providers,
and cloud operators can retain identifying elements of users long after users
end their relationship with the services. Homomorphic Encryption (HE), a
special kind of encryption scheme, can address these concerns as it allows any
third party to operate on the encrypted data without decrypting it in advance.
Although this extremely useful feature of HE schemes has been known for over
30 years, the first plausible and achievable Fully Homomorphic Encryption (FHE)
scheme, which allows any computable function to be performed on encrypted data,
was introduced by Craig Gentry in 2009. Even though this was a major
achievement, implementations so far have demonstrated that FHE still needs
significant improvement to be practical on every platform. First, we
present the basics of HE and the details of the well-known Partially
Homomorphic Encryption (PHE) and Somewhat Homomorphic Encryption (SWHE), which
are important pillars of achieving FHE. Then, the main FHE families, which have
become the basis for follow-up FHE schemes, are presented. Furthermore,
the implementations and recent improvements in Gentry-type FHE schemes are also
surveyed. Finally, further research directions are discussed. This survey is
intended to give a clear overview and foundation to researchers and
practitioners interested in learning about, applying, and extending
state-of-the-art HE, PHE, SWHE, and FHE systems.
Comment: Updated October 6, 2017. This paper is an early draft of the survey
being submitted to ACM CSUR and has been uploaded to arXiv for feedback from
stakeholders.
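The homomorphic property the survey describes, a third party operating on ciphertexts without decrypting them, can be sketched with textbook RSA, which is multiplicatively homomorphic and thus a PHE example. The parameters below are tiny illustrative assumptions, wholly insecure for real use, and this is not any specific scheme from the survey.

```python
# Toy RSA as a Partially Homomorphic Encryption (PHE) sketch:
# textbook RSA is multiplicatively homomorphic, so ciphertexts can be
# multiplied without decryption. Tiny, insecure demo parameters only.
p, q = 61, 53
n = p * q                  # modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 11
# A third party multiplies the ciphertexts without seeing the plaintexts...
c_prod = (enc(m1) * enc(m2)) % n
# ...and the key holder decrypts the product of the plaintexts.
assert dec(c_prod) == (m1 * m2) % n   # 7 * 11 = 77
```

FHE extends this idea from a single operation (here, multiplication) to arbitrary computable functions on encrypted data.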
10161 Abstracts Collection -- Decision Procedures in Software, Hardware and Bioware
From April 19th, 2010 to April 23rd, 2010, the Dagstuhl Seminar 10161
"Decision Procedures in Soft, Hard and Bio-ware"
was held in Schloss Dagstuhl Leibniz Center for Informatics.
During the seminar, several participants presented their current research,
and ongoing work and open problems were discussed. Abstracts of the
presentations given during the seminar, together with links to slides, links
to the papers behind the presentations, and papers produced as a result of
the seminar, are collected in this paper. The first section describes the
seminar topics and goals in general. Links to extended abstracts or full
papers are provided where available.
On the capture and representation of fonts
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The commercial need to capture, process and represent the shape and form of an outline has led to the development of a number of spline routines. These use a mathematical curve format that approximates the contours of a given shape. The modelled outline lends itself to be used on, and for, a variety of purposes, including graphic screens, laser printers and numerically controlled machines. The latter can be employed for cutting foil, metal, plastic and stone. One of the most widely used software design packages has been the IKARUS system. This, developed by URW of Hamburg (Germany), employs a number of mathematical descriptions that facilitate the process of both modelling and representation of font characters. It uses a variety of curve formats, including Bezier cubics, general conics and parabolics. The work reported in this dissertation focuses on developing improved techniques, primarily for the IKARUS system. This includes two algorithms which allow a Bezier cubic description, two for a general conic representation and two for the parabolic case. In addition, a number of algorithms are presented which provide conversions between these mathematical forms, for example, Bezier cubics to a general conic form. Furthermore, algorithms are developed to assist the process of rasterising both cubic and quadratic arcs. This study was partly funded by the Science and Engineering Research Council (SERC).
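The Bezier cubics mentioned above are conventionally evaluated by de Casteljau's algorithm, repeated linear interpolation of the control polygon. The sketch below illustrates that standard construction, not the thesis's own IKARUS algorithms, and the control points are made-up examples.

```python
def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by repeatedly
    interpolating adjacent control points (de Casteljau's algorithm)."""
    pts = list(points)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# A cubic has four control points; collinear, evenly spaced controls keep
# the curve on the line, so the midpoint is easy to verify by hand.
ctrl = [(0, 0), (1, 1), (2, 2), (3, 3)]
mid = de_casteljau(ctrl, 0.5)   # (1.5, 1.5)
```

The same routine handles the quadratic (parabolic) case with three control points, since the reduction loop simply terminates one step earlier.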
Multilinear Maps in Cryptography
Multilinear maps play an increasingly important role in modern cryptography. This thesis addresses the construction, application, and improvement of multilinear maps.
Optimal Sizing and Location of Static and Dynamic Reactive Power Compensation
The key to reactive power planning (RPP), or Var planning, is the optimal allocation of reactive power sources considering both location and size. Traditionally, the locations for placing new Var sources were either simply estimated or directly assumed. Recent research has presented rigorous optimization-based methods for RPP. Different constraints distinguish the various optimization models, identified as the Optimal Power Flow (OPF) model, the Security-Constrained OPF (SCOPF) model, and the Voltage-Stability-Constrained OPF (VSCOPF) model.
First, this work investigates the economic benefits of local reactive power compensation, including reduced losses, shifting reactive power flow to real power flow, and increased transfer capability. These three categories of benefits are then applied to Var planning considering different locations and amounts of Var compensation in an enumeration method, although this requires many OPF runs.
Next, the VSCOPF model with two sets of variables, corresponding to the “normal operating point (o)” and the “collapse point (*)” respectively, is used to obtain an efficient model. Finally, an interpolation approximation method is adopted to simplify this VSCOPF model by approximating the total transfer capability (TTC) function, thereby eliminating the set of variables and constraints related to the collapse point. The interpolation method is also compared with the least-squares method from the literature to show its advantages. Interestingly, test results on a seven-bus system show that continuously increasing Var compensation is not always economically efficient.
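The contrast drawn above between interpolation and least squares can be sketched generically: an interpolating polynomial reproduces the sampled TTC values exactly, while a least-squares fit only minimises the overall residual. The sample numbers below are hypothetical, purely for illustration, and this is Lagrange interpolation, not necessarily the thesis's exact formulation.

```python
def lagrange(xs, ys):
    """Return the polynomial passing exactly through the points (x_i, y_i)."""
    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return f

# Hypothetical samples of a TTC-vs-compensation curve (illustrative only):
q_mvar = [0.0, 50.0, 100.0]      # Var compensation level (MVar)
ttc_mw = [900.0, 960.0, 990.0]   # corresponding transfer capability (MW)
ttc_approx = lagrange(q_mvar, ttc_mw)
# ttc_approx reproduces the sampled values exactly at q = 0, 50, 100.
```

Replacing the full collapse-point equations with such a fitted scalar function is what removes one whole set of variables and constraints from the VSCOPF.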
Theory and applications of multi-dimensional stationary stochastic processes
The theory of stationary stochastic processes in several dimensions
has been investigated to provide a general model which may be applied to
various problems involving unknown functions of several variables.
In particular, when values of the function are known only at a finite set
of points, treating the unknown function as a realisation of a stationary
stochastic process leads to an interpolating function which reproduces the
values exactly at the given points. With suitable choice of auto-correlation
for the model, the interpolating function may also be shown to be continuous
in all its derivatives everywhere. A few parameters only need to be found
for the interpolator, and these may be estimated from the given data.
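The exact-reproduction property described above can be sketched with a Gaussian auto-correlation model: solving K w = y for the correlation matrix K guarantees the interpolator matches every data value. The kernel choice, parameter theta, and data points below are illustrative assumptions, not the thesis's actual model.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, n):
            factor = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= factor * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def corr(x, y, theta=1.0):
    """Gaussian auto-correlation: smooth in all derivatives everywhere."""
    return math.exp(-theta * (x - y) ** 2)

def interpolator(xs, ys, theta=1.0):
    """Return f with f(x_i) = y_i exactly, by solving K w = y."""
    K = [[corr(a, b, theta) for b in xs] for a in xs]
    w = solve(K, ys)
    return lambda x: sum(wi * corr(x, xi, theta) for wi, xi in zip(w, xs))

xs, ys = [0.0, 1.0, 2.5], [1.0, -0.5, 2.0]
f = interpolator(xs, ys)
# f(0.0), f(1.0), f(2.5) reproduce 1.0, -0.5, 2.0 (to rounding error)
```

Only the single parameter theta (plus the weights implied by the data) needs to be chosen, matching the "few parameters" observation above.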
One problem tackled using such an interpolator is that of automatic
contouring of functions of two variables from arbitrarily scattered data
points. A "two-stage" model was developed, which incorporates a long-range "trend" component as well as a shorter-range "residual" term. This leads
to a contouring algorithm which gives good results with difficult data.
The second area of application is that of optimisation, particularly of
objective functions which are expensive to compute. Since the interpolator
gives an estimate of the derivatives with little work, it is simple to
optimise it using conventional techniques, and to re-evaluate the true
function at the apparent optimum point. An iterative algorithm along these
lines gives good results with test functions, especially with functions of
more than two variables. A program has been developed which incorporates
both the optimisation and contouring applications into a single package.
Finally, the theory of excursions of a stationary process above a
fixed level has been applied to the problem of modelling the occurrence
of oilfields, with special reference to their spatial distribution and
tendency to cluster. An intuitively reasonable model with few parameters
has been developed and applied to North Sea data, with interesting results.