18,747 research outputs found
Prestress loss of post-tensioned clay diaphragm and fin brickwork
Unlike calcium silicate and concrete block masonry, which undergo shrinkage with time, clay brickwork has been known to expand instead. Expansion of the brick units in prestressed masonry will cause an increase in the prestressing force rather than a prestress loss. However, not all clay brickwork expands with time; higher strength clay units tend to undergo shrinkage. The main objective of this paper is to present experimental data obtained for prestress loss in post-tensioned high strength clay diaphragm and fin brickwork. The brickwork was built from class B clay engineering bricks with a compressive strength of 103 MPa, laid in designation (ii) mortar. The tests, which involved monitoring prestress loss, creep and shrinkage of the clay sections, were carried out over a period of 120 days. Using the 120-day experimental data, the predicted long-term prestress loss is 20
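As an illustration of how a long-term value can be extrapolated from such short-term measurements, the sketch below fits a hyperbolic time function to prestress-loss readings and reads off its asymptote. This is only a minimal Python sketch: the hyperbolic model, the initial guesses and the sample data points are illustrative assumptions, not the measurements or the prediction method reported in the paper.

import numpy as np
from scipy.optimize import curve_fit

def loss_model(t, a, b):
    # hyperbolic time function: loss(t) -> 1/b as t -> infinity
    return t / (a + b * t)

# hypothetical readings (age in days, prestress loss in %), not the paper's data
t_obs = np.array([7.0, 14.0, 28.0, 56.0, 90.0, 120.0])
loss_obs = np.array([6.0, 9.0, 12.0, 15.0, 17.0, 18.0])

(a_fit, b_fit), _ = curve_fit(loss_model, t_obs, loss_obs, p0=(1.0, 0.05))
print("predicted long-term prestress loss: %.1f%%" % (1.0 / b_fit))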
The rehabilitation of a Victorian clay brick railway viaduct
Larpool viaduct is a 13-span clay brick viaduct built between 1882 and 1884 to carry the Scarborough and Whitby railway across the picturesque Esk Valley in Whitby, North Yorkshire, England. The structure is of multi-ring clay brick arch construction supported on solid brickwork piers founded on mass concrete or concrete-filled brickwork caissons. The railway was closed to rail traffic in 1965 but was re-opened to pedestrian and cycle traffic in 2000; it is now part of a regional sustainable transport network used mainly by tourists. Exposure to wind, driving rain and repeated freeze-thaw cycles has resulted in severe spalling of some of the brickwork, particularly that of the 30 m high piers. This paper describes the original construction and the historical context of the structure, the site inspections prior to and during construction, and the rehabilitation works, which are reviewed taking into account factors such as differential movement and the need to achieve a high standard of workmanship.
An experimental investigation of retro-reinforced clay brick arches
This paper describes the laboratory testing of eight 2.95 m span segmental profile clay brick arches. Seven of the arches were strengthened with longitudinal intrados (soffit) reinforcement; the eighth was left unreinforced as an experimental control. Three of the arches also contained reinforcement to resist inter-ring shear. The barrel of each arch consisted of three rings of brickwork laid in stretcher bond; the compressive strength of the mortar used in the arch construction varied from 1.7 to 6.2 MPa. In each case a full-width line load was applied incrementally to the arch extrados at quarter span until collapse occurred. Surface crack development and the vertical deflection profile of each arch were recorded at each load increment. In all cases, the longitudinal reinforcement was found to delay the onset of cracking and to increase the load carrying capacity. As expected, premature failure by ring separation occurred in the arches constructed with the weakest mortar and without inter-ring reinforcement. Radial dowels were found to be the most effective means of preventing ring separation. The effect of the longitudinal reinforcement was greatest in the arches where measures were taken to prevent ring separation.
Sigurd Lewerentz: Church of St Peter, Klippan, 1963–66
This modest building questions basic assumptions about processes and finishes, about the nature of brickwork and the detailing of window frames – and provides a powerful space for worship.
Unconditionally verifiable blind computation
Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol, or whether there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system, leading to consequences for complexity theory. The authors, together with Broadbent, previously proposed a universal and unconditionally secure BQC scheme in which the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with new functionality allowing blind computational-basis measurements, which we use to construct a new verifiable BQC protocol based on a new class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while the resource overhead remains polynomial in this parameter. The new resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that any computation to be performed first be put into a nearest-neighbour form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.
Comment: 46 pages, 10 figures. Additional protocol added which allows arbitrary circuits to be verified with polynomial security.
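One intuition behind such verification schemes, and the one used in trap-based constructions, can be illustrated with a deliberately simplified toy model: if the client hides t trap qubits whose outcomes it can predict, and a deviating server disturbs each trap independently with some probability, then the chance that every trap still passes decays exponentially in t. The Python sketch below computes only this toy bound; the flip probability, the independence assumption and the accept rule are illustrative and are not the security analysis of the protocol.

def undetected_probability(num_traps, flip_prob):
    # toy model: a deviating server disturbs each hidden trap
    # independently with probability flip_prob; the run is accepted
    # only if every trap returns its expected outcome
    return (1.0 - flip_prob) ** num_traps

for t in (1, 5, 10, 20, 40):
    print(t, undetected_probability(t, 0.1))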
Universal blind quantum computation
We present a protocol which allows a client to have a server carry out a quantum computation for her such that the client's inputs, outputs and computation remain perfectly private, and where she does not require any quantum computational power or memory. The client only needs to be able to prepare single qubits randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. Our protocol is interactive: after the initial preparation of quantum states, the client and server use two-way classical communication which enables the client to drive the computation, giving single-qubit measurement instructions to the server that depend on previous measurement outcomes. Our protocol works for inputs and outputs that are either classical or quantum. We give an authentication protocol that allows the client to detect an interfering server; our scheme can also be made fault-tolerant.
We also generalize our result to the setting of a purely classical client who communicates classically with two non-communicating entangled servers in order to perform a blind quantum computation. By incorporating the authentication protocol, we show that any problem in BQP has an entangled two-prover interactive proof with a purely classical verifier.
Our protocol is the first universal scheme which detects a cheating server, as well as the first protocol which does not require any quantum computation whatsoever on the client's side. The novelty of our approach lies in using the unique features of measurement-based quantum computing, which allow us to clearly distinguish between the quantum and classical aspects of a quantum computation.
Comment: 20 pages, 7 figures. This version contains detailed proofs of authentication and fault tolerance. It also contains protocols for quantum inputs and outputs, and appendices not available in the published version.
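A key classical ingredient of such a scheme is the way the client hides its measurement angles. In the standard universal blind quantum computation construction, the client prepares each qubit rotated by a random angle θ drawn from {0, π/4, ..., 7π/4} and later instructs the server to measure at δ = φ′ + θ + rπ (mod 2π), where φ′ is the adapted computation angle and r is a random bit used to flip the reported outcome; since θ and r are uniform and known only to the client, δ looks uniformly random to the server. The Python sketch below illustrates only this classical bookkeeping; the variable names and demo values are illustrative.

import math
import random

THETAS = [k * math.pi / 4 for k in range(8)]  # client's random preparation angles

def instruction_angle(phi_adapted, theta, r):
    # angle the client actually sends to the server
    return (phi_adapted + theta + r * math.pi) % (2 * math.pi)

def decode_outcome(b_reported, r):
    # the client undoes its own random flip of the reported outcome
    return b_reported ^ r

# demo: the true computation angle stays hidden behind theta and r
phi_adapted = math.pi / 2              # illustrative computation angle
theta = random.choice(THETAS)
r = random.randint(0, 1)
delta = instruction_angle(phi_adapted, theta, r)
b_reported = random.randint(0, 1)      # stand-in for the server's reported outcome
print("angle sent to server:", delta)
print("decoded outcome:", decode_outcome(b_reported, r))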
A quantum algorithm for additive approximation of Ising partition functions
We investigate quantum computational complexity of calculating partition
functions of Ising models. We construct a quantum algorithm for an additive
approximation of Ising partition functions on square lattices. To this end, we
utilize the overlap mapping developed by Van den Nest, D\"ur, and Briegel
[Phys. Rev. Lett. 98, 117207 (2007)] and its interpretation through
measurement-based quantum computation (MBQC). We specify an algorithmic domain,
on which the proposed algorithm works, and an approximation scale, which
determines the accuracy of the approximation. We show that the proposed
algorithm does a nontrivial task, which would be intractable on any classical
computer, by showing the problem solvable by the proposed quantum algorithm are
BQP-complete. In the construction of the BQP-complete problem coupling
strengths and magnetic fields take complex values. However, the Ising models
that are of central interest in statistical physics and computer science
consist of real coupling strengths and magnetic fields. Thus we extend the
algorithmic domain of the proposed algorithm to such a real physical parameter
region and calculate the approximation scale explicitly. We found that the
overlap mapping and its MBQC interpretation improves the approximation scale
exponentially compared to a straightforward constant depth quantum algorithm.
On the other hand, the proposed quantum algorithm also provides us a partial
evidence that there exist no efficient classical algorithm for a multiplicative
approximation of the Ising partition functions even on the square lattice. This
result supports that the proposed quantum algorithm does a nontrivial task also
in the physical parameter region.Comment: 18 pages, 12 figure
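For concreteness, the quantity being approximated is the Ising partition function Z = Σ_s exp(β(Σ_(i,j) J_ij s_i s_j + Σ_i h_i s_i)), summed over all spin configurations s in {−1, +1}^n. The Python sketch below evaluates Z by brute-force enumeration for a small instance; the 2x2 lattice, couplings and fields are illustrative placeholders, and this exhaustive sum is exactly the exponential-time computation that an efficient quantum approximation is meant to sidestep.

import itertools
import math

def ising_partition_function(edges, J, h, beta):
    # brute-force sum over all spin configurations s in {-1, +1}^n
    n = len(h)
    Z = 0.0
    for s in itertools.product((-1, 1), repeat=n):
        energy = sum(J[e] * s[i] * s[j] for e, (i, j) in enumerate(edges))
        energy += sum(h[i] * s[i] for i in range(n))
        Z += math.exp(beta * energy)
    return Z

# illustrative 2x2 square lattice: 4 spins, 4 nearest-neighbour bonds
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]
J = [1.0, 1.0, 1.0, 1.0]
h = [0.5, -0.5, 0.5, -0.5]
print(ising_partition_function(edges, J, h, beta=0.4))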
Condensation risk: comparison of steady-state and transient methods
Accurate assessment of both surface and interstitial condensation risk is important not only to reduce the damaging effect of moisture within the structure of buildings, but also to provide a healthy environment free from mould growth. The current British Standard (BS EN ISO 13788: 2002) contains an assessment procedure based on the assumption of a steady-state heat flow through the building envelope, neglecting the transient nature of the problem. This paper compares and evaluates numerical results of the condensation risk calculation under both steady-state and transient conditions using the existing numerical codes. Significant differences are apparent between the predictions of the simple (steady-state) and complex (transient) methods for all construction details modelled.
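To make the contrast concrete, the steady-state route of BS EN ISO 13788 (the Glaser-type method) distributes the temperature drop across the construction in proportion to thermal resistance, distributes the vapour pressure drop in proportion to vapour resistance, and flags any interface where the vapour pressure would exceed saturation. The Python sketch below outlines that steady-state check for a hypothetical wall build-up; the layer properties, the boundary conditions and the saturation-pressure correlation used for temperatures above 0 °C are assumptions for illustration, and a transient assessment would instead track moisture storage and time-varying boundary conditions.

import math

def p_sat(theta_c):
    # saturation vapour pressure (Pa) for temperatures at or above 0 C
    return 610.5 * math.exp(17.269 * theta_c / (237.3 + theta_c))

# hypothetical wall build-up: (thermal resistance m2K/W, vapour resistance s_d in m)
layers = [(0.13, 0.0),   # internal surface
          (0.11, 0.1),   # plaster
          (1.77, 0.5),   # insulation
          (0.12, 10.0),  # brick outer leaf
          (0.04, 0.0)]   # external surface

theta_i, theta_e = 20.0, 0.0     # internal / external air temperature (C)
p_i, p_e = 1168.0, 486.0         # internal / external vapour pressure (Pa)

R_tot = sum(r for r, _ in layers)
sd_tot = sum(sd for _, sd in layers) or 1.0

R_cum, sd_cum = 0.0, 0.0
for k, (r, sd) in enumerate(layers):
    # cumulative resistances fix the temperature and vapour pressure profiles
    R_cum += r
    sd_cum += sd
    theta = theta_i - (R_cum / R_tot) * (theta_i - theta_e)
    p = p_i - (sd_cum / sd_tot) * (p_i - p_e)
    risk = "condensation risk" if p > p_sat(theta) else "ok"
    print("interface %d: %5.1f C, p=%6.0f Pa, p_sat=%6.0f Pa -> %s"
          % (k, theta, p, p_sat(theta), risk))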
The complexity of simulating constant-depth BosonSampling
BosonSampling is a restricted model of quantum computation proposed recently,
where a non-adaptive linear-optical network is used to solve a sampling problem
that seems to be hard for classical computers. Here we show that, even if the
linear-optical network has a constant number (greater than four) of beam
splitter layers, the exact version of the BosonSampling problem is still
classically hard, unless the polynomial hierarchy collapses to its third level.
This is based on similar result known for constant-depth quantum circuits and
circuits of 2-local commuting gates (IQP).Comment: 9 pages, 3 figure
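The classical hardness here ultimately traces back to matrix permanents: the probability of observing a given output configuration in BosonSampling is proportional to the squared absolute value of the permanent of a submatrix of the network's transfer matrix, and the best known exact classical algorithms for the permanent, such as Ryser's formula, take exponential time. The Python sketch below implements Ryser's formula for a small matrix as a reference point; it is illustrative only and makes no attempt at the constant-depth constructions discussed above.

def permanent(A):
    # Ryser's formula: O(2^n * n^2) exact evaluation of the matrix permanent
    n = len(A)
    total = 0.0
    for subset in range(1, 1 << n):
        bits = bin(subset).count("1")
        prod = 1.0
        for row in A:
            prod *= sum(row[j] for j in range(n) if subset >> j & 1)
        total += (-1) ** bits * prod
    return (-1) ** n * total

# sanity check on a 2x2 matrix: perm([[a, b], [c, d]]) = a*d + b*c
print(permanent([[1.0, 2.0], [3.0, 4.0]]))  # expects 1*4 + 2*3 = 10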
