ARCHITECTURAL HYPER-MODEL: CHANGING ARCHITECTURAL CONSTRUCTION DOCUMENTATION
More architects and spatial designers than ever before are producing complex 3D computer models as part of
their everyday design process and documentation. Parallel to this shift, a rapid rise in consumer
computer processing power has made hyper-realistic digital environments part of our home
entertainment. Together, the 3D CAD model and the computer gaming engine could become an
architectural hyper-model that renders a digital environment in real time. Such a model would
enable users to navigate freely, effectively establishing a new mode of reading space that
hovers between 2D drawings and real space (Nitsche & Roudavski). This paper examines how these
worlds can merge to form an architectural hyper-model that serves as a valuable supplement to the
conventional scaled 2D construction drawing documentation found on construction sites.
While easily misconstrued as speculative, the ideas presented in this paper outline an ongoing
body of innovative research currently at the prototype stage. These prototype hyper-models
explore the possibility of giving construction workers and project managers access to an
architect's 3D computer models on site. The models originate from conventional building
construction drawings such as detailed sections and exploded axonometrics. A process of
reinterpretation locates these drawings and their information within an interactive 3D space.
Such operations take advantage of the best of both paradigms. They give users access to, and
control of, the 3D information required for communicating the building process, and they provide
nodes, or hyperlinks, in the 3D representation that connect to additional information, such as
specifications, that is less formal or spatial.
The paper shows how architectural hyper-models can be used on the construction site - both
in the site office and in the field, on laptop computers and more compact hand-held devices - to
reduce on-site confusion and enable a faster and more complete understanding of the
architect's vision. The paper concludes with speculation on the types of additional information
construction workers, architects and designers might want to access in the future and proposes
additional technologies that could be provided.
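The node-and-hyperlink idea above can be sketched as a minimal data structure. All names here (`HyperNode`, `HyperModel`, `element_id`, the example documents) are hypothetical illustrations chosen for this sketch, not identifiers from the prototypes the paper describes.

```python
from dataclasses import dataclass, field


@dataclass
class HyperNode:
    """A hyperlink node anchored to an element of the 3D model (hypothetical schema)."""
    element_id: str                 # identifier of the CAD element the node is attached to
    position: tuple                 # (x, y, z) anchor point in model space
    documents: list = field(default_factory=list)  # linked specifications, schedules, details


@dataclass
class HyperModel:
    """The 3D model's web of hyperlink nodes, keyed by element identifier."""
    nodes: dict = field(default_factory=dict)

    def add_node(self, node: HyperNode) -> None:
        self.nodes[node.element_id] = node

    def lookup(self, element_id: str):
        """What a site worker retrieves by selecting an element in the walkthrough."""
        return self.nodes.get(element_id)
```

In a full system, a game-engine front end would render the model and call something like `lookup` when a worker selects a wall or junction, surfacing the linked specifications alongside the 3D view.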
Improving the Efficiency of Quantum Circuits for Information Set Decoding
The NIST Post-Quantum standardization initiative, which has entered its fourth round, aims to select asymmetric cryptosystems secure against attackers equipped with a quantum computer. Code-based cryptosystems are a promising option for Post-Quantum Cryptography (PQC), as neither classical nor quantum algorithms provide polynomial-time solvers for their underlying hard problems. Indeed, to provide sound alternatives to lattice-based cryptosystems, NIST advanced all round-3 code-based cryptosystems to round 4. We present a complete implementation of a quantum circuit based on the Information Set Decoding (ISD) strategy, the best known attack against code-based cryptosystems, providing quantitative measures of the security margin achieved with respect to quantum-accelerated key recovery on AES, targeting both the current state-of-the-art approach and the NIST estimates. Our work improves on the state of the art, reducing the circuit depth by a factor ranging from 2¹⁹ to 2³⁰ across the parameters of the NIST-selected cryptosystems. We further analyse recently proposed optimizations, showing that the overhead introduced by their implementation outweighs their asymptotic advantages. Finally, we address the concern raised in the latest NIST report on the parameter choice for the McEliece cryptosystem, showing that the chosen parameters yield a computational effort slightly below the required target level.
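For readers unfamiliar with ISD, a classical sketch of its simplest variant, Prange's algorithm, illustrates the search that a quantum ISD circuit accelerates: guess that the error is confined to a random set of columns, row-reduce the parity-check matrix on those columns, and check whether the transformed syndrome has the target weight. This is an illustrative toy over small parameters, not the paper's circuit; all function and variable names are my own.

```python
import numpy as np


def prange_isd(H, s, t, max_iters=2000, rng=None):
    """Prange's algorithm: find an error vector e with H e = s (mod 2) and weight t.

    Each iteration guesses that the error is confined to a random set of r
    columns, row-reduces H on those columns, and checks whether the
    transformed syndrome has the target Hamming weight.
    """
    rng = rng or np.random.default_rng()
    r, n = H.shape
    for _ in range(max_iters):
        perm = rng.permutation(n)
        # Augment the permuted parity-check matrix with the syndrome.
        A = np.concatenate([H[:, perm], s.reshape(-1, 1)], axis=1).astype(np.uint8)
        # Gaussian elimination over GF(2) on the first r columns.
        row = 0
        for col in range(r):
            pivots = np.nonzero(A[row:, col])[0]
            if len(pivots) == 0:
                break  # chosen r x r submatrix is singular; try another guess
            A[[row, pivots[0] + row]] = A[[pivots[0] + row, row]]
            for other in range(r):
                if other != row and A[other, col]:
                    A[other] ^= A[row]
            row += 1
        if row < r:
            continue
        s_prime = A[:, -1]
        if int(s_prime.sum()) == t:  # success: a weight-t error fits in the guessed columns
            e = np.zeros(n, dtype=np.uint8)
            e[perm[:r]] = s_prime
            return e
    return None  # no weight-t error found within the iteration budget
```

The quantum version replaces the repeated random guessing with an amplitude-amplified search, which is where the circuit-depth figures in the abstract come from.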
Post-Traumatic Outcomes among Survivors of the Earthquake in Central Italy of August 24, 2016. A Study on PTSD Risk and Vulnerability Factors
Central Italy suffered an earthquake in 2016 that caused great damage to the community. The purpose of the present study was to determine the long-term traumatic outcomes among the population. A preliminary study produced an Italian translation of the first 16 items of part IV of the HTQ [1], which was administered, 20 months after the disaster, to 281 survivors. In backward stepwise logistic regression models, we estimated which of the respondents' characteristics and event-related variables best predicted Post-Traumatic Stress Disorder (PTSD). A Confirmatory Factor Analysis (CFA) identified a five-factor HTQ solution as the best model, with satisfactory fit indexes. The HTQ correlated positively with both the SQD-P (r = .65, p < .05) and SQD-D subscales (r = .47, p < .05). ROC analysis yielded an area under the curve of .951 (95% CI = .917–.985) for PTSD prediction. Based on sensitivity (.963) and specificity (.189), a cut-off of 2.0 best discriminated PTSD-positive cases. Twenty months after the earthquake, the estimated prevalence of PTSD among the survivors was 21.71%, with a consistent and graded association between exposure variables and vulnerability factors (gender, age, exposure to death and home damage) and PTSD symptoms.
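The ROC-based cut-off selection described above can be sketched in a few lines: compute sensitivity and specificity at each candidate threshold, estimate the AUC via the Mann-Whitney statistic, and pick the cut-off by a criterion such as Youden's J. The study does not state its selection criterion, so Youden's J is an assumption made for this sketch, and all function names are my own.

```python
import numpy as np


def roc_metrics(scores, labels):
    """Sensitivity and specificity at each candidate cut-off, plus the AUC.

    The AUC is computed with the Mann-Whitney U statistic, which equals the
    area under the ROC curve without explicitly tracing it.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    rows = []
    for cut in np.unique(scores):
        pred = scores >= cut  # "positive" means scoring at or above the cut-off
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        rows.append((cut, tp / (tp + fn), tn / (tn + fp)))
    pos, neg = scores[labels == 1], scores[labels == 0]
    auc = (np.sum(pos[:, None] > neg[None, :])
           + 0.5 * np.sum(pos[:, None] == neg[None, :])) / (len(pos) * len(neg))
    return rows, auc


def best_cutoff(rows):
    """Cut-off maximizing Youden's J = sensitivity + specificity - 1 (assumed criterion)."""
    return max(rows, key=lambda r: r[1] + r[2] - 1)
```

With the questionnaire's item means as `scores` and the diagnostic classification as `labels`, this procedure would yield the kind of AUC and cut-off figures the abstract reports.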
Status of Salerno Laboratory (Measurements in Nuclear Emulsion)
A report on the analysis work in the Salerno Emulsion Laboratory is
presented. It relates to the search for nu_mu->nu_tau oscillations in the CHORUS
experiment, the calibrations in the WANF (West Area Neutrino Facility) at CERN,
and tests and preparation for new experiments. Comment: Proc. The First International Workshop of Nuclear Emulsion Techniques
(12-24 June 1998, Nagoya, Japan), 15 pages, 11 figures