
    The BIM process for the architectural heritage: New communication tools based on AR/VR. Case study: Palazzo di Città

    The present study presents the application of the Building Information Modeling (BIM) methodology to the case study of Palazzo di Città, the Turin City Hall, investigating how new technologies can be integrated into the preservation and valorization of Cultural Heritage (CH). From the survey phase to the communication of the CH to end users, the BIM methodology, combined with the latest digital innovations (AR, VR, 3D laser scanning and much more), allows a fast and highly communicative representation of buildings for both professionals and ordinary visitors who interact with the building life cycle. A further important objective of this work is to demonstrate the advantages of adopting and integrating these technologies in Real Estate Management at a national scale, fully testing the adaptability of parametric software and Virtual Reality modeling to complex and highly decorated buildings, and confirming the potential of BIM software in an uncommon field: historic buildings. The case study is in fact Palazzo di Città, the baroque, seventeenth-century City Hall of Turin. The research fully meets the latest directives of the European Union and other international organizations in the field of digitization of archives and public property management, participating in the international community's effort to overcome the deep contemporary crisis of the construction sector. In particular, the methodology has been focused on and adapted to the protection and management of our vast heritage, grounding its objectives in the search for cost-saving processes and instruments applied to the management of a CH. Through BIM it is in fact possible to increase communication and cooperation among all the actors involved in the building life cycle, since it behaves as a common working platform. Drawings, the 3D model and the database are shared by all the actors and integrated into the same digital structure, where control and cooperation tools can help designers avoid errors, saving time and money in the construction phase. The particularity of the case study, Palazzo di Città, being at the same time a CH, a public asset and a working space, allows a deep study of the possibilities of BIM applied to a complex building, touching on very important aspects of historic building management: digitization of historic information, publication of modeling techniques for complex architectural elements, reconstruction of transformations, energy consumption control, Facility Management, dissemination, virtual reconstruction of lost appearances, and accessibility for people with sensory and motor impairments. Moreover, the last chapters of the study focus on the fruition of this paramount piece of Turin's heritage, making interesting and little-known aspects of the history of the building and of the city itself available to everyone. This part of the research proposes a methodology to translate static 2D images and written descriptions of a CH into a living, immersive VR environment, presenting in an interactive way the transformation of the Marble Hall, once called Aula Maior: the room where the Mayor meets the citizens. Besides the aspects related to the valorization and preservation of the CH, the study devotes considerable space to technical aspects involving advanced parametric modeling techniques, the use of BIM software, and all the procedures necessary to generate an efficient informative management platform. The whole work is intended as a guide for future projects, structuring a replicable protocol to achieve an efficient digitization of paper resources into a 3D virtual model.
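    A minimal sketch of the kind of shared-data access such a common platform enables, assuming the BIM model has been exported to IFC and using the open-source ifcopenshell library; the file name is a hypothetical placeholder:

        # Sketch: read element identity data from an IFC export of a heritage BIM model,
        # e.g. to feed a facility-management database. The file name is hypothetical.
        import ifcopenshell

        model = ifcopenshell.open("palazzo_di_citta.ifc")   # hypothetical IFC export

        # List every wall with its globally unique identifier and name
        for wall in model.by_type("IfcWall"):
            print(wall.GlobalId, wall.Name)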

    Design of Low Impact Development and Green Infrastructure at Flood Prone Areas in the City of Miami Beach, FLORIDA, USA

    This thesis investigates the effectiveness of Low Impact Development Infrastructure (LIDI) and Green Infrastructure (GI) in reducing flooding resulting from heavy rainfall events and sea-level rise, and in improving stormwater quality in the City of Miami Beach (CMB). InfoSWMM was used to simulate the 5-, 10-, and 100-year, 24-hour storm events and the total suspended solids (TSS), biochemical oxygen demand (BOD), and chemical oxygen demand (COD) loadings, and to evaluate the potential of selected LIDI and GI solutions in the North Shore neighborhood. Post-development results revealed decreases of 48%, 46%, and 39% in runoff; 57%, 60%, and 62% in TSS; 82%, 82%, and 84% in BOD; and 69%, 69%, and 70% in COD loadings. SWMM 5.1 was also used to simulate the king tide effect on a cross section of Indian Creek Drive. The proposed design simulations successfully demonstrated the potential to control flooding, showing that innovative technologies offer the city opportunities to cope with climate impacts. This study should help the CMB manage flooding under the adaptation scenarios that may result from climate change, whether flooding is driven by changes in precipitation patterns, by sea-level rise, or by both.
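    A minimal sketch of how such a design-storm run can be driven programmatically, assuming the North Shore model is available as an EPA SWMM 5 input file and using the open-source pyswmm wrapper rather than InfoSWMM; the file name and node ID are hypothetical placeholders:

        # Sketch: run a SWMM 5 design-storm simulation and track flooding at one node.
        # 'north_shore.inp' and the node ID 'J-101' are hypothetical placeholders.
        from pyswmm import Simulation, Nodes

        with Simulation("north_shore.inp") as sim:
            node = Nodes(sim)["J-101"]          # a junction in the flood-prone area
            peak_flooding = 0.0
            for _ in sim:                       # step through the 24-hour event
                peak_flooding = max(peak_flooding, node.flooding)
            # The flooding rate is reported in the model's flow units
            print(f"Peak flooding rate at J-101: {peak_flooding:.2f}")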

    Using species A rotavirus reverse genetics to engineer chimeric viruses expressing SARS-CoV-2 spike epitopes: Heterologous viral peptide expression by rotavirus A

    Species A rotavirus (RVA) vaccines based on live attenuated viruses are used worldwide in humans. The recent establishment of a reverse genetics system for rotaviruses (RVs) has opened the possibility of engineering chimeric viruses expressing heterologous peptides from other viral or microbial species in order to develop polyvalent vaccines. We tested the feasibility of this concept by two approaches. First, we inserted short SARS-CoV-2 spike peptides into the hypervariable region of the simian RV SA11 strain viral protein (VP) 4. Second, we fused the receptor binding domain (RBD) of the SARS-CoV-2 spike protein, or the shorter receptor binding motif (RBM) nested within the RBD, to the C terminus of nonstructural protein (NSP) 3 of the bovine RV RF strain, with or without an intervening Thosea asigna virus 2A (T2A) peptide. Mutating the hypervariable region of SA11 VP4 impeded viral replication, and for these mutants no cross-reactivity with spike antibodies was detected. To rescue NSP3 mutants, we established a plasmid-based reverse genetics system for the bovine RV RF strain. Except for the RBD mutant, which showed a rescue defect, all NSP3 mutants delivered endpoint infectivity titers and exhibited replication kinetics comparable to those of the wild-type virus. In ELISAs, cell lysates of an NSP3 mutant expressing the RBD peptide showed cross-reactivity with a SARS-CoV-2 RBD antibody. 3D bovine gut enteroids were susceptible to infection by all NSP3 mutants, but cross-reactivity with the SARS-CoV-2 RBD antibody was only detected for the RBM mutant. The tolerance of large SARS-CoV-2 peptide insertions at the C terminus of NSP3 in the presence of the T2A element highlights the potential of this approach for the development of vaccine vectors targeting multiple enteric pathogens simultaneously. IMPORTANCE We explored the use of rotaviruses (RVs) to express heterologous peptides, using SARS-CoV-2 as an example. Small SARS-CoV-2 peptide insertions (<34 amino acids) into the hypervariable region of viral protein 4 (VP4) of the RV SA11 strain resulted in reduced viral titer and replication, demonstrating a limited tolerance for peptide insertions at this site. To test the RV RF strain for its tolerance of peptide insertions, we constructed a reverse genetics system. NSP3 was C-terminally tagged with SARS-CoV-2 spike peptides of up to 193 amino acids in length. With a T2A-separated 193 amino acid tag on NSP3, there was no significant effect on the viral rescue efficiency, endpoint titer, or replication kinetics. Tagged NSP3 elicited cross-reactivity with SARS-CoV-2 spike antibodies in ELISA. We highlight the potential for the development of RV vaccine vectors targeting multiple enteric pathogens simultaneously.
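    A minimal sketch of the construct layout described for the NSP3 fusions, assembled with plain Python strings; the sequences below are hypothetical placeholders, not the actual NSP3, T2A or RBM sequences:

        # Sketch: assemble a C-terminal fusion of a peptide tag onto NSP3, optionally
        # separated by a self-cleaving 2A element. All sequences are placeholders.
        NSP3 = "MLKMESTQ"       # bovine RV RF-strain NSP3 (placeholder)
        T2A = "EGRGSLLTCG"      # 2A element (placeholder; use a verified T2A sequence)
        RBM = "NSNNLDSKVG"      # SARS-CoV-2 spike RBM fragment (placeholder)

        def fuse_c_terminal(backbone: str, insert: str, linker: str = "") -> str:
            """Append an insert (with an optional linker) to the C terminus."""
            return backbone + linker + insert

        nsp3_t2a_rbm = fuse_c_terminal(NSP3, RBM, linker=T2A)
        print(len(nsp3_t2a_rbm), "aa in the fusion construct")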

    Measuring the Capital Energy Value in Historic Structures

    A credible model to account for the overall energy benefits of retaining historic buildings has been needed since preservation became national policy in 1966. The initial need to measure the energy capital in buildings arose from the two energy crises of the 1970s, followed by a second need to address the sustainability goals of the 1990s and 2000s. Both responses measure the overall energy efficiency of historic buildings by attempting to account for their energy capital. The Advisory Council on Historic Preservation introduced the first model, focused on measuring embodied energy, in 1979; it has remained embedded in preservation vocabulary and serves as a reflexive argument for the retention of historic structures over new construction. The second model, life cycle assessment/avoided impacts, is a response to the evolving metrics and currency of sustainability. The Preservation Green Lab further matured the capabilities of the life cycle assessment/avoided impacts model in 2012 with its innovative report, The Greenest Building: Quantifying the Environmental Value of Building Reuse. This thesis evaluates the future capacity of the preservation field to communicate with a common currency regarding the retention of historic structures.

    Distributed computing and farm management with application to the search for heavy gauge bosons using the ATLAS experiment at the LHC (CERN)

    The Standard Model of particle physics describes the strong, weak, and electromagnetic forces between the fundamental particles of ordinary matter. However, it presents several problems and some questions remain unanswered, so it cannot be considered a complete theory of fundamental interactions. Many extensions have been proposed in order to address these problems. Some important recent extensions are the Extra Dimensions theories. In the context of some models with Extra Dimensions of size about 1 TeV⁻¹, in particular the ADD model with only fermions confined to a D-brane, heavy Kaluza-Klein excitations are expected, with the same properties as SM gauge bosons but more massive. In this work, three hadronic decay modes of such massive gauge bosons, Z* and W*, are investigated using the ATLAS experiment at the Large Hadron Collider (LHC), presently under construction at CERN. These hadronic modes are more difficult to detect than the leptonic ones, but they should allow a measurement of the couplings between heavy gauge bosons and quarks. The events were generated using the ATLAS fast simulation and reconstruction MC program Atlfast coupled to the Monte Carlo generator PYTHIA. We found that for an integrated luminosity of 3 × 10⁵ pb⁻¹ and a heavy gauge boson mass of 2 TeV, the channels Z* → bb and Z* → tt would be difficult to detect because the signal would be very small compared with the expected background, although the significance in the case of Z* → tt is larger. In the channel W* → tb, the decay might yield a signal separable from the background and a significance larger than 5, so we conclude that it would be possible to detect this particular mode at the LHC. The analysis was also performed for masses of 1 TeV, and we conclude that the observability decreases with increasing mass. In particular, a significance higher than 5 may be achieved below approximately 1.4, 1.9 and 2.2 TeV for Z* → bb, Z* → tt and W* → tb, respectively. The LHC will start to operate in 2008 and collect data in 2009. It will produce roughly 15 Petabytes of data per year. Access to this experimental data has to be provided for some 5,000 scientists working in 500 research institutes and universities. In addition, all data need to be available over the estimated 15-year lifetime of the LHC. The analysis of the data, including comparison with theoretical simulations, requires enormous computing power. The computing challenges that scientists have to face are the huge amounts of data, the calculations to perform, and the number of collaborators. The Grid has been proposed as a solution to these challenges. The LHC Computing Grid project (LCG) is the Grid used by ATLAS and the other LHC experiments, and it is analysed in depth with the aim of studying its possible complementary use with another Grid project: the Berkeley Open Infrastructure for Network Computing (BOINC) middleware, developed for the SETI@home project, a Grid specialised in high-CPU tasks and in the use of volunteer computing resources. Several important packages of physics software used by ATLAS and other LHC experiments have been successfully adapted or ported to this platform with the aim of integrating them into the LHC@home project at CERN: Atlfast, PYTHIA, Geant4 and Garfield. The events used in our physics analysis with Atlfast were reproduced using BOINC, obtaining exactly the same results.
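    As a rough illustration of the significance criterion quoted above, a minimal sketch of the common S/√B counting estimate; the event yields are hypothetical placeholders, not the results of the analysis:

        # Sketch: naive discovery significance S / sqrt(B) for a counting experiment.
        # The signal and background yields are hypothetical placeholders.
        from math import sqrt

        signal_events = 250.0       # expected signal yield after selection (hypothetical)
        background_events = 1600.0  # expected background yield (hypothetical)

        significance = signal_events / sqrt(background_events)
        print(f"S/sqrt(B) = {significance:.1f}")   # values above 5 count as observable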
The LCG software, in particular SEAL, ROOT and the external software, was also ported to the Solaris/SPARC platform to study its portability in general. A testbed was carried out involving a large amount of heterogeneous hardware and software: a farm of 100 computers at CERN's computing centre (lxboinc) together with 30 PCs from CIEMAT and 45 from schools in Extremadura (Spain). This required a preliminary study and the development of components for the Quattor software and configuration management tool in order to install and manage the lxboinc farm, and it also involved setting up a collaboration between the Spanish research centres and government, and CERN. The testbed was successful and 26,597 Grid jobs were delivered, executed and received successfully. We conclude that BOINC and LCG are complementary and useful kinds of Grid that can be used by ATLAS and the other LHC experiments. LCG has very good data distribution, management and storage capabilities that BOINC does not have. On the other hand, BOINC does not need high bandwidth or Internet speed, and it can also provide a huge and inexpensive amount of computing power contributed by volunteers. In addition, it is possible to send jobs from LCG to BOINC and vice versa. Possible complementary uses are therefore to employ volunteer BOINC nodes when the LCG nodes have too many jobs to do, or to use BOINC for CPU-intensive tasks such as event generation or reconstruction while reserving LCG for data analysis.
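A minimal sketch of the dispatch logic implied by that complementary use; the thresholds and job categories are purely illustrative assumptions, not part of the thesis:

        # Sketch: route a job to LCG or to volunteer BOINC nodes, following the
        # complementary-use idea above. Thresholds and categories are hypothetical.
        CPU_BOUND = {"event_generation", "reconstruction"}

        def choose_backend(job_type: str, lcg_queue_length: int, lcg_capacity: int = 1000) -> str:
            """Send CPU-bound work to BOINC; keep data-intensive analysis on LCG."""
            if job_type in CPU_BOUND:
                return "BOINC"
            if lcg_queue_length > lcg_capacity:   # LCG overloaded: spill over to BOINC
                return "BOINC"
            return "LCG"

        print(choose_backend("reconstruction", lcg_queue_length=200))   # -> BOINC
        print(choose_backend("data_analysis", lcg_queue_length=200))    # -> LCG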

    JUNO Conceptual Design Report

    The Jiangmen Underground Neutrino Observatory (JUNO) is proposed to determine the neutrino mass hierarchy using an underground liquid scintillator detector. It is located 53 km away from both the Yangjiang and Taishan Nuclear Power Plants in Guangdong, China. The experimental hall, spanning more than 50 meters, lies under a granite mountain providing an overburden of over 700 m. Within six years of running, the detection of reactor antineutrinos can resolve the neutrino mass hierarchy at a confidence level of 3-4σ, and determine the neutrino oscillation parameters sin²θ₁₂, Δm²₂₁ and |Δm²ₑₑ| to an accuracy of better than 1%. The JUNO detector can also be used to study terrestrial and extraterrestrial neutrinos and new physics beyond the Standard Model. The central detector contains 20,000 tons of liquid scintillator in an acrylic sphere of 35 m diameter. About 17,000 PMTs of 508 mm diameter with high quantum efficiency provide roughly 75% optical coverage. The current choice of liquid scintillator is linear alkyl benzene (LAB) as the solvent, with PPO as the scintillation fluor and bis-MSB as a wavelength shifter. The number of detected photoelectrons per MeV is larger than 1,100, and the energy resolution is expected to be 3% at 1 MeV. The calibration system is designed to deploy multiple sources to cover the entire energy range of reactor antineutrinos and to achieve full-volume position coverage inside the detector. The veto system is used for muon detection and for the study and reduction of muon-induced backgrounds. It consists of a water Cherenkov detector and a Top Tracker system. The readout system, the detector control system and the offline system ensure efficient and stable data acquisition and processing.
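    A worked check of the quoted numbers, under the simplifying assumption that the energy resolution is dominated by photoelectron statistics (a sketch, not the full JUNO resolution model):

        # Sketch: purely statistical energy resolution from photoelectron counting,
        # assuming sigma_E/E ~ 1/sqrt(N_pe) and ignoring non-stochastic terms.
        from math import sqrt

        pe_per_mev = 1100      # detected photoelectrons per MeV (from the text)
        energy_mev = 1.0

        n_pe = pe_per_mev * energy_mev
        resolution = 1.0 / sqrt(n_pe)
        print(f"sigma_E/E at {energy_mev} MeV ~ {resolution:.1%}")   # ~3%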

    Design, implementation, and performance of a distributed and scalable sensor system for critical distance measurements in the CMS detector at LHC

    The “CMS Safety Closing Sensors System” (SCSS, or CSS for brevity) is a remote monitoring system designed to control safety clearances and tight mechanical movements of parts of the CMS detector, especially during the CMS assembly phases. We present the different systems that make up the SCSS: its sensor technologies, the readout system, and the data acquisition and control software. We also report on calibration and installation details, which determine the resolution and limits of the system. We present as well our experience from the operation of the system and the analysis of the data collected since 2008. Special emphasis is given to studying positioning reproducibility during detector assembly and to understanding how magnetic fields influence the detector structure.

    The Muqarnas in contemporary art and epigraphic design: developing technical vision in the design of the muqarnas

    This thesis is about the muqarnas, a type of three-dimensional decorative finishing strip using concave elements. It is a kind of corbel used in Islamic architecture both as a decorative motif and as an enriched block or horizontal bracket. It is generally found under the cornice and above the bed-mould of the Corinthian entablature, and at the same time it hides the transitional zones between various surfaces (e.g. arches, domes, capitals, windows, ceilings, minarets, mihrabs, minbars, façades). It can take a number of forms and in some circumstances resembles stalactites. It has been applied, artistically, to different materials (e.g. stucco, stone, marble, wood, faience and polychrome) in unique, regularly spaced, geometric arrangements. The muqarnas is an important feature in Islamic architecture because of its social, cultural and symbolic meanings. The research aim is to critically analyse the muqarnas and to shed light on its genesis, nature and evolution. This will be followed by an attempt to transform the muqarnas to suit modern use without losing its meaning. This study will highlight the importance of providing a simple software program for modelling the muqarnas, relevant to the fields of Islamic architecture, epigraphic design and art, such that it can be appreciated by contemporary practitioners and viewers, who will find different options in the model (muqarnas blocks) that allow them to assess alternative designs and have them ready for use in the form of computerised two-dimensional and three-dimensional drawings. The thesis begins with a first chapter comprising an introduction to the background, aim, objectives, methodology and significance of the research. The second chapter is a review of the history of the muqarnas and offers an interpretation of all the figures that combine to make the muqarnas types, spatial compositions with interlocking values. The chapter also explores the cultural and compositional units (cube, sphere, wall, columns and arches) and the properties of the organic rules of the muqarnas. The third chapter is an analysis of the proportional order and harmony of each element of the muqarnas units in Islamic architecture. The fourth chapter offers an intellectual and subjective perspective on the properties of the muqarnas, concentrating on structural transformation in Islamic art and architecture using structuralism and associated theories. The fifth chapter reviews the performance of the muqarnas design processing program 'Generator of Muqarnas'. This program can be used to visualise data generated from the blocks of muqarnas, to create a user interface, and to convert two-dimensional plans into three-dimensional muqarnas data. The program is based on the original muqarnas types and allows for efficient work with materials, textures, colours and light. The final chapter concludes with a brief overview of the significance of the study. This innovative approach will introduce the aesthetics of the muqarnas to the modern world and to a new audience, and rekindle the interest of designers, artists and architects. Using the program, they will find alternatives ready for use in the form of computer-generated muqarnas drawings, which will help them, as they are easy to use, saving time and effort. The author has made contact with professionals who are interested in using the muqarnas and others who are looking to invest in and publish the software program when it has been fully developed and tested.
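    A minimal sketch of the two-dimensional-to-three-dimensional conversion the program is described as performing, with a deliberately simplified block model; the cell format and tier height are hypothetical assumptions, not the actual data model of 'Generator of Muqarnas':

        # Sketch: lift a 2D muqarnas plan (cells tagged with a tier number) into simple
        # 3D blocks. The cell format and tier height are hypothetical placeholders.
        from dataclasses import dataclass

        TIER_HEIGHT = 0.5   # assumed vertical rise per tier, in model units

        @dataclass
        class Block3D:
            x: float
            y: float
            z: float        # bottom elevation, derived from the tier index
            kind: str       # e.g. "cell" or "intermediate" element

        def lift_plan(plan_cells):
            """plan_cells: iterable of (x, y, tier, kind) tuples read from a 2D plan."""
            return [Block3D(x, y, tier * TIER_HEIGHT, kind)
                    for x, y, tier, kind in plan_cells]

        blocks = lift_plan([(0.0, 0.0, 0, "cell"), (1.0, 0.0, 1, "intermediate")])
        print(blocks)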