
    Spatial optimization for land use allocation: accounting for sustainability concerns

    Land-use allocation has long been an important area of research in regional science. Land-use patterns are fundamental to the functioning of the biosphere, creating interactions that have substantial impacts on the environment. The spatial arrangement of land uses also has implications for activity and travel within a region. Balancing development, economic growth, social interaction, and the protection of the natural environment is at the heart of long-term sustainability. Since land-use patterns are spatially explicit in nature, planning and management must integrate geographical information systems and spatial optimization in meaningful ways if efficiency goals and objectives are to be achieved. This article reviews spatial optimization approaches that have been relied upon to support land-use planning. Characteristics of sustainable land use, particularly compactness, contiguity, and compatibility, are discussed, and the ways in which spatial optimization techniques have addressed these characteristics are detailed. In particular, the objectives and constraints used in spatial optimization approaches are examined
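    As an illustration of the kind of formulation reviewed here, the following is a minimal sketch (not taken from the article) of land-use allocation as an integer program: each raster cell receives exactly one use, total suitability is maximized, per-use area targets are met, and a compatibility constraint forbids placing incompatible uses on adjacent cells. The grid, scores, and use of the PuLP modelling library are illustrative assumptions, and the compactness and contiguity terms discussed in the review are omitted for brevity.

        # Hedged sketch: land-use allocation as an integer program (illustrative data).
        import pulp

        cells = [(r, c) for r in range(4) for c in range(4)]            # 4x4 raster
        uses = ["residential", "industrial", "green"]
        suit = {(cell, u): (cell[0] + 2 * cell[1] + len(u)) % 10        # stand-in suitability scores
                for cell in cells for u in uses}
        area_target = {"residential": 6, "industrial": 4, "green": 6}   # cells required per use
        incompatible = [("residential", "industrial")]                  # uses that may not be adjacent

        def neighbours(cell):
            r, c = cell
            return [n for n in [(r + 1, c), (r, c + 1)] if n in cells]  # each adjacent pair counted once

        x = {(cell, u): pulp.LpVariable(f"x_{cell[0]}_{cell[1]}_{u}", cat="Binary")
             for cell in cells for u in uses}
        prob = pulp.LpProblem("land_use_allocation", pulp.LpMaximize)
        prob += pulp.lpSum(suit[(cell, u)] * x[(cell, u)] for cell in cells for u in uses)

        for cell in cells:                                               # exactly one use per cell
            prob += pulp.lpSum(x[(cell, u)] for u in uses) == 1
        for u in uses:                                                   # area targets
            prob += pulp.lpSum(x[(cell, u)] for cell in cells) == area_target[u]
        for cell in cells:                                               # compatibility constraints
            for n in neighbours(cell):
                for u1, u2 in incompatible:
                    prob += x[(cell, u1)] + x[(n, u2)] <= 1
                    prob += x[(cell, u2)] + x[(n, u1)] <= 1

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        plan = {cell: u for cell in cells for u in uses if x[(cell, u)].value() == 1}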

    Design, construction and commissioning of the Thermal Screen Control System for the CMS Tracker detector at CERN

    The CERN (European Organization for Nuclear Research) laboratory is currently building the Large Hadron Collider (LHC). Four international collaborations have designed (and are now constructing) detectors able to exploit the physics potential of this collider. Among them is the Compact Muon Solenoid (CMS), a general-purpose detector optimized for the search for the Higgs boson and for physics beyond the Standard Model of fundamental interactions between elementary particles. This thesis presents, in particular, the design, construction, commissioning and testing of the control system for a screen that provides thermal separation between the Tracker and the ECAL (Electromagnetic CALorimeter) detector of CMS. Chapter 1 introduces the new challenges posed by these installations and deals in more detail with the Tracker detector of CMS. The size of current high-energy physics experiments is comparable to that of a small industrial plant: the techniques used for controls and regulation, although highly customized, must therefore adopt Commercial Off The Shelf (COTS) hardware and software. The "slow control" systems for the experiments at CERN make extensive use of PLCs (Programmable Logic Controllers) and SCADA (Supervisory Control and Data Acquisition) to provide safety levels (namely interlocks), regulation, remote control of high- and low-voltage distribution, as well as archiving and trending facilities. The system described in this thesis must follow the same philosophy and, at the same time, comply with international engineering standards. While the interlock applications belong straightforwardly to the category of DES (Discrete Event Systems), and are therefore treated with a Finite State Machine approach, other controls are more strictly related to the regulation problem. Chapter 2 focuses on various aspects of modern process control and on the tools used to design the control system for the thermal screen: it explains the principles by which the controller is designed and tuned and the model validated, including the Multiple Input-Multiple Output (MIMO) issues. The thermal screen itself, its constraints and the basis of its functioning are described in Chapter 3, where the thermodynamic design is discussed as well. For the LHC experiments, the aim of a control system is also to provide a well-defined SIL (Safety Integrity Level) to keep the system in a safe condition; in this case, it is also necessary to regulate the temperature of the system within certain values and respect the constraints arising from the specific needs of the above-mentioned subsystems. The most natural choice for a PLC-based controller is a PID (Proportional Integral Derivative) controller. This kind of controller is widely used in many industrial processes, from batch production in the pharmaceutical or automotive field to chemical plants, distillation columns and, in general, wherever reliable and robust control is needed. Many techniques are in use to design and tune PID controllers; the approach followed in this thesis is that of black-box modeling: the system is modeled in the time domain, a transfer function is inferred and a controller is designed. A system identification procedure then allows for a more thorough study and validation of the model, and for the controller tuning. The design of the thermal screen control system, including system modeling, controller design and MIMO implementation issues, is covered in Chapter 4.
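    As a rough illustration of the black-box approach described above (not the thesis code), the sketch below simulates a discrete PID loop acting on a first-order-plus-dead-time (FOPDT) process, the kind of transfer-function model typically inferred from time-domain step-response data; all gains and process parameters are assumed values chosen only for the example.

        # Hedged sketch: discrete PID control of an assumed FOPDT process model.
        K, tau, theta = 2.0, 50.0, 5.0           # process gain, time constant, dead time [s]
        Kp, Ki, Kd = 1.2, 0.05, 0.0              # illustrative PID gains
        dt, t_end, setpoint = 1.0, 300.0, 15.0   # sample time [s], horizon [s], target value

        delay_steps = int(theta / dt)
        u_buffer = [0.0] * delay_steps           # buffer implementing the dead time
        y, integral, prev_err = 0.0, 0.0, 0.0

        for _ in range(int(t_end / dt)):
            err = setpoint - y
            integral += err * dt
            derivative = (err - prev_err) / dt
            u = Kp * err + Ki * integral + Kd * derivative    # PID control law
            prev_err = err

            u_buffer.append(u)
            u_delayed = u_buffer.pop(0)          # input applied theta seconds ago
            # Explicit Euler step of the first-order dynamics tau*dy/dt = -y + K*u(t - theta)
            y += dt * (-y + K * u_delayed) / tau

        print(f"output after {t_end:.0f} s: {y:.2f} (setpoint {setpoint})")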
A systems engineering methodology has been followed throughout to adequately manage and document every phase of the project, complying with time and budget constraints. A risk analysis has been performed, using Layer of Protection Analysis (LOPA) and Hazard and Operability Studies (HAZOP), to understand the level of protection assured by the thermal screen and its control components. The tests planned and then performed to validate the model and for quality assurance purposes are described in Chapter 5. A climatic chamber has been designed and built at CERN, in which the real operating conditions of the thermal screen are simulated. Detailed test procedures have been defined, following IEEE standards, in order to completely check every single thermal screen panel. This installation allows for a comparison of different controller tuning approaches, including IAE minimization, Skogestad tuning rules, Internal Model Control (IMC), and a technique based upon the MATLAB Optimization Toolbox. The installation is also used for system identification purposes and for the acceptance tests of every thermal screen panel (allowing for both electrical and hydraulic checks). Tests have also been performed in the CERN West Hall experimental area, where a full control system has been set up to interlock the high- and low-voltage lines. The operating procedures and behaviour of the interlock system have been validated under real operating conditions of the detector exposed to a particle beam. The satisfactory results of these tests take the project to full completion, allowing the plan to reach the "exit" stage, when the thermal screen is ready to be installed in the Tracker and ready to be operational
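    Among the tuning approaches compared in the climatic-chamber tests, the Skogestad (SIMC) rules are simple enough to state compactly. The sketch below applies the standard SIMC PI rule to an assumed FOPDT model K*exp(-theta*s)/(tau*s + 1); it is only an illustration of the rule, not the thesis implementation, and the model parameters are made up.

        # Hedged sketch: Skogestad SIMC PI tuning from assumed FOPDT model parameters.
        def simc_pi(K, tau, theta, tau_c=None):
            """Return (Kc, Ti) from the SIMC rule; tau_c defaults to the dead time theta."""
            if tau_c is None:
                tau_c = theta                       # Skogestad's "tight control" choice
            Kc = tau / (K * (tau_c + theta))        # proportional gain
            Ti = min(tau, 4.0 * (tau_c + theta))    # integral time
            return Kc, Ti

        Kc, Ti = simc_pi(K=2.0, tau=50.0, theta=5.0)
        print(f"SIMC PI tuning: Kc = {Kc:.2f}, Ti = {Ti:.1f} s")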

    Simulation-Based Evaluation and Optimization of Control Strategies in Buildings

    Over the last several years, a great amount of research work has been focused on the development of model predictive control techniques for the indoor climate control of buildings, but, despite the promising results, this technology is still not adopted by the industry. One of the main reasons for this is the increased cost associated with the development and calibration (or identification) of mathematical models of special structure used for predicting future states of the building. We propose a methodology to overcome this obstacle by replacing these hand-engineered mathematical models with a thermal simulation model of the building developed using detailed thermal simulation engines such as EnergyPlus. As designing better controllers requires interacting with the simulation model, a central part of our methodology is the control improvement (or optimisation) module, facilitating two simulation-based control improvement methodologies: one based on multi-criteria decision analysis methods and the other based on state-space identification of dynamical systems using Gaussian process models and reinforcement learning. We evaluate the proposed methodology in a set of simulation-based experiments using the thermal simulation model of a real building located in Portugal. Our results indicate that the proposed methodology could be a viable alternative to model predictive control-based supervisory control in buildings. Research leading to these results has been partially supported by the Modelling Optimization of Energy Efficiency in Buildings for Urban Sustainability (MOEEBIUS) project. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 680517. Georgios Giannakis and Dimitrios Rovas gratefully acknowledge financial support from the European Commission H2020-EeB5-2015 project “Optimised Energy Efficient Design Platform for Refurbishment at District Level” under Contract #680676 (OptEEmAL). Georgios Kontes and Christopher Mutschler gratefully acknowledge financial support from the Federal Ministry of Education and Research of Germany in the framework of the Machine Learning Forum (grant number 01IS17071). Georgios Kontes, Natalia Panagiotidou, Simone Steiger and Gunnar Gruen gratefully acknowledge use of the services and facilities of the Energie Campus Nürnberg. The APC was funded by the MOEEBIUS project. This paper reflects only the authors’ views and the Commission is not responsible for any use that may be made of the information contained therein
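    The control improvement loop described above can be sketched roughly as follows; the simulate() function here is a stand-in for a call into an EnergyPlus co-simulation, the random-search strategy stands in for the multi-criteria and reinforcement-learning machinery of the paper, and all names and numbers are illustrative assumptions rather than the authors' implementation.

        # Hedged sketch: simulation-based improvement of a supervisory setpoint schedule.
        import random
        random.seed(0)

        def simulate(setpoints):
            """Stand-in for a building thermal simulation run: returns (energy, discomfort)."""
            energy = sum(max(0.0, sp - 16.0) * 1.5 for sp in setpoints)   # rough heating effort
            discomfort = sum(max(0.0, 21.0 - sp) for sp in setpoints)     # degree-hours below comfort
            return energy, discomfort

        def score(energy, discomfort, w_energy=1.0, w_comfort=5.0):
            """Scalarise the two criteria; the weights encode the decision-maker's preferences."""
            return w_energy * energy + w_comfort * discomfort

        best, best_score = None, float("inf")
        for _ in range(200):                                              # simple random search
            candidate = [random.uniform(18.0, 24.0) for _ in range(24)]   # hourly setpoints [C]
            s = score(*simulate(candidate))
            if s < best_score:
                best, best_score = candidate, s

        print(f"best score {best_score:.1f}, mean setpoint {sum(best) / 24:.1f} C")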

    Performance Evaluation of Parallel Haemodynamic Computations on Heterogeneous Clouds

    The article presents a performance evaluation of parallel haemodynamic flow computations on heterogeneous resources of the OpenStack cloud infrastructure. The main focus is on the parallel performance analysis, energy consumption and virtualization overhead of the developed software service, which is based on the ANSYS Fluent platform and runs in Docker containers of the private university cloud. The haemodynamic aortic valve flow described by the incompressible Navier-Stokes equations is considered as a target application of the hosted cloud infrastructure. The parallel performance of the developed software service is assessed by measuring the parallel speedup of computations carried out on virtualized heterogeneous resources. The performance measured on Docker containers is compared with that obtained by using the native hardware. Alternative solution algorithms are explored in terms of parallel performance and power consumption. The investigation of a trade-off between computing speed and consumed energy is performed by using Pareto front analysis and a linear scalarization method
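    The trade-off analysis mentioned above can be illustrated with a small sketch: given (runtime, energy) measurements for a handful of configurations, keep the Pareto-optimal ones and pick a single configuration by linear scalarisation of the normalised objectives. The configuration names and numbers below are made up for illustration and are not the article's measurements.

        # Hedged sketch: Pareto front extraction and linear scalarisation (illustrative data).
        runs = {                                   # configuration -> (runtime [s], energy [Wh])
            "4 cores":  (620.0, 95.0),
            "8 cores":  (340.0, 110.0),
            "16 cores": (210.0, 160.0),
            "32 cores": (215.0, 300.0),
        }

        def pareto_front(points):
            """Keep configurations that no other configuration beats on both objectives."""
            return {name: (t, e) for name, (t, e) in points.items()
                    if not any(t2 <= t and e2 <= e and (t2, e2) != (t, e)
                               for t2, e2 in points.values())}

        def scalarise(points, w_time=0.5, w_energy=0.5):
            """Pick the configuration minimising a weighted sum of min-max normalised objectives."""
            ts = [t for t, _ in points.values()]
            es = [e for _, e in points.values()]
            norm = lambda v, lo, hi: (v - lo) / (hi - lo) if hi > lo else 0.0
            return min(points, key=lambda n: w_time * norm(points[n][0], min(ts), max(ts))
                                             + w_energy * norm(points[n][1], min(es), max(es)))

        front = pareto_front(runs)
        print("Pareto-optimal configurations:", sorted(front))
        print("chosen by scalarisation:", scalarise(front))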

    GiD 2008. 4th Conference on advances and applications of GiD

    The extended use of simulation programs has relied on advances in user-friendly interfaces and in the capability to generate meshes for any generic complex geometry. More than ten years of development have made GiD grow to become one of the more popular pre- and postprocessing systems at the international level. The constant dialogue between the GiD development team and the users has guided the development of GiD to cover the pre- and postprocessing needs of many disciplines in science and engineering. Following this philosophy, the biannual GiD Conference has become an important forum for discussion and interchange of experiences among the GiD community. This monograph includes the contributions of the participants to the fourth edition of the GiD Conference, held on the island of Ibiza on 8-9 May 2008

    Bioinformatics challenges for genome-wide association studies

    Motivation: The sequencing of the human genome has made it possible to identify an informative set of >1 million single nucleotide polymorphisms (SNPs) across the genome that can be used to carry out genome-wide association studies (GWASs). The availability of massive amounts of GWAS data has necessitated the development of new biostatistical methods for quality control, imputation and analysis issues including multiple testing. This work has been successful and has enabled the discovery of new associations that have been replicated in multiple studies. However, it is now recognized that most SNPs discovered via GWAS have small effects on disease susceptibility and thus may not be suitable for improving health care through genetic testing. One likely explanation for the mixed results of GWAS is that the current biostatistical analysis paradigm is by design agnostic or unbiased in that it ignores all prior knowledge about disease pathobiology. Further, the linear modeling framework that is employed in GWAS often considers only one SNP at a time thus ignoring their genomic and environmental context. There is now a shift away from the biostatistical approach toward a more holistic approach that recognizes the complexity of the genotype–phenotype relationship that is characterized by significant heterogeneity and gene–gene and gene–environment interaction. We argue here that bioinformatics has an important role to play in addressing the complexity of the underlying genetic basis of common human diseases. The goal of this review is to identify and discuss those GWAS challenges that will require computational methods
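    The single-SNP paradigm criticised above can be made concrete with a short sketch: each SNP is tested independently against case/control status with a chi-square test on its genotype table, and significance is Bonferroni-adjusted for the number of tests. The data below are random stand-ins, and the use of NumPy/SciPy is an assumption for illustration rather than part of the review.

        # Hedged sketch: one-SNP-at-a-time association testing with Bonferroni correction.
        import numpy as np
        from scipy.stats import chi2_contingency

        rng = np.random.default_rng(42)
        n_samples, n_snps = 1000, 500
        genotypes = rng.integers(0, 3, size=(n_samples, n_snps))   # 0/1/2 minor-allele counts
        phenotype = rng.integers(0, 2, size=n_samples)             # 0 = control, 1 = case

        alpha = 0.05 / n_snps                                      # Bonferroni-adjusted threshold
        hits = []
        for j in range(n_snps):
            # 2x3 contingency table: rows = phenotype class, columns = genotype class
            table = np.array([[np.sum((phenotype == p) & (genotypes[:, j] == g))
                               for g in (0, 1, 2)] for p in (0, 1)])
            if (table == 0).any():                                 # skip degenerate tables
                continue
            _, p_value, _, _ = chi2_contingency(table)
            if p_value < alpha:
                hits.append((j, p_value))

        print(f"{len(hits)} of {n_snps} SNPs pass the Bonferroni threshold {alpha:.1e}")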