
    An Application Perspective on High-Performance Computing and Communications

    We review possible and probable industrial applications of HPCC, focusing on the software and hardware issues. Thirty-three separate categories are illustrated by detailed descriptions of five areas: computational chemistry; Monte Carlo methods from physics to economics; manufacturing and computational fluid dynamics; command and control, or crisis management; and multimedia services to client computers and settop boxes. The hardware varies from tightly coupled parallel supercomputers to heterogeneous distributed systems. The software models span HPF and data parallelism to distributed information systems and object/data flow parallelism on the Web. We find that in each case it is reasonably clear that HPCC works in principle, and postulate that this knowledge can be used in a new generation of software infrastructure based on the WebWindows approach, discussed in an accompanying paper.

    Improve the Low Energy Sensitivity of the HAWC Observatory

    The High Altitude Water Cherenkov (HAWC) gamma-ray observatory has been fully operational since March 2015 in Mexico, at 4,100 meters above sea level on the hillside of the Sierra Negra volcano. It consists of an array of 300 water Cherenkov detectors, each equipped with four photomultiplier tubes. HAWC operates 24 hours per day with a wide field of view (FOV, ∼ 2 sr) and a high duty cycle (∼ 95%). These properties make it a powerful survey and monitoring experiment for mapping the gamma-ray sky at very high energies (VHE, 100 GeV to 100 TeV) and for studying sources with varying intensities. HAWC is thus well suited to detect gamma-ray counterparts of possible flaring sources seen in neutrino events observed by IceCube or in gravitational-wave events observed by LIGO/Virgo. Extragalactic sources, including active galactic nuclei and gamma-ray bursts, are characterized by power-law spectra with most of the observed photon flux at 1 TeV and below, which corresponds to the lower end of HAWC’s energy range. To participate in this science it is essential to optimize HAWC’s performance for gamma rays below ∼ 1 TeV. This is a particular challenge because in HAWC such events have a low signal-to-noise ratio, carry limited and incomplete information, and are not modeled well in all respects by the HAWC Monte Carlo simulation. Fortunately, HAWC data include a well-characterized gamma-ray source, the Crab Nebula, so we use the significance level and the angular resolution of the Crab to quantify our improvements in gamma-ray detection sensitivity. Two critical factors are involved: the interpretation of HAWC raw detector signals (referred to as data reconstruction) and the rejection of the cosmic-ray background (referred to as gamma/hadron separation). While this thesis focuses on different optimizations for (hadron) background rejection, both factors are addressed. An example of applying one improved analysis to searches for nearby AGNs is presented.
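    The Crab-based figure of merit mentioned above is an on/off-source detection significance; the abstract does not spell out the estimator, but ground-based gamma-ray experiments conventionally use Equation 17 of Li & Ma (1983). A minimal sketch assuming that convention, with purely hypothetical on/off counts:

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), Eq. 17: significance of an on-source excess.

    n_on  -- counts in the on-source (e.g. Crab) region
    n_off -- counts in the off-source (background) region
    alpha -- ratio of on-source to off-source exposure
    """
    term_on = n_on * math.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1 + alpha) * n_off / (n_on + n_off))
    return math.sqrt(2.0 * (term_on + term_off))

# Hypothetical numbers: tighter gamma/hadron cuts shrink n_off much
# faster than n_on, raising the significance of the same source.
print(li_ma_significance(n_on=1200, n_off=10000, alpha=0.1))  # ~5.8 sigma
print(li_ma_significance(n_on=900, n_off=2000, alpha=0.1))    # ~33 sigma
```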

    Development of a prototype for the automated generation of timetable scenarios specified by the transport service intention

    Within the next 5 to 10 years, public transport in Switzerland, as in other European countries, will experience major technological and organisational changes. Changes will also take place on the customer side, resulting in different mobility behaviour and demand patterns. These changes will create additional challenges for transport service providers in both the private and public domains. Time to market will be a key success factor, and the speed and flexibility of business processes in the freight as well as the passenger transport industry will therefore have to increase significantly. Within the railway value chain (line planning, timetabling, vehicle scheduling, etc.), the coordination of the individual planning steps is a key success factor. SBB, the leading public transport service provider in Switzerland, has recognised this challenge and, together with various partners, initiated the strategic project Smart Rail 4.0. The ZHAW, and especially the Institute for Data Analysis and Process Design (IDP) of the School of Engineering, wants to be part of this transformation process and to contribute through research and educational activities. The IDP research therefore aims to transfer academic and scientific know-how into practical applicability. In a first step this directly concerns the current Smart Rail 4.0 TMS-PAS project activities, which concentrate on timetabling issues. The IDP project team considers the integration of the line planning and timetabling processes as crucial for practical applications. To address this in the current research project, we present an application concept that enables the integration of these two major process steps in the transport service value chain. Although our research shows that the technical requirements for integrating the process can be satisfied, the rules and conditions for closer cooperation between the involved business units, the train operating companies and the infrastructure operating company still have to be worked out in more detail. In addition to a detailed application concept with use cases for the timetabling process, we propose a methodology for computer-aided timetable generation based on the central planning object known as the ‘service intention’. The service intention can be used to iteratively develop the timetable through a ‘progressive feasibility assessment’, a feature that is requested in practice. Our proposed model is based on the ‘track-choice’ and line-rotation extensions of PESP, the commonly used method for generating periodic event schedules. The extension makes use of the track infrastructure representation that is also used in the line planning and timetabling system Viriato, which is widely used by public transport planners and operators. With the help of Viriato it is rather easy to configure the timetabling problem in sufficient detail; at the same time, the level of detail of the considered data is light enough to algorithmically solve practical timetabling problems of realistic size. Taking into account the technical and operational constraints given by rolling stock, station and track topology data on the one hand, and the commercial requirements defined by a given line concept on the other, the presented method generates periodic timetables including train-track assignments. In the first step, the standardised data structure ‘service intention’ represents the line concept, consisting of train paths and frequencies. Because infrastructure-based track capacities are used, we are also able to assess the feasibility of the given line concept. Additionally, the method can handle temporary resource restrictions (e.g. those caused by construction sites or operational disturbances). To assess the performance of the resulting timetable, we present a framework for performance measurement that addresses customer convenience (in terms of start-to-end travel time) as well as operational stability requirements (in terms of delay sensitivity and critical relations).
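    The PESP core that this methodology builds on can be stated compactly: with period T, each event i gets a potential π_i ∈ [0, T), and each constraint (i, j) requires the periodic difference (π_j − π_i) mod T to lie in a window [l_ij, u_ij]. A minimal feasibility check in Python, with illustrative event names and bounds (the track-choice and line-rotation extensions add further variables not sketched here):

```python
T = 60  # period in minutes, e.g. an hourly timetable

# Potentials pi[i]: one candidate timetable (event times modulo T).
pi = {"dep_A": 5, "arr_B": 32, "dep_B": 36}

# PESP constraints: (i, j, lower, upper) means
# lower <= (pi[j] - pi[i]) mod T <= upper.
constraints = [
    ("dep_A", "arr_B", 25, 29),  # run time A -> B
    ("arr_B", "dep_B", 2, 8),    # dwell time at B
]

def feasible(pi, constraints, T):
    """Check all periodic interval constraints of a PESP instance."""
    for i, j, lo, hi in constraints:
        if not lo <= (pi[j] - pi[i]) % T <= hi:
            return False
    return True

print(feasible(pi, constraints, T))  # True for this candidate
```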

    Tsunamis from source to coast

    Tsunami disasters pose a significant threat to coastal communities. In recent decades, tsunamis have caused enormous destruction and more than 250,000 fatalities. International efforts have led to significant advances in tsunami science and research, but recent events demonstrated some limitations. It is therefore essential to increase our knowledge of the source-to-coast tsunami phenomenon. A better understanding of potential tectonic structures and other generation mechanisms is needed, especially in complex geologic domains or where sources are unknown. Furthermore, we need to improve Tsunami Warning Systems (TWSs) to provide timely alerts for communities in the near field. Therefore, potential tsunamigenic sources in the diffuse plate boundary setting and the near field of the southwest Iberian margin (SWIM) are investigated. For the March 31, 1761, transatlantic tsunami, numerical modelling has been used to propose a structure that agrees with tsunami travel times, tsunami observations, macroseismic data, and kinematic plate modelling. Since a tsunami was described for the November 11, 1858, Setúbal earthquake, its source has been investigated using macroseismic analysis. The analysis suggests a local structure in a compressive regime with weak to moderate tsunamigenic potential. Future tsunami hazard assessments need to include the sources of the investigated events. To quickly estimate the tsunami impact, the Tsunami Runup Predictor (TRP), an empirical source-to-coast method that instantly provides first-order estimates of the tsunami runup based on waveform parameters, has been developed. The TRP is helpful for emergency managers and for evacuation planning for near-field events. Moreover, the author of this thesis contributed to the tsunami impact assessment of the September 28, 2018, Palu tsunami, where tsunamis generated by multiple sources caused runup heights of up to 9.2 m. For local sources, however, tsunami warning remains challenging; communities therefore need to be prepared to respond appropriately to earthquakes and tsunamis with or without warning.
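    The travel-time matching used for events such as the 1761 tsunami rests on the long-wave approximation, under which a tsunami crosses water of depth h at speed c = √(gh). A minimal illustration of that relation (the depth values are hypothetical, not from the thesis):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def long_wave_speed(depth_m):
    """Shallow-water (long-wave) tsunami propagation speed in m/s."""
    return math.sqrt(G * depth_m)

# In ~4000 m deep open ocean a tsunami moves at jet-aircraft speed;
# it slows dramatically (and steepens) over a shallow shelf.
for depth in (4000, 200, 10):
    c = long_wave_speed(depth)
    print(f"depth {depth:5d} m -> {c:6.1f} m/s ({c * 3.6:6.1f} km/h)")
```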

    European and National Contexts in Scientific Research

    This electronic collection, “European and National Contexts in Scientific Research. Technology” (issued in English as “National and European Dimension in Research. Technology”), presents the work of young researchers in geodesy and cartography, chemical technology and mechanical engineering, information technology, civil engineering, and radio engineering. It is intended for professionals in education, research, and industry, and will also be useful to university undergraduate, graduate, and postgraduate students.

    Safety system design optimisation

    This thesis investigates the efficiency of a design optimisation scheme appropriate for systems which require a high likelihood of functioning on demand. Traditional approaches to the design of safety-critical systems follow the preliminary design, analysis, appraisal and redesign stages until what is regarded as an acceptable design is achieved. For safety systems whose failure could result in loss of life, it is imperative that the best use of the available resources is made and that a system which is optimal, not just adequate, is produced. The objective of the design optimisation problem is to minimise system unavailability through manipulation of the design variables, such that the limitations placed on them by constraints are not violated. Commonly, a mathematical optimisation problem has an explicit objective function which defines how the characteristic to be minimised is related to the variables. For the safety system problem, an explicit objective function cannot be formulated, and system performance is instead assessed using the fault tree method. By the use of house events, a single fault tree is constructed to represent the failure causes of each potential design, overcoming the time-consuming task of constructing a fault tree for each design investigated during the optimisation procedure. Once the fault tree has been constructed for the design in question, it is converted to a binary decision diagram (BDD) for analysis. A genetic algorithm is first employed to perform the system optimisation; the practicality of this approach is demonstrated initially through application to a High-Integrity Protection System (HIPS) and subsequently to a more complex Firewater Deluge System (FDS). An alternative optimisation scheme achieves the final design specification by solving a sequence of optimisation problems, each defined by assuming some form of the objective function and specifying a sub-region of the design space over which this function will be representative of the system unavailability. The thesis concludes with attention to various optimisation techniques which possess features able to address difficulties in the optimisation of safety-critical systems; specifically, consideration is given to the use of statistically designed experiments and a logical search approach.
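    The genetic-algorithm loop described above can be sketched compactly: candidate designs are bit strings of design choices, each is scored by the unavailability returned from the fault tree/BDD analysis, and selection, crossover and mutation breed the next generation. A minimal sketch with a stand-in fitness function (the real scheme evaluates the BDD; `unavailability` here is purely hypothetical, and constraint handling is omitted):

```python
import random

N_BITS = 12        # encoded design choices (component types, redundancy, ...)
POP, GENS = 30, 40

def unavailability(design):
    """Stand-in for the fault tree/BDD evaluation of a candidate design.

    Purely hypothetical: pretends each set bit reduces a failure
    contribution. The real objective comes from BDD analysis.
    """
    return 1.0 / (1 + sum(design))

def mutate(design, rate=0.05):
    return [b ^ (random.random() < rate) for b in design]

def crossover(a, b):
    cut = random.randrange(1, N_BITS)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=unavailability)           # lower unavailability is fitter
    parents = pop[: POP // 2]              # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = min(pop, key=unavailability)
print(best, unavailability(best))
```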

    Synthetic Aperture Radar (SAR) Meets Deep Learning

    This reprint focuses on applications that combine synthetic aperture radar and deep learning technology, and aims to further promote the development of intelligent SAR image interpretation. Synthetic aperture radar (SAR) is an important active microwave imaging sensor whose all-day and all-weather imaging capability gives it an important place in the remote sensing community. Since the United States launched the first SAR satellite, SAR has received much attention in remote sensing, e.g., in geological exploration, topographic mapping, disaster forecasting, and traffic monitoring. It is therefore valuable and meaningful to study SAR-based remote sensing applications. In recent years, deep learning, represented by convolutional neural networks, has driven significant progress in the computer vision community, e.g., in face recognition, autonomous driving, and the Internet of Things (IoT). Deep learning enables computational models with multiple processing layers to learn data representations at multiple levels of abstraction, which can greatly improve the performance of various applications. This reprint provides a platform for researchers to tackle these significant challenges and to present innovative and cutting-edge research results from applying deep learning to SAR, in various manuscript types, e.g., articles, letters, reviews and technical reports.
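    As a concrete illustration of the kind of model the reprint's papers build on, here is a minimal convolutional classifier for single-channel SAR image chips in PyTorch; the input size (64×64) and the ten target classes are hypothetical placeholders, not taken from any paper in the collection:

```python
import torch
import torch.nn as nn

class SARClassifier(nn.Module):
    """Minimal CNN for 1-channel (amplitude) SAR chips, e.g. 64x64 pixels."""

    def __init__(self, n_classes=10):  # 10 classes is a hypothetical choice
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 32x32 -> 16x16
        )
        self.head = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# A batch of 4 hypothetical SAR chips.
logits = SARClassifier()(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 10])
```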