53 research outputs found

    The first pieces of the gravitational-wave progenitor population puzzle

    The field of gravitational-wave astronomy is rapidly unfolding; between the start and finish of this thesis, the Gravitational-Wave Transient Catalog has grown from order 10 to almost 100 merging double compact objects (pairs of merging black holes and neutron stars). The larger sample size has, for the first time, allowed us to infer properties of the entire population, rather than just individual sources. The observed population properties serve as the initial pieces of the “progenitor population puzzle”, which provides new insight into the question: ‘How do merging double compact objects form?’ In this thesis, we set out to use the first pieces of this puzzle to form a picture of the massive stellar progenitors that give rise to merging double compact objects. We apply a combination of numerical population-synthesis models and analytical models to build intuition for the complex phenomena involved. Specifically, we show why the locations of features in the mass distribution of merging binary black holes are especially promising for constraining the physics of isolated binary stars. With this knowledge in mind, we explore features related to both the smallest- and the largest-mass black holes formed from massive stars. We provide the first theoretical explanation for why lower-mass double compact objects (i.e., binary neutron stars) form through different formation channels than higher-mass systems (i.e., binary black holes), and produce testable predictions for the evolution of the merger rate with redshift. We conclude by looking ahead to the observational landscape of the next 20 years.

    Synthetic Aperture Radar (SAR) Meets Deep Learning

    This reprint focuses on the application of synthetic aperture radar combined with deep learning technology. It aims to further promote the development of SAR image intelligent interpretation technology. A synthetic aperture radar (SAR) is an important active microwave imaging sensor, whose all-day, all-weather imaging capability gives it an important place in the remote sensing community. Since the United States launched the first SAR satellite, SAR has received much attention in the remote sensing community, e.g., in geological exploration, topographic mapping, disaster forecasting, and traffic monitoring. It is therefore valuable and meaningful to study SAR-based remote sensing applications. In recent years, deep learning, represented by convolutional neural networks, has driven significant progress in the computer vision community, e.g., in face recognition, driverless vehicles, and the Internet of Things (IoT). Deep learning enables computational models with multiple processing layers to learn data representations at multiple levels of abstraction, which can greatly improve the performance of various applications. This reprint provides a platform for researchers to tackle these significant challenges and present their innovative and cutting-edge research results when applying deep learning to SAR, in various manuscript types, e.g., articles, letters, reviews, and technical reports.

    Tradition and Innovation in Construction Project Management

    This book is a reprint of the Special Issue 'Tradition and Innovation in Construction Project Management', which was published in the journal Buildings.

    Automatic characterization and generation of music loops and instrument samples for electronic music production

    Repurposing audio material to create new music - also known as sampling - was a foundation of electronic music and remains a fundamental component of this practice. Currently, large-scale audio databases offer vast collections of material for users to work with. Navigation of these databases relies heavily on hierarchical tree directories; consequently, sound retrieval is tiresome and often identified as an undesired interruption in the creative process. We address two fundamental methods for navigating sounds: characterization and generation. Characterizing loops and one-shots in terms of instruments or instrumentation allows for organizing unstructured collections and faster retrieval for music-making. The generation of loops and one-shot sounds enables the creation of sounds not present in an audio collection through interpolation or modification of the existing material. To achieve this, we employ deep-learning-based, data-driven methodologies for classification and generation.

    Numerical methods for control-based continuation of relaxation oscillations

    This is the final version, available on open access from Springer via the DOI in this record. Data Availability Statement: data sharing is not applicable to this article, as no datasets were generated or analysed during the current study. Control-based continuation (CBC) is an experimental method that can reveal stable and unstable dynamics of physical systems. It extends the path-following principles of numerical continuation to experiments and provides systematic dynamical analyses without the need for mathematical modelling. CBC has seen considerable success in studying the bifurcation structure of mechanical systems. Nevertheless, the method is not practical for studying relaxation oscillations: large numbers of Fourier modes are required to describe them, and the length of the experiment increases significantly when many Fourier modes are used, as the system must be run to convergence many times. Furthermore, relaxation oscillations often arise in autonomous systems, for which an appropriate phase constraint is required. To overcome these challenges, we introduce an adaptive B-spline discretisation that can produce a parsimonious description of responses that would otherwise require many Fourier modes. We couple this to a novel phase constraint that phase-locks the control target and solution phase. Results are demonstrated on simulations of a slow-fast synthetic gene network and an Oregonator model. Our methods extend CBC to a much broader range of systems than have been studied so far, opening up a range of novel experimental opportunities on slow-fast systems. Funded by the Engineering and Physical Sciences Research Council (EPSRC), European Union Horizon 2020, and the Royal Academy of Engineering (RAE).
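
    The claim that relaxation-type waveforms need many Fourier modes can be illustrated with a minimal NumPy sketch (the square-like test signal is an illustrative stand-in for a fast-switching oscillation, not the paper's CBC algorithm or its adaptive B-spline scheme):

```python
# Illustrative sketch: the RMS error of a truncated Fourier reconstruction
# of a sharp, square-like waveform decays slowly with the number of modes,
# which is why spline discretisations can be far more parsimonious.
import numpy as np

t = np.linspace(0, 2 * np.pi, 2048, endpoint=False)
signal = np.sign(np.sin(t))  # idealised fast-switching oscillation

def fourier_truncation_error(signal, n_modes):
    """RMS error when keeping only the mean and the first n_modes harmonics."""
    coeffs = np.fft.rfft(signal)
    coeffs[n_modes + 1:] = 0.0          # discard all higher harmonics
    recon = np.fft.irfft(coeffs, n=len(signal))
    return float(np.sqrt(np.mean((signal - recon) ** 2)))

for n in (5, 25, 125):
    print(n, fourier_truncation_error(signal, n))
```

    Even with 125 harmonics the reconstruction error is still noticeable, whereas a smooth sinusoid would be captured exactly by a single mode; a piecewise-polynomial (B-spline) description places its flexibility only where the waveform actually varies quickly.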

    Computer Aided Verification

    This open access two-volume set, LNCS 13371 and 13372, constitutes the refereed proceedings of the 34th International Conference on Computer Aided Verification, CAV 2022, which was held in Haifa, Israel, in August 2022. The 40 full papers presented together with 9 tool papers and 2 case studies were carefully reviewed and selected from 209 submissions. The papers were organized in the following topical sections: Part I: invited papers; formal methods for probabilistic programs; formal methods for neural networks; software verification and model checking; hyperproperties and security; formal methods for hardware, cyber-physical, and hybrid systems. Part II: probabilistic techniques; automata and logic; deductive verification and decision procedures; machine learning; synthesis and concurrency. This is an open access book.

    Z-Numbers-Based Approach to Hotel Service Quality Assessment

    In this study, we analyze the possibility of using Z-numbers for measuring service quality and for decision-making about quality improvement in the hotel industry. Techniques used for these purposes are based on consumer evaluations - expectations and perceptions. As a rule, these evaluations are expressed as crisp numbers (a Likert scale) or fuzzy estimates. However, descriptions of respondent opinions based on the crisp- or fuzzy-number formalism are not always adequate: the existing methods do not take into account the degree of confidence respondents have in their assessments. A fuzzy approach better describes the uncertainties associated with human perceptions and expectations, and linguistic values are more acceptable than crisp numbers. To capture the subjective nature of both the service quality estimates and the degree of confidence in them, two-component Z-numbers Z = (A, B) were used; Z-numbers express the opinions of consumers more adequately. The proposed, computationally efficient approach (Z-SERVQUAL, Z-IPA) allows one to determine the quality of services and to identify the factors that require improvement and the areas for further development. The suggested method was applied to evaluate service quality in small and medium-sized hotels in Turkey and Azerbaijan, as illustrated by an example.
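
    The two-component structure Z = (A, B) described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the triangular membership functions, the 1-5 rating scale, and all names and values are assumptions made for the example.

```python
# Minimal sketch of a two-component Z-number Z = (A, B):
# A is a fuzzy restriction on the rated value (e.g., service is "good"),
# B is a fuzzy measure of the respondent's confidence in that rating.
# Triangular membership functions are an illustrative assumption.

def triangular(a, b, c):
    """Return a triangular membership function with support [a, c] and peak b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

class ZNumber:
    def __init__(self, A, B):
        self.A = A  # fuzzy value, e.g. "good" on a 1-5 Likert-style scale
        self.B = B  # fuzzy confidence, e.g. "quite sure" on [0, 1]

# "Service is good" (around 4 on a 1-5 scale), held "quite sure" (around 0.8)
z = ZNumber(A=triangular(3, 4, 5), B=triangular(0.6, 0.8, 1.0))

print(z.A(4.0))  # membership of the value 4 in "good" -> 1.0
print(z.B(0.7))  # membership of confidence 0.7 in "quite sure" -> 0.5
```

    The point of the second component is visible here: two respondents giving the same rating A can differ in B, and that difference survives into the aggregation step of a Z-SERVQUAL-style analysis.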

    Sustainable Construction Engineering and Management

    This book is a printed edition of the Special Issue, which covers sustainability as an emerging requirement in the fields of construction management, project management, and engineering. We invited authors to submit their theoretical or experimental research articles addressing the challenges and opportunities for sustainable construction in all its facets, including technical topics and specific operational or procedural solutions, as well as strategic approaches aimed at the project, company, or industry level. Central to these developments are smart technologies and sophisticated decision-making mechanisms that augment sustainable outcomes. The Special Issue was received with great interest by the research community and attracted a high number of submissions. The selection process sought to balance a broad, representative spread of topics against research quality, with editors and reviewers settling on thirty-three articles for publication. The editors invite all participating researchers and those interested in sustainable construction engineering and management to read the summary of the Special Issue and, of course, to access the full-text articles provided in the book for deeper analysis.

    Probabilistic and Fuzzy Approaches for Estimating the Life Cycle Costs of Buildings

    The life cycle cost (LCC) method makes it possible to optimize the whole-life performance of buildings and other structures. The introduction of thinking in terms of a building's life cycle created the need for appropriate tools and techniques for assessing and analyzing costs throughout that life cycle. Traditionally, LCC estimates have been calculated from historical data using deterministic models. The concepts of probability theory can also be applied to life cycle costing, treating the costs and their timings as a stochastic process. If any subjectivity is introduced into the estimates, however, the uncertainty cannot be handled by probability theory alone; the theory of fuzzy sets is a valuable tool for handling such uncertainties. In this Special Issue, a collection of 11 contributions provides an updated overview of approaches for estimating the life cycle cost of buildings.
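
    The stochastic treatment of costs mentioned above can be sketched as a small Monte Carlo simulation. This is an illustrative example only: the initial cost, annual cost distribution, discount rate, and service life are hypothetical figures, not values from the Special Issue.

```python
# Illustrative Monte Carlo sketch of probabilistic life cycle costing:
# the annual operating cost is treated as a random variable and
# discounted to present value over the service life. All figures are
# hypothetical assumptions for the example.
import random

random.seed(42)

def lcc_sample(initial_cost, mean_annual, sd_annual, rate, years):
    """One sampled present-value life cycle cost."""
    total = initial_cost
    for t in range(1, years + 1):
        annual = random.gauss(mean_annual, sd_annual)  # uncertain yearly cost
        total += annual / (1 + rate) ** t              # discount to present value
    return total

samples = [lcc_sample(1_000_000, 50_000, 10_000, 0.03, 30) for _ in range(10_000)]
mean_lcc = sum(samples) / len(samples)
print(round(mean_lcc))  # near initial cost + mean annual cost x 30-year annuity factor
```

    Replacing the Gaussian draw with a fuzzy-number evaluation is where the fuzzy-set approaches discussed in the Issue depart from this purely probabilistic sketch: subjective estimates enter as membership functions rather than sampling distributions.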