    Proceedings of SIRM 2023 - The 15th European Conference on Rotordynamics

    It was our great honor and pleasure to host the SIRM Conference, after 2003 and 2011, for the third time in Darmstadt. Rotordynamics covers a huge variety of applications and challenges, all of which are in the scope of this conference. The conference was opened with a keynote lecture given by Rainer Nordmann, one of the three founders of SIRM (“Schwingungen in rotierenden Maschinen”). In total, 53 papers passed our strict review process and were presented, which impressively shows that rotordynamics is as relevant as ever. These contributions cover a very wide spectrum of session topics: fluid bearings and seals; air foil bearings; magnetic bearings; rotor-blade interaction; rotor-fluid interactions; unbalance and balancing; vibrations in turbomachines; vibration control; instability; electrical machines; monitoring, identification and diagnosis; advanced numerical tools and nonlinearities; as well as general rotordynamics. The international character of the conference has been significantly enhanced by the Scientific Board since the 14th SIRM, resulting on the one hand in an expanded Scientific Committee, which now consists of 31 members from 13 European countries, and on the other hand in the new name “European Conference on Rotordynamics”. This new international profile was also emphasized by the participants of the 15th SIRM, who came from 17 countries across three continents. The conference saw a lively dialogue between industry and academia: roughly one third of the papers were presented by industry and two thirds by academia, an excellent basis for the bidirectional transfer we call xchange at Technical University of Darmstadt. We also want to give our special thanks to the eleven industry sponsors for their great support of the conference.
On behalf of the Darmstadt Local Committee, I welcome you to read the papers of the 15th SIRM, which offer further insight into the topics and presentations.

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment requires full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing; the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW), modified to work as a Groove Gap Waveguide with radiating slots etched on the upper broad wall, which radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to place several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
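As a rough illustration of the fixed-frequency scanning mechanism described above (the dispersion relation and all numerical values below are generic textbook assumptions, not taken from the paper): in a fast-wave guiding structure, the main beam points at sin θ = β/k0 from broadside, so electrically tuning the LC permittivity changes the guided phase constant β and steers the beam without changing the frequency.

```python
# Generic leaky-wave scanning sketch: for a TE10-like fast wave in a
# dielectric-filled guide of width a, sin(theta) = beta/k0
#   = sqrt(eps_r - (c / (2 * a * f))**2).
# Guide width, frequency, and LC permittivity range are illustrative only.
import numpy as np

C = 3e8  # speed of light in vacuum, m/s

def scan_angle_deg(eps_r, f_hz, a_m):
    """Main-beam angle from broadside, in degrees."""
    sin_theta = np.sqrt(eps_r - (C / (2 * a_m * f_hz)) ** 2)
    return np.degrees(np.arcsin(sin_theta))

# Sweeping the LC from its unbiased to its fully biased permittivity
# steers the beam at a fixed 30 GHz operating frequency.
low = scan_angle_deg(2.5, 30e9, 3.2e-3)    # weakly biased LC
high = scan_angle_deg(3.3, 30e9, 3.2e-3)   # fully biased LC
```

With these illustrative numbers the beam moves by tens of degrees purely through the DC bias, which is the property the decoupled biasing scheme is designed to preserve in the 2D array.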

    Digital agriculture: research, development and innovation in production chains.

    Digital transformation in the field towards sustainable and smart agriculture. Digital agriculture: definitions and technologies. Agroenvironmental modeling and the digital transformation of agriculture. Geotechnologies in digital agriculture. Scientific computing in agriculture. Computer vision applied to agriculture. Technologies developed in precision agriculture. Information engineering: contributions to digital agriculture. DIPN: a dictionary of the internal proteins nanoenvironments and their potential for transformation into agricultural assets. Applications of bioinformatics in agriculture. Genomics applied to climate change: biotechnology for digital agriculture. Innovation ecosystem in agriculture: Embrapa's evolution and contributions. The law related to the digitization of agriculture. Innovating communication in the age of digital agriculture. Driving forces for Brazilian agriculture in the next decade: implications for digital agriculture. Challenges, trends and opportunities in digital agriculture in Brazil. Translated by Beverly Victoria Young and Karl Stephan Mokross.

    Applications of artificial intelligence to alchemical free energy calculations in contemporary drug design

    The work presented in this thesis resides at the interface of alchemical free energy (AFE) methods and machine learning (ML) in the context of computer-aided drug discovery (CADD). The majority of the work consists of explorations into regions of synergy between the individual parts. The overarching hypothesis is that although areas of high potential exist for standalone ML and AFE in CADD, additional value can be found where ML and AFE are combined such that the new methodology profits from the key strengths of each part. Physics-based AFE calculations have, over several decades, grown into precise and accurate sub-kcal·mol−1 methods (in terms of mean absolute error versus experimental measures) for predicting ligand-protein binding affinities, which is the main driver of their popularity in project support in drug design workflows. Data-driven ML methods have seen a similarly rapid development, spurred by the exponential growth in computational hardware capabilities, but generally still lack the accuracy versus experimental measures of binding affinities needed to support drug design work. The two differ fundamentally, however: the former relies mainly on physical rules in the form of statistical mechanics, while the latter profits from interpolating signals within large training domains of data. After a historical and theoretical introduction to drug discovery, AFE calculations and ML methods, the thesis highlights several studies that reflect the above hypothesis at multiple key points in the AFE workflow. Firstly, a methodology that combines AFE with ML has been developed to compute accurate absolute hydration free energies. The hybrid AFE/ML methodology was trained on a subset of the FreeSolv database and retrospectively shown to outperform most submissions from the SAMPL4 competition.
Compared to pure machine-learning approaches, AFE/ML yields more precise estimates of free energies of hydration, and requires a fraction of the training set size to outperform standalone AFE calculations. The ML-derived correction terms are further shown to be transferable to a range of related AFE simulation protocols. The approach may be used to inexpensively improve the accuracy of AFE calculations, and to flag molecules that will benefit the most from bespoke force field parameterisation efforts. Secondly, early investigations into data-driven AFE network generators have been performed. Because AFE calculations make use of alchemical transformations between ligands in a congeneric series, practitioners are required to estimate an optimal combination of transformations for each series. An AFE network is the collection of edges chosen such that all ligands (nodes) are included in the network, where each edge is an AFE calculation. As there is a vast number of possible configurations for such networks, this step in AFE setup suffers from several shortcomings, such as scalability and transferability between AFE software packages. Although AFE network generation has been automated in the past, the algorithm depends mostly on expert-driven estimation of AFE transformation reliabilities. This work presents a first iteration of a data-driven alternative to the state of the art using a graph siamese neural network architecture. A novel dataset, RBFE-Space, is presented as a representative and transferable training domain for AFE ML research. The workflow presented in this thesis matches state-of-the-art AFE network generation performance with several key benefits: the network generator is fully transferable because RBFE-Space is open-sourced and ready to be applied to other AFE software packages, and the deep learning model represents the first robust ML predictor of transformation reliabilities in AFE calculations.
Finally, one major shortcoming of AFE calculations is their decreased reliability for transformations larger than ~5 heavy atoms. The work reported in this thesis describes investigations into whether running charge, van der Waals and bonded parameter transformations individually (with variable λ allocation per step) offers an advantage over transforming all parameters in a single step, as is the current standard in most AFE workflows. Initial results qualitatively suggest that the bound leg benefits from a MultiStep protocol over a one-step (“SoftCore”) protocol, whereas the free leg shows no benefit. Further work performed by Cresset showed no observable benefit of the MultiStep approach over the SoftCore approach. Several key findings are reported that illustrate the benefits of dissecting an FEP approach and comparing the two approaches side by side.
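The hybrid AFE/ML correction idea summarised above can be sketched in a few lines: a regressor is trained on the residuals between physics-based AFE estimates and experimental values, and its prediction is added back as a correction. Everything below (the descriptors, the synthetic data, and the plain least-squares model) is a toy stand-in for illustration, not the thesis's actual pipeline.

```python
# Toy sketch of an AFE/ML residual correction (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins: molecular descriptors, "experimental" free energies, and
# AFE estimates carrying a descriptor-dependent systematic error.
X = rng.normal(size=(200, 8))
experimental = 2.0 * X[:, 0] + rng.normal(scale=0.2, size=200)
afe_estimate = experimental + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200)

# Learn the systematic part of the AFE error from the descriptors...
residual = experimental - afe_estimate
A = np.hstack([X, np.ones((200, 1))])          # design matrix + intercept
coef, *_ = np.linalg.lstsq(A, residual, rcond=None)

# ...and add the predicted residual back as a correction term.
corrected = afe_estimate + A @ coef

mae_raw = np.mean(np.abs(afe_estimate - experimental))
mae_corrected = np.mean(np.abs(corrected - experimental))
```

Because the correction only has to model the error of the physics-based estimate rather than the full binding free energy surface, it can work from far less training data than a standalone ML predictor, which is the key synergy the thesis argues for.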

    1-D broadside-radiating leaky-wave antenna based on a numerically synthesized impedance surface

    A newly developed deterministic numerical technique for the automated design of metasurface antennas is applied here for the first time to the design of a 1-D printed Leaky-Wave Antenna (LWA) for broadside radiation. The surface impedance synthesis process does not require any a priori knowledge of the impedance pattern, and starts from a mask constraint on the desired far field and practical bounds on the unit-cell impedance values. The designed reactance surface for broadside radiation exhibits a nonconventional patterning; this highlights the merit of using an automated design process for a design well known to be challenging for analytical methods. The antenna is physically implemented with an array of metal strips with varying gap widths, and simulation results show very good agreement with the predicted performance.

    Computer-based methods of knowledge generation in science - What can the computer tell us about the world?

    The computer has significantly changed scientific practice in almost all disciplines. Alongside traditional sources of new knowledge, such as observations, deductive arguments, or experiments, computer-based methods such as computer simulations and machine learning are now regularly cited as such sources. This shift in science raises philosophy-of-science questions concerning these new methods. One of the most immediate is whether these new methods are suited to serve as sources of new knowledge. This thesis pursues that question, with a particular focus on one of the central problems of computer-based methods: opacity. Computer-based methods are called opaque when the causal connection between input and result cannot be traced. The central questions of this thesis are whether computer simulations and machine learning algorithms are opaque, whether the opacity of the two methods is of the same nature, and whether opacity prevents gaining new knowledge with computer-based methods. These questions are examined in close contact with scientific practice; in particular, particle physics and the ATLAS experiment at CERN serve as important case studies. The thesis is based on five articles. In the first two, computer simulations are compared with two other methods, experiments and arguments, in order to situate them methodologically and to work out which challenges to knowledge acquisition distinguish computer simulations from the other methods. The first article compares computer simulations and experiments. Given the diversity of computer simulations, however, a blanket comparison with experiments is not meaningful.
Instead, various epistemic aspects are worked out on the basis of which the comparison should be carried out depending on the context of application. The second article discusses a position formulated by Claus Beisbart that understands computer simulations as arguments. This "argument view" describes very well how computer simulations work, and thereby makes it possible to answer questions about their opacity and inductive character. On its own, however, the argument view cannot sufficiently explain how new knowledge can be gained with computer simulations. The third article deals with the role of models in theoretical ecology. Models are a central component of computer simulations and machine learning algorithms. The questions about the relationship between phenomena and models, examined here using examples from ecology, are therefore of central importance for the epistemic questions of this thesis. The fourth article forms the link between computer simulation and machine learning. It defines different kinds of opacity and examines, using examples from particle physics, which kinds are present in computer simulations and machine learning algorithms. It is argued that opacity poses no problem in principle for gaining knowledge with computer simulations, but that model opacity could be a source of fundamental opacity for machine learning algorithms. The fifth article applies the same terminology to chess computers. The comparison between a traditional chess engine and one based on a neural network illustrates the consequences of the different kinds of opacity.
Overall, the thesis provides a methodological classification of computer simulations and shows that neither reference to experiments nor reference to arguments alone can explain how computer simulations lead to new knowledge. A clear definition of the kinds of opacity present in each case makes it possible to demarcate computer simulations from the closely related machine learning algorithms.

    50 Years of quantum chromodynamics – Introduction and Review


    Strong Interaction Physics at the Luminosity Frontier with 22 GeV Electrons at Jefferson Lab

    This document presents the initial scientific case for upgrading the Continuous Electron Beam Accelerator Facility (CEBAF) at Jefferson Lab (JLab) to 22 GeV. It is the result of a community effort, incorporating insights from a series of workshops conducted between March 2022 and April 2023. With a track record of over 25 years in delivering the world's most intense and precise multi-GeV electron beams, CEBAF's potential for a higher-energy upgrade presents a unique opportunity for an innovative nuclear physics program, which seamlessly integrates a rich historical background with a promising future. The proposed physics program encompasses a diverse range of investigations centered around the nonperturbative dynamics inherent in hadron structure and the exploration of strongly interacting systems. It builds upon the exceptional capabilities of CEBAF in high-luminosity operations, the availability of existing or planned Hall equipment, and recent advancements in accelerator technology. The proposed program covers various scientific topics, including Hadron Spectroscopy; Partonic Structure and Spin; Hadronization and Transverse Momentum; Spatial Structure; Mechanical Properties, Form Factors and Emergent Hadron Mass; the Hadron-Quark Transition; and Nuclear Dynamics at Extreme Conditions, as well as QCD Confinement and Fundamental Symmetries. Each topic highlights the key measurements achievable at a 22 GeV CEBAF accelerator. Furthermore, this document outlines the significant physics outcomes and unique aspects of these programs that distinguish them from other existing or planned facilities.
In summary, this document provides an exciting rationale for the energy upgrade of CEBAF to 22 GeV, outlining the transformative scientific potential that lies within reach and the remarkable opportunities it offers for advancing our understanding of hadron physics and related fundamental phenomena.

    Artificial Intelligence for Science in Quantum, Atomistic, and Continuum Systems

    Advances in artificial intelligence (AI) are fueling a new paradigm of discoveries in the natural sciences. Today, AI has started to advance natural sciences by improving, accelerating, and enabling our understanding of natural phenomena at a wide range of spatial and temporal scales, giving rise to a new area of research known as AI for science (AI4Science). Being an emerging research paradigm, AI4Science is unique in that it is an enormous and highly interdisciplinary area. Thus, a unified and technical treatment of this field is needed yet challenging. This work aims to provide a technically thorough account of a subarea of AI4Science, namely, AI for quantum, atomistic, and continuum systems. These areas aim at understanding the physical world from the subatomic (wavefunctions and electron density) and atomic (molecules, proteins, materials, and interactions) scales to the macroscopic (fluids, climate, and subsurface) scale, and form an important subarea of AI4Science. A unique advantage of focusing on these areas is that they largely share a common set of challenges, thereby allowing a unified and foundational treatment. A key common challenge is how to capture physics first principles, especially symmetries, in natural systems with deep learning methods. We provide an in-depth yet intuitive account of techniques for achieving equivariance to symmetry transformations. We also discuss other common technical challenges, including explainability, out-of-distribution generalization, knowledge transfer with foundation and large language models, and uncertainty quantification. To facilitate learning and education, we provide categorized lists of resources that we found to be useful. We strive to be thorough and unified, and hope this initial effort may trigger more community interest and efforts to further advance AI4Science.
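As a minimal, self-contained illustration (not taken from the survey itself) of the symmetry principle mentioned above: pairwise interatomic distances, a common input representation for atomistic models, are invariant under rotations of the coordinates, which is one reason distance-based features make it straightforward to build E(3)-invariant networks.

```python
# Rotation invariance of pairwise distances (illustrative sketch).
import numpy as np

def pairwise_distances(coords):
    """All-pairs Euclidean distances for an (n, 3) coordinate array."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.linalg.norm(diff, axis=-1)

rng = np.random.default_rng(1)
coords = rng.normal(size=(5, 3))        # toy "molecule" of 5 atoms

# A random orthogonal matrix (rotation or reflection) via QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
rotated = coords @ Q.T

# The distance matrix is unchanged, so any model consuming only these
# distances is automatically invariant to the transformation.
assert np.allclose(pairwise_distances(coords), pairwise_distances(rotated))
```

Equivariant architectures generalize this idea: instead of discarding directional information, they propagate it in a way that transforms predictably under the symmetry group.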