
    Proceedings of SIRM 2023 - The 15th European Conference on Rotordynamics

    It was our great honor and pleasure to host the SIRM Conference in Darmstadt for the third time, after 2003 and 2011. Rotordynamics covers a huge variety of applications and challenges, all of which are in the scope of this conference. The conference was opened with a keynote lecture by Rainer Nordmann, one of the three founders of SIRM “Schwingungen in rotierenden Maschinen”. In total, 53 papers passed our strict review process and were presented, which impressively shows that rotordynamics is as relevant as ever. These contributions cover a very wide spectrum of session topics: fluid bearings and seals; air foil bearings; magnetic bearings; rotor blade interaction; rotor fluid interactions; unbalance and balancing; vibrations in turbomachines; vibration control; instability; electrical machines; monitoring, identification and diagnosis; advanced numerical tools and nonlinearities; as well as general rotordynamics. The international character of the conference has been significantly enhanced by the Scientific Board since the 14th SIRM, resulting on the one hand in an expanded Scientific Committee, which meanwhile consists of 31 members from 13 different European countries, and on the other hand in the new name “European Conference on Rotordynamics”. This new international profile was also emphasized by the participants of the 15th SIRM, who came from 17 different countries on three continents. We experienced a vital discussion and dialogue between industry and academia at the conference, where roughly one third of the papers were presented by industry and two thirds by academia, an excellent basis for the bidirectional transfer that we call xchange at Technical University of Darmstadt. At this point we also want to give our special thanks to the eleven industry sponsors for their great support of the conference.
On behalf of the Darmstadt Local Committee, I welcome you to read the papers of the 15th SIRM, which give further insight into the topics and presentations.

    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This ïŹfth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different ïŹelds of applications and in mathematics, and is available in open-access. The collected contributions of this volume have either been published or presented after disseminating the fourth volume in 2015 in international conferences, seminars, workshops and journals, or they are new. The contributions of each part of this volume are chronologically ordered. First Part of this book presents some theoretical advances on DSmT, dealing mainly with modiïŹed Proportional ConïŹ‚ict Redistribution Rules (PCR) of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classiïŹers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignment in the fusion of sources of evidence with their Matlab codes. 
Because more applications of DSmT have emerged in the years since the appearance of the fourth book of DSmT in 2015, the second part of this volume is about selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents interesting contributions related to belief functions in general, published or presented over the years since 2015. These contributions are related to decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, negators of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions as well.
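    To give a flavor of the PCR5 rule mentioned above, the following is a minimal sketch, assuming two sources whose basic belief assignments are given as dicts from frozenset focal elements to masses; it illustrates the proportional conflict redistribution principle and is not taken from the book's Matlab codes:

```python
from itertools import product

def pcr5(m1, m2):
    """Combine two basic belief assignments with the PCR5 principle:
    conjunctive combination, with each partial conflict redistributed
    back to the two masses that produced it, proportionally."""
    out = {}
    for (x, a), (y, b) in product(m1.items(), m2.items()):
        inter = x & y
        if inter:
            # non-conflicting pair: conjunctive (AND) combination
            out[inter] = out.get(inter, 0.0) + a * b
        elif a + b > 0:
            # empty intersection: redistribute the conflict a*b
            # proportionally between x and y (the PCR5 rule)
            out[x] = out.get(x, 0.0) + a * a * b / (a + b)
            out[y] = out.get(y, 0.0) + a * b * b / (a + b)
    return out
```

    For two sources over a frame {A, B} with m1(A)=0.6, m1(B)=0.4, m2(A)=0.7, m2(B)=0.3, the combined masses still sum to one: unlike Dempster's rule, no mass is discarded by normalization.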

    Toward Fault-Tolerant Applications on Reconfigurable Systems-on-Chip

    The abstract is provided in the attachment.

    Three essays on credit supply

    This thesis consists of three independent essays on credit supply, each addressing a different aspect: the differing impact of credit supply shocks financed through different supply channels, how different credit constraints affect debt structure and productivity, and how these affect firms' individual and collective exposure over time. Chapter 1: Its conceptual appeal has made the Conditional Value at Risk (CoVaR) one of the most influential systemic risk indicators. Despite its popularity, an outstanding methodological challenge may hamper the CoVaR's accuracy in measuring the time-series dimension of systemic risk. The dynamics of the CoVaR are entirely due to the behaviour of the state variables; without their inclusion, the CoVaR would be constant over time. The key contribution of this chapter is to relax the assumption of time-invariant tail dependence between the losses of the financial system and those of each institution, by allowing the estimated parameters of the model to change over time, in addition to changing over quantiles and across financial institutions. We find that the dynamic component that we introduce does not affect the estimates of the risk of individual financial institutions, but it largely affects estimates of systemic risk, which exhibits more procyclicality than implied by the standard CoVaR. As expected, larger financial institutions have a greater effect on systemic risk, although they are also shown to be individually more robust. Adding balance sheet data introduces additional volatility into our model relative to the standard one. In terms of forecasting, the results depend on the horizon used and the variables included. Neither model clearly outperforms the other when we add the balance sheet data, or in the short term (less than 12 weeks). However, our model outperforms the standard one at medium (15 to 25 weeks) to long-term horizons (30 to 40 weeks).
Chapter 2: We seek to evaluate the impact that the different segments of lending to the private non-financial sector can have on subsequent GDP growth. We isolate the bank lending channel as one of the main components, and group the remaining ones into a second segment, which we classify as market-based finance (MBF). We also include the two segments of the borrowing sector, household debt and non-financial firm debt, to compare with the results obtained by the standard model. We discuss the main source of these effects, focusing on credit demand versus credit supply shocks, in addition to other alternatives. We find that a rise in the bank credit and/or household debt to GDP ratio lowers subsequent GDP growth. The predictive power is large in magnitude and robust across time and space. Bank credit booms and household debt booms are connected to lower interest rate spread environments, as well as to periods with better financial conditions. And although the overall impact on subsequent GDP growth is negative, we found contrasting evidence when using the Financial Conditions Index (FCI) as an instrument. This points to the potentially different effects that bank credit and household debt could have on future economic growth (good booms vs. bad booms), depending on the underlying cause of the boom. The results and the evidence that we found are more consistent with models where the fundamental source of the changes in household debt or bank credit lies in changes in credit supply (credit supply shocks), rather than credit demand or other possibilities. This would likely be connected to incorrect expectation formation by lenders and investors (what many authors call “credit market sentiment” in the literature), which is an important element in explaining shifts in credit supply.
Although credit demand shocks could play an important role in prolonging or amplifying the effects of the booms, it is unlikely that they are the source, as that would lead to results that conflict with the empirical evidence. Finally, we find some differences in statistical significance and magnitude across the different scenarios, where bank credit shows more robustness to different specifications than household debt. This would imply that the significance of bank credit goes well beyond household debt, and that the main component generating the boom-bust cycle in GDP is bank credit, independent of its destination, rather than household debt, independent of its financing. Chapter 3: We construct a dataset at the firm-year level by merging the syndicated loan data provided by Refinitiv LPC DealScan ("DealScan") with the firm-level data provided by the Center for Research in Security Prices (CRSP)/Compustat Merged Database ("CCM"). We conduct an analysis of firms subject to different covenants, and find that firms with earnings-based constraints have lower levels of TFP (Total Factor Productivity) and short-term debt when compared to firms with asset-based constraints. The data also show that this is connected to an additional negative impact of short-term debt on productivity for firms with earnings-based constraints, which is not observed for firms with asset-based constraints. Both these characteristics are robust to the use of three different TFP estimation methods, different subsamples, and additional controls, including the age and size of the firm. Thus, we consider a quantitative dynamic stochastic partial equilibrium model with three main types of firms, distinguished by their constraints, which explores the impact of short-term and long-term borrowing on firms' balance sheets and on the different variables.
We construct replications of this theoretical model and assess how well it fits our actual data. Our findings show that constraints exert an impact on short-term borrowing, but not on the remaining variables. More specifically, firms that face an earnings-based constraint show lower levels of short-term borrowing compared with firms that are either unconstrained or asset-based constrained. The adjustment is made through lower dividend distribution, as can be seen in the lower values of the value function. The findings also point to the impact being larger for firms with lower productivity shocks, which is in accordance with our empirical findings. Even though our data show differences in some of these variables (for example, in long-term debt), these were not robust to some of the controls, including the size of the firm.

    Mathematical Methods and Operation Research in Logistics, Project Planning, and Scheduling

    In the last decade, Industry 4.0 brought flexible supply chains and flexible design projects to the forefront. Nevertheless, the recent pandemic, the accompanying economic problems, and the resulting supply problems have further increased the role of logistics and supply chains. Therefore, planning and scheduling procedures that can respond flexibly to changed circumstances have become more valuable, both in logistics and in projects. Project and logistics process planning and scheduling already involve several competing criteria that need to be reconciled. At the same time, the COVID-19 pandemic has shown that even more emphasis needs to be placed on taking potential risks into account. Flexibility and resilience are emphasized in all decision-making processes, including the scheduling of logistics processes, activities, and projects.

    A FPGA-based architecture for real-time cluster finding in the LHCb silicon pixel detector

    The data acquisition system of the LHCb experiment has been substantially upgraded for LHC Run 3, with the unprecedented capability of reading out and fully reconstructing all proton–proton collisions in real time, occurring at an average rate of 30 MHz, for a total data flow of approximately 32 Tb/s. The high demand for computing power required by this task has motivated a transition to a hybrid heterogeneous computing architecture, where a farm of graphics processing units (GPUs) is used in addition to general-purpose processors (CPUs) to speed up the execution of reconstruction algorithms. In a continuing effort to improve the real-time processing capabilities of this new DAQ system, also with a view to further luminosity increases in the future, low-level, highly parallelizable tasks are increasingly being addressed at the earliest stages of the data acquisition chain, using special-purpose computing accelerators. A promising solution is offered by custom-programmable FPGA devices, which are well suited to performing high-volume computations with high throughput and degree of parallelism, limited power consumption, and low latency. In this context, a two-dimensional FPGA-friendly cluster-finding algorithm has been developed to reconstruct hit positions in the new vertex pixel detector (VELO) of the LHCb Upgrade experiment. The associated firmware architecture, implemented in VHDL, has been integrated within the VELO readout, without the need for extra cards, as a further enhancement of the DAQ system. This pre-processing allows the first level of the software trigger to accept an 11% higher rate of events, as the ready-made hit coordinates accelerate track reconstruction, while leading to a drop in electrical power consumption, as the FPGA implementation requires O(50x) less power than the GPU one.
The tracking performance of this novel system, being indistinguishable from that of a full-fledged software implementation, allows the raw pixel data to be dropped immediately at the readout level, yielding the additional benefit of a 14% reduction in data flow. The clustering architecture was commissioned during the start of LHCb Run 3 and currently runs in real time during physics data taking, reconstructing VELO hit coordinates on the fly at the LHC collision rate.
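    To make the clustering task concrete, here is a plain software sketch of two-dimensional cluster finding: it groups 8-connected active pixels by flood fill and returns the centroid of each cluster. This is only an illustration of the task, not the VHDL firmware, which performs the same job in a fixed-latency parallel pipeline:

```python
from collections import deque

def find_clusters(active_pixels):
    """Group 8-connected active pixels into clusters via flood fill
    and return the (x, y) centroid of each cluster."""
    remaining = set(active_pixels)
    centroids = []
    while remaining:
        seed = remaining.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            x, y = queue.popleft()
            # visit the 8 neighbours (and self, already removed)
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (x + dx, y + dy)
                    if n in remaining:
                        remaining.remove(n)
                        queue.append(n)
                        cluster.append(n)
        cx = sum(p[0] for p in cluster) / len(cluster)
        cy = sum(p[1] for p in cluster) / len(cluster)
        centroids.append((cx, cy))
    return centroids
```

    Each returned centroid plays the role of a ready-made hit coordinate of the kind the firmware hands to track reconstruction.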

    Lattice Boltzmann Methods for Partial Differential Equations

    Lattice Boltzmann methods provide a robust and highly scalable numerical technique in modern computational fluid dynamics. Besides the discretization procedure, the relaxation principles form the basis of any lattice Boltzmann scheme and render the method a bottom-up approach, which obstructs its development for approximating broad classes of partial differential equations. This work introduces a novel coherent mathematical path to jointly approach the topics of constructability, stability, and limit consistency for lattice Boltzmann methods. A new constructive ansatz for lattice Boltzmann equations is introduced, which highlights the concept of relaxation in a top-down procedure starting at the targeted partial differential equation. Modular convergence proofs are used at each step to identify the key ingredients of relaxation frequencies, equilibria, and moment bases in the ansatz, which determine linear and nonlinear stability as well as consistency orders of relaxation and space-time discretization. For the latter, conventional techniques are employed and extended to determine the impact of the kinetic limit at the very foundation of lattice Boltzmann methods. To computationally analyze nonlinear stability, extensive numerical tests are enabled by combining the intrinsic parallelizability of lattice Boltzmann methods with the platform-agnostic and scalable open-source framework OpenLB. Through upscaling the number and quality of computations, large variations in the parameter spaces of classical benchmark problems are considered for the exploratory indication of methodological insights. Finally, the introduced mathematical and computational techniques are applied for the proposal and analysis of new lattice Boltzmann methods. 
Based on stabilized relaxation, limit-consistent discretizations, and consistent temporal filters, novel numerical schemes are developed for approximating initial value problems and initial boundary value problems, as well as coupled systems thereof. In particular, lattice Boltzmann methods are proposed and analyzed for temporal large eddy simulation, for simulating homogenized nonstationary fluid flow through porous media, for binary fluid flow simulations with higher-order free energy models, and for the combination with Monte Carlo sampling to approximate statistical solutions of the incompressible Euler equations in three dimensions.
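    As a point of reference for the relaxation principle discussed above, a minimal sketch of the standard single-relaxation-time (BGK) lattice Boltzmann update, written in textbook form rather than taken from this work, is

        f_i(x + c_i \Delta t, t + \Delta t) = f_i(x, t) - (\Delta t / \tau) [ f_i(x, t) - f_i^{eq}(x, t) ],

    where the populations f_i stream along the discrete velocities c_i and relax toward a local equilibrium f_i^{eq} at a rate set by \tau; the relaxation frequencies and moment bases mentioned in the abstract generalize this scalar relaxation to matrix-valued collision operators.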
