A deep learning framework based on Koopman operator for data-driven modeling of vehicle dynamics
Autonomous vehicles and driving technologies have received notable attention
in the past decades. In autonomous driving systems, information about the
vehicle dynamics is required in most cases for designing motion planning and
control algorithms. However, identifying a global model of the vehicle
dynamics is nontrivial due to strong nonlinearity and uncertainty. Many
efforts have resorted to machine learning techniques for building data-driven
models, but these may suffer from poor interpretability and result in complex
nonlinear representations. In this paper, we propose a deep learning framework
relying on an interpretable Koopman operator to build a data-driven predictor
of the vehicle dynamics. The main idea is to use the Koopman operator to
represent the nonlinear dynamics in a linear lifted feature space. The
approach yields a global model that integrates the dynamics in both the
longitudinal and lateral directions. As the core contribution, we propose a
deep learning-based extended dynamic mode decomposition (Deep EDMD) algorithm
to learn a finite-dimensional approximation of the Koopman operator. Different
from other machine learning-based approaches, deep neural networks play the
role of learning feature representations for EDMD in the framework of the
Koopman operator. Simulation results in a high-fidelity CarSim environment are
reported, which show the capability of the Deep EDMD approach in multi-step
prediction of the vehicle dynamics over a wide operating range. The proposed
approach also outperforms the EDMD method, the multi-layer perceptron (MLP)
method, and the Extreme Learning Machines-based EDMD (ELM-EDMD) method in
terms of modeling performance. Finally, we design a linear MPC with Deep EDMD
(DE-MPC) for realizing reference tracking and test the controller in the
CarSim environment.
Comment: 12 pages, 10 figures, 1 table, and 2 algorithms
ClimateNeRF: Physically-based Neural Rendering for Extreme Climate Synthesis
Physical simulations produce excellent predictions of weather effects. Neural
radiance fields produce SOTA scene models. We describe a novel NeRF-editing
procedure that can fuse physical simulations with NeRF models of scenes,
producing realistic movies of physical phenomena in those scenes. Our
application -- ClimateNeRF -- allows people to visualize what climate change
outcomes will do to them. ClimateNeRF allows us to render realistic weather
effects, including smog, snow, and flood. Results can be controlled with
physically meaningful variables like water level. Qualitative and quantitative
studies show that our simulated results are significantly more realistic than
those from state-of-the-art 2D image editing and 3D NeRF stylization.
Comment: project page: https://climatenerf.github.io
A Neural Network Ensemble approach to System Identification
We present a new algorithm for learning unknown governing equations from trajectory data, using an ensemble of neural networks. Given samples of solutions x(t) to an unknown dynamical system ẋ(t) = f(t, x(t)), we approximate the function f using an ensemble of neural networks. We express the equation in integral form and use the Euler method to predict the solution at every successive time step, using at each iteration a different neural network as a prior for f. This procedure yields M-1 time-independent networks, where M is the number of time steps at which x(t) is observed. Finally, we obtain a single function f(t, x(t)) by neural network interpolation. Unlike our earlier work, where we numerically computed the derivatives of the data and used them as targets in a Lipschitz-regularized neural network to approximate f, our new method avoids numerical differentiation, which is unstable in the presence of noise. We test the new algorithm on multiple examples, both with and without noise in the data. We empirically show that generalization and recovery of the governing equation improve by adding a Lipschitz regularization term to our loss function, and that this method improves on our previous one especially in the presence of noise, when numerical differentiation provides low-quality target data. Finally, we compare our results with the method proposed by Raissi et al., arXiv:1801.01236 (2018), and with SINDy.
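The per-step Euler predictor described above can be sketched numerically. As an assumption for self-containedness, each per-step neural network is replaced here by a linear least-squares model, and the training trajectories are generated with the same forward-Euler discretization; the paper trains one Lipschitz-regularized network per time step instead and works with general nonlinear f.

```python
import numpy as np

rng = np.random.default_rng(0)
h, M = 0.1, 6                       # step size, number of observation times
A = np.array([[0.0, 1.0],           # true dynamics f(x) = A x (oscillator)
              [-1.0, 0.0]])

def simulate(x0):
    """Trajectory of x_{k+1} = x_k + h f(x_k) at M time steps."""
    xs = [np.asarray(x0, float)]
    for _ in range(M - 1):
        xs.append(xs[-1] + h * A @ xs[-1])      # forward Euler
    return np.array(xs)

# many observed trajectories, shape (n_traj, M, dim)
trajs = np.stack([simulate(rng.standard_normal(2)) for _ in range(50)])

# fit M-1 per-step models W_k with x_{k+1} ≈ x_k + h W_k x_k, i.e. each
# model regresses the finite-difference slope (the role of the k-th network)
models = []
for k in range(M - 1):
    X, Y = trajs[:, k, :], trajs[:, k + 1, :]
    slopes = (Y - X) / h
    W, *_ = np.linalg.lstsq(X, slopes, rcond=None)
    models.append(W.T)

def rollout(x0):
    """Euler prediction using a different fitted model at each iteration."""
    x = np.asarray(x0, float)
    out = [x]
    for W in models:
        x = x + h * (W @ x)         # k-th model as prior for f
        out.append(x)
    return np.array(out)
```

Averaging or interpolating the `models` list would correspond to the final interpolation step that yields a single time-independent f.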
Traffic Scene Perception for Automated Driving with Top-View Grid Maps
An automated vehicle must make safe, sensible, and fast decisions based on its environment.
This requires an accurate and computationally efficient model of the traffic environment.
This environment model should fuse and filter measurements from different sensors and provide them to subsequent subsystems as compact yet expressive information.
This work addresses modeling the traffic scene on the basis of top-view grid maps.
Compared to other environment models, these enable an early fusion of range measurements from different sources at low computational cost, as well as an explicit modeling of free space.
After presenting a method for ground-surface estimation, which forms the basis of the top-view modeling, methods for occupancy and elevation mapping in grid maps are treated, based on multiple, noisy, partially contradictory, or missing range measurements.
On the resulting sensor-independent representation, models for detecting traffic participants and for estimating scene flow, odometry, and tracking features are then investigated.
Experiments on publicly available datasets and a real vehicle show that top-view grid maps can be estimated from on-board LiDAR sensors and that safety-critical environment information such as observability and drivability can be reliably derived from them.
Finally, traffic participants are determined as oriented bounding boxes with semantic classes, velocities, and tracking features from a joint model for object detection and flow estimation based on the top-view grid maps.
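The early fusion of noisy, partially contradictory range measurements into an occupancy grid is commonly done with log-odds updates along each measurement ray. The sketch below is a generic inverse-sensor-model illustration, not the thesis' actual mapping pipeline; the grid size and log-odds increments are assumed values.

```python
import numpy as np

L_OCC, L_FREE = 0.85, -0.4           # log-odds increments (assumed values)

def bresenham_cells(x0, y0, x1, y1):
    """Integer grid cells on the ray from (x0, y0) to (x1, y1)."""
    cells, dx, dy = [], abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err, x, y = dx - dy, x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

def update(grid, sensor_cell, hit_cell):
    """One range measurement: cells along the ray are observed free,
    the end-point cell is observed occupied; conflicting measurements
    from different sensors simply accumulate in the log-odds."""
    ray = bresenham_cells(*sensor_cell, *hit_cell)
    for (cx, cy) in ray[:-1]:
        grid[cx, cy] += L_FREE
    grid[hit_cell] += L_OCC

grid = np.zeros((20, 20))            # log-odds, 0 = unknown (p = 0.5)
update(grid, (0, 0), (5, 5))
prob = 1.0 / (1.0 + np.exp(-grid))   # back to occupancy probability
```

Keeping the map in log-odds makes the fusion of many sensors a cheap additive update per cell, which matches the low-computational-cost motivation stated above.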