Derivation of the time dependent Gross–Pitaevskii equation in two dimensions
We present microscopic derivations of the defocusing two-dimensional cubic nonlinear Schrödinger equation and the Gross–Pitaevskii equation, starting from an interacting N-particle system of bosons. We consider the interaction potential to be given either by W_β(x) = N^{−1+2β} W(N^β x), for any β > 0, or by V_N(x) = e^{2N} V(e^N x), for some spherically symmetric, nonnegative and compactly supported W, V ∈ L^∞(ℝ², ℝ). In both cases we prove convergence, in trace norm, of the reduced density of the exact time evolution to the projector onto the solution of the corresponding nonlinear Schrödinger equation. For the latter potential V_N we show that it is crucial to take the microscopic structure of the condensate into account in order to obtain the correct dynamics.
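For orientation, the limiting dynamics referred to in the abstract is, schematically, the defocusing two-dimensional cubic equation below; the coupling constant b depends on the scaling regime (e.g., in the NLS regime with potential W_β it is determined by ∫ W, while in the Gross–Pitaevskii regime it involves the scattering length of V). This is a sketch of the standard form, not a statement of the paper's precise theorem:

```latex
% Defocusing 2D cubic nonlinear Schrödinger / Gross–Pitaevskii equation
% (schematic; the coupling constant b > 0 depends on the scaling regime)
i\,\partial_t \varphi_t = -\Delta \varphi_t + b\,|\varphi_t|^2 \varphi_t,
\qquad \varphi_t : \mathbb{R}^2 \to \mathbb{C},
\qquad \varphi_0 \in H^2(\mathbb{R}^2).
```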
H2O Open Ecosystem for State-of-the-art Large Language Models
Large Language Models (LLMs) represent a revolution in AI. However, they also
pose many significant risks, such as the presence of biased, private,
copyrighted or harmful text. For this reason we need open, transparent and safe
solutions. We introduce a complete open-source ecosystem for developing and
testing LLMs. The goal of this project is to boost open alternatives to
closed-source approaches. We release h2oGPT, a family of fine-tuned LLMs of
diverse sizes. We also introduce H2O LLM Studio, a framework and no-code GUI
designed for efficient fine-tuning, evaluation, and deployment of LLMs using
the most recent state-of-the-art techniques. Our code and models are fully
open-source. We believe this work helps to boost AI development and make it
more accessible, efficient and trustworthy. The demo is available at:
https://gpt.h2o.ai/
Comment: EMNLP 2023 Demo - ACL Empirical Methods in Natural Language Processing