Resource-efficient high-dimensional entanglement detection via symmetric projections
We introduce two families of criteria for detecting and quantifying the
entanglement of a bipartite quantum state of arbitrary local dimension. The
first is based on measurements in mutually unbiased bases and the second is
based on equiangular measurements. Both criteria give a qualitative result in
terms of the state's entanglement dimension and a quantitative result in terms
of its fidelity with the maximally entangled state. The criteria are
universally applicable since no assumptions on the state are required.
Moreover, the experimenter can control the trade-off between
resource-efficiency and noise-tolerance by selecting the number of measurements
performed. For paradigmatic noise models, we show that only a small number of
measurements are necessary to achieve nearly-optimal detection in any
dimension. The number of global product projections scales only linearly in the
local dimension, thus paving the way for detection and quantification of very
high-dimensional entanglement.
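As a concrete illustration of the fidelity figure of merit under a paradigmatic white-noise model, the following minimal sketch computes the fidelity of an isotropic state with the maximally entangled state and the entanglement dimension that the fidelity value alone can certify (a state of Schmidt number at most k has fidelity at most k/d). This is a generic illustration, not the paper's measurement-based criteria; the dimension d and visibility v are arbitrary example values.

```python
import numpy as np

def max_entangled(d: int) -> np.ndarray:
    """|Phi+> = (1/sqrt(d)) * sum_i |ii> as a vector of length d^2."""
    phi = np.zeros(d * d)
    phi[::d + 1] = 1.0 / np.sqrt(d)
    return phi

def isotropic_state(d: int, v: float) -> np.ndarray:
    """White-noise model: rho = v |Phi+><Phi+| + (1 - v) * I / d^2 (v is the visibility)."""
    phi = max_entangled(d)
    return v * np.outer(phi, phi) + (1.0 - v) * np.eye(d * d) / d**2

d, v = 8, 0.6                                            # example values, not from the paper
rho = isotropic_state(d, v)
F = float(max_entangled(d) @ rho @ max_entangled(d))     # fidelity <Phi+| rho |Phi+>

# A state of Schmidt number at most k satisfies F <= k/d, so F > k/d certifies
# entanglement dimension at least k + 1.
certified = int(np.floor(F * d - 1e-12)) + 1
print(f"F = {F:.4f}, certified entanglement dimension >= {certified}")
```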
Quantum Mechanics Lecture Notes. Selected Chapters
These are extended lecture notes of the quantum mechanics course which I am
teaching in the Weizmann Institute of Science graduate physics program. They
cover the topics listed below. The first four chapters are posted here. Their
content is detailed on the next page. The other chapters are planned to be
added in the coming months.
1. Motion in External Electromagnetic Field. Gauge Fields in Quantum
Mechanics.
2. Quantum Mechanics of Electromagnetic Field
3. Photon-Matter Interactions
4. Quantization of the Schrödinger Field (The Second Quantization)
5. Open Systems. Density Matrix
6. Adiabatic Theory. The Berry Phase. The Born-Oppenheimer Approximation
7. Mean Field Approaches for Many Body Systems -- Fermions and Bosons
Scientific Yearbook of the Escola Superior de Tecnologia da Saúde de Lisboa - 2021
It is with great pleasure that we present the most recent edition (the 11th) of the Scientific Yearbook of the Escola Superior de Tecnologia da Saúde de Lisboa. As a higher education institution, we are committed to promoting and encouraging scientific research in all areas of knowledge covered by our mission. This publication aims to disseminate all the scientific output produced by the teaching staff, researchers, students and non-teaching staff of ESTeSL during 2021. This Yearbook thus reflects the hard and dedicated work of our community, which has devoted itself to producing high-quality scientific content shared with society in the form of books, book chapters, articles published in national and international journals, abstracts of oral communications and posters, as well as the results of first- and second-cycle degree work. Accordingly, the content of this publication covers a wide range of topics, from more fundamental themes to studies of practical application in specific health contexts, reflecting the plurality and diversity of areas that define, and make unique, ESTeSL. We believe that scientific research is a fundamental pillar of the development of society, which is why we encourage our students to engage in research activities and evidence-based practice from the beginning of their studies at ESTeSL. This publication is an example of the success of those efforts, being the largest ever, and we are very proud to share the results and findings of our researchers with the scientific community and the general public. We hope that this Yearbook inspires and motivates other students, health professionals, teachers and other collaborators to continue exploring new ideas and contributing to the advancement of science and technology within the body of knowledge of the areas that make up ESTeSL. We thank everyone involved in the production of this yearbook and wish you an inspiring and enjoyable read.
Convergence Rate of Nonconvex Douglas-Rachford splitting via merit functions, with applications to weakly convex constrained optimization
We analyze Douglas-Rachford splitting techniques applied to solving weakly
convex optimization problems. Under mild regularity assumptions, and by means
of a suitable merit function, we show convergence to critical points and
local linear rates of convergence. The merit function, comparable to the Moreau
envelope in Variational Analysis, generates a descent sequence, a feature that
allows us to extend to the non-convex setting arguments employed in convex
optimization. A by-product of our approach is an ADMM-like method for
constrained problems with weakly convex objective functions. When specialized
to multistage stochastic programming, the proposal yields a nonconvex version
of the Progressive Hedging algorithm that converges at a linear rate. The
numerical assessment on a battery of phase retrieval problems shows promising
performance of our method when compared to existing algorithms in
the literature.
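For readers unfamiliar with the splitting scheme itself, here is a minimal sketch of the classical Douglas-Rachford iteration written in terms of two proximal maps. It is run on a convex toy problem (an l1-regularized least-squares instance) purely to show the update structure; it does not reproduce the paper's merit-function analysis, weakly convex setting, or ADMM-like variant, and all names and parameter values are illustrative.

```python
import numpy as np

def prox_l1(z, t):
    """Proximal map of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def douglas_rachford(A, b, lam=0.1, gamma=1.0, iters=500):
    """Classical Douglas-Rachford iteration for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    n = A.shape[1]
    # Matrix of the linear system defining the prox of the least-squares term.
    M = np.eye(n) + gamma * A.T @ A
    z = np.zeros(n)
    for _ in range(iters):
        x = np.linalg.solve(M, z + gamma * A.T @ b)   # prox of the smooth term at z
        y = prox_l1(2 * x - z, gamma * lam)           # prox of the l1 term at the reflection
        z = z + y - x                                 # update of the governing sequence
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(40)
print(np.round(douglas_rachford(A, b)[:8], 3))
```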
Model Diagnostics meets Forecast Evaluation: Goodness-of-Fit, Calibration, and Related Topics
Principled forecast evaluation and model diagnostics are vital in fitting probabilistic models and forecasting outcomes of interest. A common principle is that fitted or predicted distributions ought to be calibrated, ideally in the sense that the outcome is indistinguishable from a random draw from the posited distribution. Much of this thesis is centered on calibration properties of various types of forecasts.
In the first part of the thesis, a simple algorithm for exact multinomial goodness-of-fit tests is proposed. The algorithm computes exact p-values based on various test statistics, such as the log-likelihood ratio and Pearson's chi-square. A thorough analysis shows improvement over extant methods. However, the runtime of the algorithm grows exponentially in the number of categories and hence its use is limited.
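To make the exponential cost concrete, the following is a hedged sketch of an exact multinomial test by brute-force enumeration. It is not the thesis's algorithm, only a baseline that computes the same kind of exact p-value (here for the Pearson statistic) by summing the multinomial probabilities of all outcomes at least as extreme as the observed one.

```python
from itertools import product
from math import lgamma, log, exp
import numpy as np

def log_multinomial_pmf(counts, probs):
    """log of n! / (c_1! ... c_k!) * prod_i p_i^{c_i}."""
    n = sum(counts)
    out = lgamma(n + 1)
    for c, p in zip(counts, probs):
        out += -lgamma(c + 1) + c * log(p)
    return out

def pearson_chi2(counts, probs, n):
    expected = np.asarray(probs) * n
    return float(np.sum((np.asarray(counts) - expected) ** 2 / expected))

def exact_multinomial_pvalue(observed, probs):
    """Exact p-value of the Pearson chi-square test by full enumeration.

    Enumerates every outcome (c_1, ..., c_k) with sum n, so the cost explodes
    with the number of categories k: fine for small k, infeasible otherwise.
    """
    n, k = sum(observed), len(observed)
    t_obs = pearson_chi2(observed, probs, n)
    pval = 0.0
    for counts in product(range(n + 1), repeat=k - 1):
        last = n - sum(counts)
        if last < 0:
            continue
        full = counts + (last,)
        if pearson_chi2(full, probs, n) >= t_obs - 1e-12:
            pval += exp(log_multinomial_pmf(full, probs))
    return pval

print(exact_multinomial_pvalue([8, 1, 1], [1/3, 1/3, 1/3]))
```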
In the second part, a framework rooted in probability theory is developed, which gives rise to hierarchies of calibration and applies to both predictive distributions and stand-alone point forecasts. Based on a general notion of conditional T-calibration, the thesis introduces population versions of T-reliability diagrams and revisits a score decomposition into measures of miscalibration, discrimination, and uncertainty. Stable and efficient estimators of T-reliability diagrams and score components arise via nonparametric isotonic regression and the pool-adjacent-violators algorithm. For in-sample model diagnostics, a universal coefficient of determination is introduced that nests and reinterprets the classical R² of least squares regression.
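The sketch below illustrates, under the Brier score for binary outcomes, how a pool-adjacent-violators (isotonic) recalibration yields a decomposition of the mean score into miscalibration, discrimination, and uncertainty components. It uses scikit-learn's IsotonicRegression as the PAV solver and simulated data, and is only an illustration of the idea, not the thesis's estimators.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def brier(p, y):
    return float(np.mean((p - y) ** 2))

def score_decomposition(forecasts, outcomes):
    """Brier score split as S = MCB - DSC + UNC via isotonic (PAV) recalibration."""
    x = np.asarray(forecasts, dtype=float)
    y = np.asarray(outcomes, dtype=float)
    # PAV-recalibrated probabilities: isotonic regression of the outcomes on the forecasts.
    recal = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip").fit_transform(x, y)
    clim = np.full_like(y, y.mean())          # constant climatological forecast
    score = brier(x, y)
    mcb = score - brier(recal, y)             # miscalibration
    dsc = brier(clim, y) - brier(recal, y)    # discrimination
    unc = brier(clim, y)                      # uncertainty
    return score, mcb, dsc, unc

rng = np.random.default_rng(1)
p_true = rng.uniform(size=2000)
y = rng.binomial(1, p_true)
overconfident = np.clip(p_true * 1.4 - 0.2, 0.0, 1.0)   # a deliberately miscalibrated forecaster
print(score_decomposition(overconfident, y))
```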
In the third part, probabilistic top lists are proposed as a novel type of prediction in classification, which bridges the gap between single-class predictions and predictive distributions. The probabilistic top list functional is elicited by strictly consistent evaluation metrics, based on symmetric proper scoring rules, which admit comparison of various types of predictions.
A study of uncertainty quantification in overparametrized high-dimensional models
Uncertainty quantification is a central challenge in reliable and trustworthy
machine learning. Naive measures such as last-layer scores are well-known to
yield overconfident estimates in the context of overparametrized neural
networks. Several methods, ranging from temperature scaling to different
Bayesian treatments of neural networks, have been proposed to mitigate
overconfidence, most often supported by the numerical observation that they
yield better calibrated uncertainty measures. In this work, we provide a sharp
comparison between popular uncertainty measures for binary classification in a
mathematically tractable model for overparametrized neural networks: the random
features model. We discuss a trade-off between classification accuracy and
calibration, unveiling a double-descent-like behavior in the calibration curve
of optimally regularized estimators as a function of overparametrization. This
is in contrast with the empirical Bayes method, which we show to be well
calibrated in our setting despite the higher generalization error and
overparametrization.
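As a hands-on illustration of the model class, the sketch below trains only the readout of a random features model (a fixed random projection followed by a nonlinearity and a logistic readout) on synthetic data, then reports test accuracy together with a simple binned calibration gap (an ECE-style measure). All parameters and the data-generating process are illustrative assumptions, not the setting analyzed in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic binary data from a noisy linear teacher (an assumed toy setup).
d, n_train, n_test = 50, 300, 5000
w_star = rng.standard_normal(d) / np.sqrt(d)
X = rng.standard_normal((n_train + n_test, d))
y = (X @ w_star + 0.5 * rng.standard_normal(n_train + n_test) > 0).astype(int)

# Random features: fixed random first layer + nonlinearity; only the logistic
# readout is trained. p > n_train makes the readout overparametrized.
p = 1000
W = rng.standard_normal((d, p)) / np.sqrt(d)
def features(X): return np.tanh(X @ W)

clf = LogisticRegression(C=1e2, max_iter=5000).fit(features(X[:n_train]), y[:n_train])
probs = clf.predict_proba(features(X[n_train:]))[:, 1]
y_test = y[n_train:]

def calibration_gap(probs, y, bins=10):
    """Binned average gap between predicted confidence and empirical frequency."""
    edges = np.linspace(0, 1, bins + 1)
    gap = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (probs >= lo) & (probs < hi)
        if mask.any():
            gap += mask.mean() * abs(probs[mask].mean() - y[mask].mean())
    return gap

print("test accuracy:", ((probs > 0.5) == y_test).mean())
print("calibration gap:", calibration_gap(probs, y_test))
```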
Learning disentangled speech representations
A variety of informational factors are contained within the speech signal and a single short recording of speech reveals much more than the spoken words. The best method to extract and represent informational factors from the speech signal ultimately depends on which informational factors are desired and how they will be used. In addition, sometimes methods will capture more than one informational factor at the same time such as speaker identity, spoken content, and speaker prosody.
The goal of this dissertation is to explore different ways to deconstruct the speech signal into abstract representations that can be learned and later reused in various speech technology tasks. This task of deconstructing, also known as disentanglement, is a form of distributed representation learning. As a general approach to disentanglement, there are some guiding principles that spell out what a learned representation should contain as well as how it should function. In particular, learned representations should contain all of the requisite information in a more compact manner, be interpretable, remove nuisance factors of irrelevant information, be useful in downstream tasks, and be independent of the task at hand. The learned representations should also be able to answer counter-factual questions.
In some cases, learned speech representations can be re-assembled in different ways according to the requirements of downstream applications. For example, in a voice conversion task, the speech content is retained while the speaker identity is changed. And in a content-privacy task, some targeted content may be concealed without affecting how surrounding words sound. While there is no single-best method to disentangle all types of factors, some end-to-end approaches demonstrate a promising degree of generalization to diverse speech tasks.
This thesis explores a variety of use-cases for disentangled representations including phone recognition, speaker diarization, linguistic code-switching, voice conversion, and content-based privacy masking. Speech representations can also be utilised for automatically assessing the quality and authenticity of speech, such as automatic MOS ratings or detecting deepfakes. The meaning of the term "disentanglement" is not well defined in previous work, and it has acquired several meanings depending on the domain (e.g. image vs. speech). Sometimes the term "disentanglement" is used interchangeably with the term "factorization". This thesis proposes that disentanglement of speech is distinct, and offers a viewpoint of disentanglement that can be considered both theoretically and practically.
Limit theorems for non-Markovian and fractional processes
This thesis examines various non-Markovian and fractional processes---rough volatility models, stochastic Volterra equations, Wiener chaos expansions---through the prism of asymptotic analysis.
Stochastic Volterra systems serve as a conducive framework encompassing most rough volatility models used in mathematical finance. In Chapter 2, we provide a unified treatment of pathwise large and moderate deviations principles for a general class of multidimensional stochastic Volterra equations with singular kernels, not necessarily of convolution form. Our methodology is based on the weak convergence approach by Budhiraja, Dupuis and Ellis.
This powerful approach also enables us to investigate the pathwise large deviations of families of white noise functionals characterised by their Wiener chaos expansion.
In Chapter 3, we provide sufficient conditions for the large deviations principle to hold in path space, thereby refreshing a problem left open by Pérez-Abreu (1993). Hinging on analysis on Wiener space, the proof involves describing, controlling and identifying the limit of perturbed multiple stochastic integrals.
In Chapter 4, we come back to mathematical finance via the route of Malliavin calculus. We present explicit small-time formulae for the at-the-money implied volatility, skew and curvature in a large class of models, including rough volatility models and their multi-factor versions. Our general setup encompasses both European options on a stock and VIX options. In particular, we develop a detailed analysis of the two-factor rough Bergomi model.
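To give a feel for the processes involved, the following is a crude simulation sketch of a rough-Bergomi-type variance path, driven by a Riemann-Liouville fractional Brownian motion (a stochastic Volterra integral with a power-law kernel) discretized by a simple left-point sum. The parameter values are illustrative, and this naive scheme is neither the hybrid scheme nor the asymptotic machinery used in the thesis.

```python
import numpy as np

def rough_bergomi_variance(xi0=0.04, eta=1.5, H=0.1, T=1.0, n=500, seed=0):
    """Simulate one path of a rough-Bergomi-type variance process.

    V_t = xi0 * exp(eta * sqrt(2H) * X_t - 0.5 * eta^2 * t^{2H}), where
    X_t = int_0^t (t - s)^{H - 1/2} dW_s is a Riemann-Liouville fractional
    Brownian motion, approximated here by a crude left-point Riemann sum.
    """
    rng = np.random.default_rng(seed)
    dt = T / n
    t = np.linspace(dt, T, n)
    dW = rng.standard_normal(n) * np.sqrt(dt)
    X = np.empty(n)
    for i in range(n):
        s = t[:i + 1] - dt                      # left endpoints avoid the kernel singularity
        X[i] = np.sum((t[i] - s) ** (H - 0.5) * dW[:i + 1])
    V = xi0 * np.exp(eta * np.sqrt(2 * H) * X - 0.5 * eta**2 * t ** (2 * H))
    return t, V

t, V = rough_bergomi_variance()
print(V[:5], V.mean())   # the mean should stay close to xi0 up to discretization/MC error
```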
Finally, in Chapter 5, we consider the large-time behaviour of affine stochastic Volterra equations, an under-developed area in the absence of Markovianity.
We leverage a measure-valued Markovian lift introduced by Cuchiero and Teichmann and the associated notion of the generalised Feller property.
This setting allows us to prove the existence of an invariant measure for the lift and hence of a stationary distribution for the affine Volterra process, featuring in the rough Heston model.
Direct gasification of biomass for fuel gas production
The excessive consumption of fossil fuels to satisfy the world necessities of
energy and commodities led to the emission of large amounts of greenhouse
gases in the last decades, contributing significantly to the greatest
environmental threat of the 21st century: climate change. The answer to this
man-made disaster is not simple and can only be achieved if the various stakeholders
and governments are brought to cooperate and work together. This is
mandatory if we want to move towards a more sustainable economy, one
based on renewable materials and powered by perennial natural energy
sources (e.g., wind, solar). In this regard, biomass can play a central role
as an adjustable and renewable feedstock that allows the replacement of fossil
fuels in various applications, and the conversion by gasification allows the
necessary flexibility for that purpose. In fact, fossil fuels are just biomass that
underwent extreme pressures and heat for millions of years. Furthermore,
biomass is a resource that, if not used or managed, increases wildfire risks.
Consequently, we also have the obligation of valorizing and using this
resource.
In this work, new scientific knowledge was obtained to support the
development of direct (air) gasification of biomass in bubbling fluidized bed
reactors to obtain a fuel gas with suitable properties to replace natural gas in
industrial gas burners. This is the first step for the integration and development
of gasification-based biorefineries, which will produce a diverse range of
value-added products from biomass and compete with current petrochemical
refineries in the future. In this regard, solutions for the improvement of the raw
producer gas quality and process efficiency parameters were defined and
analyzed. First, the addition of superheated steam as a primary measure allowed an
increase in the H2 concentration and H2/CO molar ratio in the producer gas without
compromising the stability of the process. However, the measure mainly
showed potential for the direct (air) gasification of high-density biomass (e.g.,
pellets), due to the necessity of having char accumulation in the reactor bottom
bed for char-steam reforming reactions. Secondly, the addition of refuse-derived
fuel to the biomass feedstock led to enhanced gasification products, revealing
itself as a highly promising strategy in terms of economic viability and
environmental benefits of future gasification-based biorefineries, due to the
high availability and low cost of wastes. Nevertheless, integrated techno-economic and life cycle analyses must be performed to fully characterize the
process. Thirdly, the application of low-cost catalysts as a primary measure showed
potential by improving the producer gas quality (e.g., H2 and
CO concentration, lower heating value) and process efficiency parameters with
distinct solid materials; in particular, the application of concrete, synthetic
fayalite, and wood pellet chars showed promising results. Finally, the
economic viability of the integration of direct (air) biomass gasification
processes in the pulp and paper industry was also shown, although it is not yet
attractive to potential investors. In this context, government policies
and appropriate economic instruments are of major relevance to increasing the
implementation of these projects.
This work was funded by The Navigator Company and by national funds through the Fundação para a Ciência e a Tecnologia (FCT).
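As a small illustration of the gas-quality metrics discussed above (the H2/CO molar ratio and the lower heating value of the producer gas), the sketch below computes both from a dry gas composition using standard approximate volumetric heating values. The example composition and the helper function are assumptions for illustration, not data from this work.

```python
# Approximate volumetric lower heating values at normal conditions (MJ/Nm3).
LHV_MJ_PER_NM3 = {"H2": 10.8, "CO": 12.6, "CH4": 35.8, "C2H4": 59.0}

def producer_gas_metrics(composition: dict) -> tuple:
    """Return (LHV in MJ/Nm3, H2/CO molar ratio) for a dry gas composition.

    `composition` maps species to molar (= volumetric) fractions; inert
    species such as N2 and CO2 contribute no heating value.
    """
    lhv = sum(frac * LHV_MJ_PER_NM3.get(species, 0.0)
              for species, frac in composition.items())
    h2_co = composition.get("H2", 0.0) / composition.get("CO", 1e-12)
    return lhv, h2_co

# Illustrative air-blown producer gas composition (molar fractions, dry basis).
gas = {"H2": 0.12, "CO": 0.16, "CH4": 0.04, "CO2": 0.15, "N2": 0.53}
lhv, ratio = producer_gas_metrics(gas)
print(f"LHV = {lhv:.2f} MJ/Nm3, H2/CO = {ratio:.2f}")
```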