Stabilization of vortex beams in Kerr media by nonlinear absorption
We elaborate a new solution for the problem of stable propagation of
transversely localized vortex beams in homogeneous optical media with
self-focusing Kerr nonlinearity. Stationary nonlinear Bessel-vortex states are
stabilized against azimuthal breakup and collapse by multiphoton absorption,
while the respective power loss is offset by the radial influx of the power
from an intrinsic reservoir. A linear stability analysis and direct numerical
simulations reveal a region of stability of these vortices. Beams with multiple
vorticities have their stability regions too. These beams can then form robust
tubular filaments in transparent dielectrics as common as air, water and
optical glasses at sufficiently high intensities. We also show that the
tubular, rotating and speckle-like filamentation regimes, previously observed
in experiments with axicon-generated Bessel beams, can be explained as
manifestations of the stability or instability of a specific nonlinear
Bessel-vortex state, which is fully identified.
Comment: Physical Review A, in press, 9 pages, 6 figures
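The balance described above (Kerr self-focusing draining into multiphoton absorption, with diffraction otherwise conserving power) can be illustrated in a reduced setting. The sketch below is a 1D split-step Fourier toy, not the paper's (2+1)D Bessel-vortex model: the grid, the absorption coefficient `BETA`, and the three-photon order are illustrative assumptions only.

```python
# Schematic 1D toy: split-step Fourier propagation under Kerr self-focusing
# plus multiphoton absorption. All parameters are assumed for illustration.
import numpy as np

N, L, DZ = 256, 20.0, 1e-3
BETA = 0.1                                   # multiphoton absorption strength (assumed)
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
half_lin = np.exp(-1j * k**2 / 2 * DZ / 2)   # diffraction, half step (unitary)

def step(A):
    """One symmetric split-step: linear half, nonlinear full, linear half."""
    A = np.fft.ifft(half_lin * np.fft.fft(A))
    I = np.abs(A)**2
    # Kerr phase (1j * I) plus 3-photon absorption (-BETA * I**2), both local
    A = A * np.exp((1j * I - BETA * I**2) * DZ)
    return np.fft.ifft(half_lin * np.fft.fft(A))

A = np.exp(-x**2).astype(complex)            # Gaussian input beam
P0 = np.sum(np.abs(A)**2)
for _ in range(500):
    A = step(A)
P1 = np.sum(np.abs(A)**2)
# The linear step is unitary, so any power loss comes from the absorption
# term alone -- the dissipative mechanism the abstract credits with
# arresting azimuthal breakup and collapse.
```

In the full picture, this lost power is replenished by the radial inflow from the Bessel beam's reservoir; the toy only shows the loss side of that balance.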
Judging traffic differentiation as network neutrality violation according to internet regulation
Network Neutrality (NN) is a principle establishing that traffic generated by Internet applications should be treated equally and should not be affected by arbitrary interference, degradation, or interruption. Despite this common sense, NN has multiple definitions spread across the academic literature, which differ primarily on what constitutes the proper equality level for considering the network neutral. NN definitions may also be included in regulations that control activities on the Internet. However, regulations are set by regulators whose acts are valid only within a geographical area, named a jurisdiction. Thus, both academia and regulations provide multiple, heterogeneous NN definitions. In this thesis, regulations are used as guidelines to detect NN violations, which under this approach are the adoption of traffic management practices prohibited by regulators. Solutions built on this approach can then provide helpful information for users to support claims against illegal traffic management practices. However, state-of-the-art solutions either adopt strict academic definitions (e.g., all traffic must be treated equally), which is not realistic, or adopt the regulatory definitions of a single jurisdiction, which disregards that multiple jurisdictions may be traversed in an end-to-end network path. An impact analysis showed that, under certain circumstances, from 39% to 48% of the detected Traffic Differentiations (TDs) are not NN violations when the regulations are considered, showing that the regulatory aspect must not be ignored. In this thesis, a Regulation Assessment step is proposed to be performed after TD detection. This step shall consider all NN definitions that may apply along an end-to-end network path and point out an NN violation when any of them is violated. A service is proposed to perform this step for TD detection solutions, given the infeasibility of every solution implementing the required functionalities.
A Proof-of-Concept (PoC) prototype was developed based on the requirements identified along with the impact analysis, and it was evaluated using information about TDs detected by a state-of-the-art solution. The verdicts were inconclusive (whether the TD is an NN violation or not) for a quarter of the scenarios, due to the lack of information about the traversed network paths and the occurrence zones (where in the network path the TD is suspected of being deployed). However, the literature already proposes approaches to obtain such information. These results should encourage the proponents of TD detection solutions to collect this data and submit it for the Regulation Assessment.
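The verdict logic described above (violation, no violation, or inconclusive when path or zone information is missing) can be sketched as a small decision function. The jurisdiction names, rule sets, and interface below are hypothetical illustrations, not the thesis's actual Regulation Assessment service.

```python
# Hypothetical sketch of the Regulation Assessment step: map each candidate
# jurisdiction in the TD's occurrence zone to its prohibited practices and
# derive a verdict. RULES and the function signature are assumptions.

# Per-jurisdiction sets of prohibited traffic-management practices (illustrative).
RULES = {
    "BR": {"blocking", "throttling"},
    "EU": {"blocking", "throttling", "paid-prioritization"},
}

def assess(practice, occurrence_zone):
    """Return 'violation', 'no-violation', or 'inconclusive'.

    occurrence_zone: jurisdictions where the TD may have been deployed,
    or None/empty when the detection tool supplies no path information.
    """
    if not occurrence_zone:
        return "inconclusive"          # missing path / occurrence-zone data
    known = [j for j in occurrence_zone if j in RULES]
    if len(known) < len(occurrence_zone):
        return "inconclusive"          # an unmapped jurisdiction on the path
    verdicts = {practice in RULES[j] for j in known}
    if verdicts == {True}:
        return "violation"             # prohibited everywhere it could occur
    if verdicts == {False}:
        return "no-violation"          # permitted everywhere it could occur
    return "inconclusive"              # prohibited only in some candidate zones
```

The inconclusive branches mirror the PoC finding above: without path and occurrence-zone data, no definite verdict can be produced.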
Ineffective controls on capital inflows under sophisticated financial markets: Brazil in the nineties
We analyze the Brazilian experience in the 1990s to assess the effectiveness of controls on capital inflows in restricting financial inflows and changing their composition towards long-term flows. Econometric exercises (VARs) led us to conclude that controls on capital inflows were effective in deterring financial inflows for only a brief period, from two to six months. The hypothesis to explain the ineffectiveness of the controls is that financial institutions performed several operations aimed at avoiding capital controls. We then conducted interviews with market players, which provided several examples of the financial strategies used in this period to invest in the Brazilian fixed income market while bypassing capital controls. The main conclusion is that controls on capital inflows, while they may be desirable, are of very limited effectiveness under sophisticated financial markets. Therefore, policy-makers should avoid spending the scarce resources of bank supervision trying to implement them and focus instead on improving economic policy.
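The kind of VAR exercise mentioned above can be illustrated schematically. The sketch below simulates a synthetic two-variable VAR(1) (the variable labels and coefficients are invented, not the paper's data), estimates it by ordinary least squares, and traces an impulse response that dies out after a few periods, mirroring the short-lived two-to-six-month effect reported in the abstract.

```python
# Minimal VAR(1) sketch: simulate, estimate by OLS, inspect impulse response.
# All numbers are synthetic; labels are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

# True dynamics: y_t = A_true @ y_{t-1} + noise
# y = [financial inflows, control intensity]  (hypothetical labels)
A_true = np.array([[0.5, -0.3],
                   [0.0,  0.8]])

T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS: regress y_t on y_{t-1}; lstsq solves X @ B = Y, so A_hat = B.T
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Impulse response of inflows to a unit shock in controls: with stable
# dynamics it decays geometrically, i.e. the effect is temporary.
shock = np.array([0.0, 1.0])
irf = [np.linalg.matrix_power(A_hat, h) @ shock for h in range(12)]
```

Stationarity (all eigenvalues of the coefficient matrix inside the unit circle) is what forces the response to fade; the paper's finding is that the data put the fade-out at only a few months.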
Extracting nuclear matter properties from neutron star matter EoS using deep neural networks
The extraction of nuclear matter properties from neutron star observations is nowadays an important issue, in particular for the properties that characterize the symmetry energy, which are essential to describe asymmetric nuclear matter correctly. We use deep neural networks (DNNs) to map the relation between cold β-equilibrium neutron star matter and the nuclear matter properties. Assuming a quadratic dependence on the isospin asymmetry for the energy per particle of homogeneous nuclear matter, and using a Taylor expansion up to fourth order in the isoscalar and isovector contributions, we generate a dataset of different realizations of β-equilibrium NS matter and the corresponding nuclear matter properties. The DNN model was successfully trained, attaining high accuracy on the test set. Finally, a real-case scenario was used to test the DNN model: a set of 33 nuclear models, obtained within a relativistic mean-field approach or a Skyrme force description, was fed into the DNN model, and the corresponding nuclear matter parameters were recovered with considerable accuracy; in particular, small standard deviations were obtained for the slope of the symmetry energy and the nuclear matter incompressibility at saturation.
Comment: 10 pages, 5 figures
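The parametrization described above (quadratic isospin dependence plus Taylor-expanded isoscalar and isovector terms) can be written down directly. The sketch below uses the conventional expansion variable x = (n − n₀)/(3n₀) and typical illustrative parameter values; the truncation orders and numbers are assumptions, not the paper's actual dataset settings.

```python
# Illustrative energy-per-particle of homogeneous nuclear matter:
# E(n, delta) = E_SNM(n) + delta**2 * E_sym(n), in MeV.
# Parameter values are typical textbook-scale numbers, not the paper's.
import numpy as np

N0 = 0.16  # saturation density [fm^-3] (assumed)

def energy_per_particle(n, delta, E0=-16.0, K0=230.0, Q0=-300.0, Z0=0.0,
                        J=32.0, L=60.0, Ksym=-100.0, Qsym=0.0):
    """Taylor expansions in x = (n - N0) / (3 * N0).

    Isoscalar term to 4th order (E0, K0, Q0, Z0); isovector (symmetry
    energy) term with its usual low-order coefficients (J, L, Ksym, Qsym).
    """
    x = (n - N0) / (3 * N0)
    e_snm = E0 + K0 / 2 * x**2 + Q0 / 6 * x**3 + Z0 / 24 * x**4
    e_sym = J + L * x + Ksym / 2 * x**2 + Qsym / 6 * x**3
    return e_snm + delta**2 * e_sym
```

Sampling these coefficients over plausible ranges and solving for β-equilibrium would produce the kind of (EoS realization, parameter set) pairs the abstract describes; that step is omitted here.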
From NS observations to nuclear matter properties: a machine learning approach
This study is devoted to the inference problem of extracting nuclear matter properties directly from a set of mass-radius observations. We employ Bayesian neural networks (BNNs), probabilistic models capable of estimating the uncertainties associated with their predictions. To simulate different noise levels on the observations, we create three different sets of mock data. Our results show that BNNs are an accurate and reliable tool for predicting the nuclear matter properties whenever the true values are not completely outside the training dataset statistics, i.e., as long as the model does not depend heavily on its extrapolating capacities. Applied to real mass-radius pulsar data, the model predicted the nuclear matter properties with quantified uncertainty intervals. Our study provides a valuable inference framework for when new NS data become available.
Comment: 15 pages, 12 figures
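The distinguishing output of a BNN is a predictive distribution rather than a point estimate. The sketch below uses a bootstrap ensemble of linear fits as a cheap stand-in for that idea: a true BNN places a distribution over network weights, while an ensemble approximates predictive spread by refitting on resampled data. All data, sizes, and names here are synthetic assumptions.

```python
# Toy uncertainty quantification via a bootstrap ensemble -- a simple proxy
# for BNN-style predictive uncertainty. Everything here is synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observation -> property" data: y = 2x + 1 + noise
x = rng.uniform(-1, 1, size=(200, 1))
y = 2 * x[:, 0] + 1 + rng.normal(scale=0.1, size=200)
X = np.hstack([x, np.ones((200, 1))])        # design matrix with intercept

members = []
for _ in range(50):
    idx = rng.integers(0, 200, size=200)     # bootstrap resample
    w = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
    members.append(w)

def predict(x_new):
    """Return (mean, std) of the ensemble's predictions at x_new."""
    preds = np.array([w[0] * x_new + w[1] for w in members])
    return preds.mean(), preds.std()

mean, std = predict(0.5)
# mean estimates the property; std is the model's confidence in it --
# the quantity that lets a BNN flag unreliable extrapolations.
```

When a query lies far outside the training statistics, the spread grows, which is exactly the "do not trust extrapolation" signal the abstract relies on.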
Decoding Neutron Star Observations: Revealing Composition through Bayesian Neural Networks
We exploit the great potential offered by Bayesian Neural Networks (BNNs) to
directly decipher the internal composition of neutron stars (NSs) based on
their macroscopic properties. By analyzing a set of simulated observations,
namely NS radius and tidal deformability, we leverage BNNs as effective tools
for inferring the proton fraction and sound speed within NS interiors. To
achieve this, several BNN models were developed on a dataset of 25K
nuclear EoS within a relativistic mean-field framework, obtained through
Bayesian inference that adheres to minimal low-density constraints. Unlike
conventional neural networks, BNNs possess an exceptional quality: they provide
a prediction uncertainty measure. To simulate the inherent imperfections
present in real-world observations, we have generated four distinct training
and testing datasets that replicate specific observational uncertainties. Our
initial results demonstrate that BNNs successfully recover the composition with
reasonable levels of uncertainty. Furthermore, using mock data prepared with DD2, a relativistic mean-field model of a different class from those used during training, the BNN model effectively retrieves the proton fraction and speed of sound for neutron star matter.
Comment: 16 pages, 15 figures, published version