Dynamics of critical collapse
Critical collapse of a massless scalar field in spherical symmetry is
systematically studied. We combine numerical simulations and asymptotic
analysis, and synthesize critical collapse, spacetime singularities, and
complex science. A first set of approximate analytic expressions near the center
is obtained. We observe that, near the center, the spacetime is nearly
conformally flat, the dynamics is not described by the Kasner solution, and the
Kretschmann scalar is proportional to r^(-5.30), where r is the areal radius.
These features are significantly different from those in black hole
singularities. It is speculated that the scalar field in critical collapse may
be a special standing wave.
Comment: Title changed. 11 pages, 8 figures, 1 table
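For context (standard background, not part of the abstract): the Kretschmann scalar is the curvature invariant built from the Riemann tensor, and at a Schwarzschild singularity it grows as r^(-6), so the r^(-5.30) behavior reported here is a visibly milder divergence.

```latex
K = R_{abcd}R^{abcd} \propto r^{-5.30}
\quad \text{(critical collapse, near the center)},
\qquad
K_{\mathrm{Schw}} = \frac{48\,M^{2}}{r^{6}}
\quad (G = c = 1).
```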
Interior dynamics of neutral and charged black holes in f(R) gravity
In this paper, we explore the interior dynamics of neutral and charged black
holes in f(R) gravity. We transform f(R) gravity from the Jordan frame into
the Einstein frame and simulate scalar collapses in flat, Schwarzschild, and
Reissner-Nordstr\"om geometries. In simulating scalar collapses in
Schwarzschild and Reissner-Nordstr\"om geometries, Kruskal and Kruskal-like
coordinates are used, respectively, with the presence of the scalar degree of freedom and a physical
scalar field being taken into account. The dynamics in the vicinities of the
central singularity of a Schwarzschild black hole and of the inner horizon of a
Reissner-Nordstr\"om black hole is examined. Approximate analytic solutions for
different types of collapses are partially obtained. The scalar degree of
freedom, transformed from f', plays a role similar to that of a physical
scalar field in general relativity. Regarding the physical scalar field in the
f(R) case, the scalar degree of freedom can either suppress or magnify the
physical scalar field as the coordinate time t evolves. For dark
energy f(R) gravity, inside black holes, gravity can easily push f' to the
point at which the Ricci scalar R becomes singular, and the numerical
simulation breaks down. This singularity problem can be avoided by adding an
R^2 term to the original f(R) function, in which case an infinite Ricci
scalar is pushed to regions where f' is also infinite. On the other hand, in
scalar collapse for this combined model, a black hole, including a central
singularity, can be formed. Moreover, under certain initial conditions,
R and f' can be pushed to infinity as the central singularity is approached.
Therefore, the classical singularity problem, which is present in general
relativity, remains in scalar collapse for this combined model.
Comment: 35 pages, 22 figures. (Special Issue. Modified Gravity Cosmology:
From Inflation to Dark Energy). Minor change. arXiv admin note: substantial
text overlap with arXiv:1507.0180
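For reference, a minimal sketch of the standard Jordan-to-Einstein frame mapping for metric f(R) gravity, in textbook conventions with kappa^2 = 8*pi*G; the paper's own notation and potential convention may differ:

```latex
\tilde{g}_{\mu\nu} = f'(R)\, g_{\mu\nu}, \qquad
\kappa\phi = \sqrt{\tfrac{3}{2}}\, \ln f'(R), \qquad
V(\phi) = \frac{R\, f'(R) - f(R)}{2\kappa^{2}\, f'(R)^{2}} ,

f(R) \;\to\; f(R) + \alpha R^{2}
\quad\Longrightarrow\quad
f'(R) \;\to\; f'(R) + 2\alpha R .
```

The second relation simply records that, once an R^2 correction is added, a divergent Ricci scalar is necessarily accompanied by a divergent f', consistent with the behavior described in the abstract.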
Differential Recurrent Neural Networks for Action Recognition
The long short-term memory (LSTM) neural network is capable of processing
complex sequential information since it utilizes special gating schemes for
learning representations from long input sequences. It has the potential to
model any sequential time-series data, where the current hidden state has to be
considered in the context of the past hidden states. This property makes LSTM
an ideal choice to learn the complex dynamics of various actions.
Unfortunately, conventional LSTMs do not consider the impact of
spatio-temporal dynamics corresponding to the given salient motion patterns,
when they gate the information that ought to be memorized through time. To
address this problem, we propose a differential gating scheme for the LSTM
neural network, which emphasizes the change in information gain caused by
the salient motions between the successive frames. This change in information
gain is quantified by Derivative of States (DoS), and thus the proposed LSTM
model is termed the differential Recurrent Neural Network (dRNN). We demonstrate
the effectiveness of the proposed model by automatically recognizing actions
from the real-world 2D and 3D human action datasets. Our study is one of the
first works towards demonstrating the potential of learning complex time-series
representations via high-order derivatives of states.
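A rough, illustrative sketch of the differential gating idea (not the authors' exact dRNN equations): the gates of a plain NumPy LSTM cell are additionally fed a first-order Derivative of States, approximated as the difference between successive internal states. All class and variable names below are made up for the example, and only a single DoS order is used.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SimpleDifferentialLSTMCell:
    """Toy LSTM cell whose gates also see a first-order Derivative of
    States (DoS), approximated as the difference between successive
    internal (cell) states. A hypothetical simplification of the dRNN
    idea, not the authors' exact formulation."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        in_dim = input_size + 2 * hidden_size   # [x_t, h_{t-1}, DoS]
        self.W = {g: rng.normal(0.0, 0.1, (hidden_size, in_dim)) for g in "ifoc"}
        self.b = {g: np.zeros(hidden_size) for g in "ifoc"}

    def step(self, x, h_prev, c_prev, c_prev2):
        dos = c_prev - c_prev2                       # first-order DoS
        z = np.concatenate([x, h_prev, dos])         # gates are informed by the DoS
        i = sigmoid(self.W["i"] @ z + self.b["i"])   # input gate
        f = sigmoid(self.W["f"] @ z + self.b["f"])   # forget gate
        o = sigmoid(self.W["o"] @ z + self.b["o"])   # output gate
        g = np.tanh(self.W["c"] @ z + self.b["c"])   # candidate state
        c = f * c_prev + i * g                       # new internal state
        h = o * np.tanh(c)                           # new hidden state
        return h, c

# toy usage on a random 5-step sequence of 8-d inputs
cell = SimpleDifferentialLSTMCell(input_size=8, hidden_size=16)
h = c = c_old = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(5, 8)):
    h, c_new = cell.step(x, h, c, c_old)
    c_old, c = c, c_new
print(h.shape)
```

Higher-order DoS terms could be concatenated into the gate input z in the same way.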
CapProNet: Deep Feature Learning via Orthogonal Projections onto Capsule Subspaces
In this paper, we formalize the idea behind capsule nets of using a capsule
vector rather than a neuron activation to predict the label of samples. To this
end, we propose to learn a group of capsule subspaces onto which an input
feature vector is projected. Then the lengths of the resultant capsules are used
to score the probability of the input belonging to different classes. We train such a
Capsule Projection Network (CapProNet) by learning an orthogonal projection
matrix for each capsule subspace, and show that each capsule subspace is
updated until it contains input feature vectors corresponding to the associated
class. We will also show that the capsule projection can be viewed as
normalizing the multiple columns of the weight matrix simultaneously to form an
orthogonal basis, which makes it more effective in incorporating novel
components of input features to update capsule representations. In other words,
the capsule projection can be viewed as a multi-dimensional weight
normalization in capsule subspaces, where the conventional weight normalization
is simply a special case of the capsule projection onto 1D lines. Only a
negligible computing overhead is incurred to train the network in
low-dimensional capsule subspaces or through an alternative hyper-power
iteration to estimate the normalization matrix. Experimental results on image
datasets show that the presented model can greatly improve the performance of
state-of-the-art ResNet and DenseNet backbones at the same level of computing
and memory expenses. The
CapProNet establishes the competitive state-of-the-art performance for the
family of capsule nets by significantly reducing test errors on the benchmark
datasets.
Comment: Liheng Zhang, Marzieh Edraki, Guo-Jun Qi. CapProNet: Deep Feature
Learning via Orthogonal Projections onto Capsule Subspaces, in Proceedings of
the Thirty-second Conference on Neural Information Processing Systems (NIPS
2018), Palais des Congrès de Montréal, Montréal, Canada, December 3-8, 2018
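A rough sketch of the capsule-projection scoring described above, assuming one low-dimensional weight block per class; the explicit matrix inverse here stands in for the paper's alternatives (learning the orthogonal projection or using a hyper-power iteration), and all names and shapes are hypothetical.

```python
import numpy as np

def capsule_projection_scores(x, weight_blocks, eps=1e-8):
    """Project an input feature vector onto one low-dimensional subspace
    per class and use the length of each projected capsule as the class
    score. Illustrative sketch only, not the paper's exact implementation.

    x             : (d,) input feature vector
    weight_blocks : list of (d, c) matrices, one capsule subspace per class
    """
    scores = []
    for W in weight_blocks:
        # Orthogonal projection onto span(W): P = W (W^T W)^{-1} W^T
        P = W @ np.linalg.inv(W.T @ W + eps * np.eye(W.shape[1])) @ W.T
        v = P @ x                          # capsule vector in the class subspace
        scores.append(np.linalg.norm(v))   # capsule length scores the class
    return np.array(scores)

# toy usage: 512-d features, 10 classes, 4-d capsule subspaces
rng = np.random.default_rng(0)
blocks = [rng.normal(size=(512, 4)) for _ in range(10)]
feat = rng.normal(size=512)
print(capsule_projection_scores(feat, blocks).argmax())
```

With one-dimensional subspaces (c = 1), each projected length reduces to |w^T x| / ||w||, i.e. the response of a weight-normalized neuron, roughly matching the special case noted in the abstract.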