12,393 research outputs found
Some Blow-Up Problems for a Semilinear Parabolic Equation with a Potential
The blow-up rate estimate for the solution to a semilinear parabolic equation
in a bounded domain with zero Dirichlet boundary condition is obtained. As an
application, the asymptotic behavior of the blow-up time and blow-up set of the
problem with nonnegative initial data $u(x,0)=M\varphi(x)$, as $M$ goes to
infinity, which was established in \cite{cer}, is improved under weaker
conditions than those of \cite{cer}. Comment: 29 pages
A quasi-monotonicity formula and partial regularity for borderline solutions to a parabolic equation
Self-Supervised Representation Learning with Cross-Context Learning between Global and Hypercolumn Features
Whilst contrastive learning yields powerful representations by matching
different augmented views of the same instance, it lacks the ability to capture
the similarities between different instances. One popular way to address this
limitation is by learning global features (after the global pooling) to capture
inter-instance relationships based on knowledge distillation, where the global
features of the teacher are used to guide the learning of the global features
of the student. Inspired by cross-modality learning, we extend this existing
framework that only learns from global features by encouraging the global
features and intermediate layer features to learn from each other. This leads
to our novel self-supervised framework: cross-context learning between global
and hypercolumn features (CGH), which enforces the consistency of instance
relations between low- and high-level semantics. Specifically, we stack the
intermediate feature maps to construct a hypercolumn representation so that we
can measure instance relations using two contexts (hypercolumn and global
feature) separately, and then use the relations of one context to guide the
learning of the other. This cross-context learning allows the model to learn
from the differences between the two contexts. The experimental results on
linear classification and downstream tasks show that our method outperforms the
state-of-the-art methods.
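Purely as an illustration of the mechanism described in this abstract, here is a minimal PyTorch sketch of measuring instance relations in two contexts, a pooled global feature and a hypercolumn stacked from intermediate feature maps, and letting the relations of each context guide the other. The function names, pooling sizes, temperature, and KL-based consistency loss are assumptions made for illustration, not the authors' implementation.

# Hypothetical sketch of cross-context learning between global and
# hypercolumn features; shapes and loss choices are assumptions.
import torch
import torch.nn.functional as F

def instance_relations(feats, temperature=0.1):
    # Row-wise distribution over the other instances in the batch,
    # built from cosine similarities of L2-normalized features.
    z = F.normalize(feats, dim=1)                       # (B, D)
    sim = z @ z.t() / temperature                       # (B, B)
    B = sim.size(0)
    off_diag = ~torch.eye(B, dtype=torch.bool, device=sim.device)
    sim = sim[off_diag].view(B, B - 1)                  # drop self-similarity
    return F.softmax(sim, dim=1)

def hypercolumn_feature(feature_maps, spatial_size=7):
    # Stack intermediate feature maps into one hypercolumn descriptor:
    # resize each map to a common grid, concatenate along channels, then pool.
    resized = [F.adaptive_avg_pool2d(f, spatial_size) for f in feature_maps]
    stacked = torch.cat(resized, dim=1)                 # (B, sum(C_i), S, S)
    return torch.flatten(F.adaptive_avg_pool2d(stacked, 1), 1)

def cross_context_loss(global_feats, feature_maps):
    # Relations measured in one context supervise the other (targets detached),
    # pushing low- and high-level semantics toward consistent instance relations.
    p_global = instance_relations(global_feats)
    p_hyper = instance_relations(hypercolumn_feature(feature_maps))
    to_global = F.kl_div(p_global.log(), p_hyper.detach(), reduction='batchmean')
    to_hyper = F.kl_div(p_hyper.log(), p_global.detach(), reduction='batchmean')
    return to_global + to_hyper

# Toy usage with random tensors standing in for backbone outputs.
global_feats = torch.randn(8, 128)                      # pooled global features
feature_maps = [torch.randn(8, 64, 28, 28), torch.randn(8, 128, 14, 14)]
print(cross_context_loss(global_feats, feature_maps).item())

In a full method the two contexts would come from the same backbone under the usual contrastive augmentations, and this consistency term would be combined with a standard contrastive objective; those details are omitted here.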
Phase formation of polycrystalline MgB2 at low temperature using nanometer Mg powder
The MgB2 superconductor synthesized in a flowing argon atmosphere using
nanometer-sized magnesium powder as the raw material, denoted as Nano-MgB2, has
been studied by in-situ high-temperature resistance measurement (HT-RT
measurement). The MgB2 phase is identified to form within the temperature range
of 430 to 490 °C, which is much lower than that of the previously reported MgB2
sample fabricated in the same gas environment using micron-sized magnesium
powder, denoted as Micro-MgB2. The sample density of the Nano-MgB2 reaches
1.7 g/cm3, with a porous crystal structure on a scale of less than a micrometer
as determined from scanning electron microscope (SEM) images, while the
Micro-MgB2 has a much more porous structure with a corresponding density of
1.0 g/cm3. This indicates that the raw Mg particle size, besides the sintering
temperature, is a crucial factor for the formation of a high-density MgB2
sample, even at temperatures much lower than the melting point of Mg, 650 °C.
The X-ray diffraction (XRD) pattern shows a good MgB2 phase with small amounts
of MgO and Mg, and the transition temperature, TC, of the Nano-MgB2 was
determined to be 39 K by temperature-dependent magnetization (M-T) measurement,
indicating good superconducting properties. Comment: 10 pages, 4 figures,
Solid State Communications