Can denoising diffusion probabilistic models generate realistic astrophysical fields?
Score-based generative models have emerged as alternatives to generative
adversarial networks (GANs) and normalizing flows for tasks involving learning
and sampling from complex image distributions. In this work we investigate the
ability of these models to generate fields in two astrophysical contexts: dark
matter mass density fields from cosmological simulations and images of
interstellar dust. We examine the fidelity of the sampled cosmological fields
relative to the true fields using three different metrics, and identify
potential issues to address. We demonstrate a proof-of-concept application of
the model trained on dust in denoising dust images. To our knowledge, this is
the first application of this class of models to the interstellar medium.
Comment: 8 pages, 3 figures, Accepted at the Machine Learning and the Physical Sciences workshop, NeurIPS 202
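As a rough illustration of the sampling procedure behind this class of models, the sketch below implements plain DDPM ancestral sampling (the reverse diffusion chain) over a 2D array standing in for an astrophysical field. The schedule length, the linear beta schedule, and the placeholder eps_model are assumptions made for the sketch, not the trained network or settings used in the paper.

```python
# Minimal sketch of DDPM ancestral sampling (Ho et al. 2020) for a 2D field.
# `eps_model` is a hypothetical stand-in for a trained noise-prediction U-Net.
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)      # assumed linear noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def eps_model(x_t, t):
    """Placeholder noise predictor; a real model would be a trained network."""
    return np.zeros_like(x_t)

def sample_field(shape=(64, 64), seed=0):
    """Run the reverse diffusion chain from pure noise down to a field sample."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)
    for t in reversed(range(T)):
        z = rng.standard_normal(shape) if t > 0 else 0.0
        eps = eps_model(x, t)
        # Posterior mean of x_{t-1} given x_t and the predicted noise.
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        x = x + np.sqrt(betas[t]) * z   # add noise with sigma_t = sqrt(beta_t)
    return x

field = sample_field()
```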
Found: a 'flaw' in the Taj; dome not perfectly symmetrical, say scientists
Satellite imagery specialist Mandyam Rajani recalls almost flinching when her senior colleague Dilip Ahuja proposed she might find a flaw in one of India's most treasured architectural showpieces, the Taj Mahal.
Ahuja, unable to dismiss something he had sensed nearly three decades ago during his second visit to the Taj, asked Rajani whether she could apply her skills in analysing images to measurements of the monument's central dome.
Tiny satellites mooted to watch suspicious activity along border
Small military satellites could survey troop movement and terror training bases, says NIAS. Tracking troop movements across international borders, or monitoring new or existing terror training camps, is not easy. Satellites passing over target areas for only a few hours can relay only time-limited intelligence, while ground-based intelligence gathering is not always reliable.
But consider this: A series of eyes in the sky to snoop on unfriendly neighbours or hostile groups 24/7, using space-based electronic equipment on board military satellites to relay intelligence.
This is the suggestion of a study submitted to the Indian Space Research Organisation (ISRO) by a Bengaluru-based scientific think-tank, and sources in the space agency say they are giving serious thought to the plan.
Cosmological Field Emulation and Parameter Inference with Diffusion Models
Cosmological simulations play a crucial role in elucidating the effect of
physical parameters on the statistics of fields and on constraining parameters
given information on density fields. We leverage diffusion generative models to
address two tasks of importance to cosmology -- as an emulator for cold dark
matter density fields conditional on input cosmological parameters Ω_m
and σ_8, and as a parameter inference model that can return constraints
on the cosmological parameters of an input field. We show that the model is
able to generate fields with power spectra that are consistent with those of
the simulated target distribution, and capture the subtle effect of each
parameter on modulations in the power spectrum. We additionally explore their
utility as parameter inference models and find that we can obtain tight
constraints on cosmological parameters.
Comment: 7 pages, 5 figures, Accepted at the Machine Learning and the Physical Sciences workshop, NeurIPS 202
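A common way to check the claim that emulated fields reproduce the target power spectra is to compare azimuthally averaged spectra of generated and simulated fields. The snippet below is a generic numpy version of such a check; the binning and normalization are illustrative choices, not the authors' pipeline.

```python
# Generic azimuthally averaged power spectrum of a square 2D field, useful for
# comparing emulated fields against simulation targets. Binning is illustrative.
import numpy as np

def power_spectrum_2d(field, n_bins=32):
    """Return bin-centre wavenumbers and the binned isotropic power spectrum."""
    n = field.shape[0]
    power = np.abs(np.fft.fftn(field)) ** 2 / n ** 2
    kfreq = np.fft.fftfreq(n) * n                  # integer wavenumbers
    kx, ky = np.meshgrid(kfreq, kfreq, indexing="ij")
    k = np.sqrt(kx ** 2 + ky ** 2).ravel()
    edges = np.linspace(0.5, n // 2, n_bins + 1)
    which = np.digitize(k, edges)
    pk = np.array([power.ravel()[which == i].mean() for i in range(1, n_bins + 1)])
    return 0.5 * (edges[1:] + edges[:-1]), pk

# Example: spectra of an "emulated" and a "reference" field (random stand-ins here)
rng = np.random.default_rng(0)
k, pk_emulated = power_spectrum_2d(rng.standard_normal((128, 128)))
k, pk_reference = power_spectrum_2d(rng.standard_normal((128, 128)))
fractional_error = pk_emulated / pk_reference - 1.0
```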
Differentiable Subdivision Surface Fitting
In this paper, we present a powerful differentiable surface fitting technique
to derive a compact surface representation for a given dense point cloud or
mesh, with application in the domains of graphics and CAD/CAM. We have chosen
the Loop subdivision surface, which in the limit yields the smooth surface
underlying the point cloud, and can handle complex surface topology better than
other popular compact representations, such as NURBS. The principal idea is to
fit the Loop subdivision surface not directly to the point cloud, but to the
IMLS (implicit moving least squares) surface defined over the point cloud. As
both Loop subdivision and IMLS have analytical expressions, we are able to
formulate the problem as an unconstrained minimization problem of a completely
differentiable function that can be solved with standard numerical solvers.
Differentiability enables us to integrate the subdivision surface into any deep
learning method for point clouds or meshes. We demonstrate the versatility and
potential of this approach by using it in conjunction with a differentiable
renderer to robustly reconstruct compact surface representations of
spatio-temporal sequences of dense meshes.
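To make the core idea concrete, the toy sketch below optimizes the control points of a differentiable subdivision curve so that the subdivided points lie on an implicit surface, using automatic differentiation. It substitutes Chaikin corner cutting and a unit-circle signed distance for the paper's Loop subdivision and IMLS surface, so it only illustrates the structure of the unconstrained, fully differentiable fit, not the actual method.

```python
# Toy analogue of differentiable subdivision fitting: optimize control points so
# that the subdivided curve lies on an implicit surface (signed-distance zero set).
import torch

def subdivide(points, levels=3):
    """Differentiable Chaikin corner-cutting subdivision of a closed 2D polygon."""
    for _ in range(levels):
        nxt = torch.roll(points, -1, dims=0)
        points = torch.cat([0.75 * points + 0.25 * nxt,
                            0.25 * points + 0.75 * nxt], dim=1).reshape(-1, 2)
    return points

def implicit(p):
    """Stand-in implicit function: signed distance to the unit circle."""
    return p.norm(dim=1) - 1.0

# Coarse control polygon, deliberately too large, then gradient-based fitting.
theta = torch.linspace(0.0, 2.0 * torch.pi, 9)[:-1]
ctrl = torch.stack([1.5 * torch.cos(theta), 1.5 * torch.sin(theta)], dim=1)
ctrl = ctrl.requires_grad_(True)
opt = torch.optim.Adam([ctrl], lr=0.05)
for step in range(200):
    opt.zero_grad()
    loss = implicit(subdivide(ctrl)).pow(2).mean()   # squared implicit residual
    loss.backward()
    opt.step()
```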
Stellar Reddening Based Extinction Maps for Cosmological Applications
Cosmological surveys must correct their observations for the reddening of
extragalactic objects by Galactic dust. Existing dust maps, however, have been
found to have spatial correlations with the large-scale structure of the
Universe. Errors in extinction maps can propagate systematic biases into
samples of dereddened extragalactic objects and into cosmological measurements
such as correlation functions between foreground lenses and background objects
and the primordial non-Gaussianity parameter f_NL. Emission-based maps are
contaminated by the cosmic infrared background, while maps inferred from
stellar reddenings suffer from imperfect removal of quasars and galaxies from
stellar catalogs. Thus, stellar-reddening based maps using catalogs without
extragalactic objects offer a promising path to making dust maps with minimal
correlations with large-scale structure. We present two high-latitude
integrated extinction maps based on stellar reddenings, with point spread
functions of full-width at half-maximum 6.1' and 15'. We employ a strict selection
of catalog objects to filter out galaxies and quasars and measure the spatial
correlation of our extinction maps with extragalactic structure. Our galactic
extinction maps have reduced spatial correlation with large-scale structure
relative to most existing stellar-reddening based and emission-based extinction maps.
Comment: 21 pages, 10 figures
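For context on how such an integrated extinction map is used downstream, the sketch below applies the standard dereddening correction m_corrected = m_observed - R_band * E(B-V), with the map supplying E(B-V) along each line of sight. The R_band coefficients and function names are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of dereddening extragalactic photometry with an extinction map:
# the map provides E(B-V) per line of sight; a band coefficient converts it to
# magnitudes of extinction. Coefficients below are illustrative only.
import numpy as np

R_BAND = {"g": 3.2, "r": 2.3, "i": 1.7}   # assumed, not taken from the paper

def deredden(mag_obs, ebv, band):
    """Subtract Galactic extinction A_band = R_band * E(B-V) from observed magnitudes."""
    return np.asarray(mag_obs) - R_BAND[band] * np.asarray(ebv)

# Example: two objects observed in r band behind E(B-V) of 0.03 and 0.12 mag.
mag_corrected = deredden([21.4, 20.9], [0.03, 0.12], band="r")
```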
Prototype-Sample Relation Distillation: Towards Replay-Free Continual Learning
In continual learning (CL), balancing effective adaptation against
catastrophic forgetting is a central challenge. Many of the recent
best-performing methods utilize various forms of prior task data, e.g. a replay
buffer, to tackle the catastrophic forgetting problem. Having access to
previous task data can be restrictive in many real-world scenarios, for example
when task data is sensitive or proprietary. To overcome the necessity of using
previous task data, in this work we start with strong representation learning
methods that have been shown to be less prone to forgetting. We propose a
holistic approach to jointly learn the representation and class prototypes
while maintaining the relevance of old class prototypes and their embedded
similarities. Specifically, samples are mapped to an embedding space where the
representations are learned using a supervised contrastive loss. Class
prototypes are evolved continually in the same latent space, enabling learning
and prediction at any point. To continually adapt the prototypes without
keeping any prior task data, we propose a novel distillation loss that
constrains class prototypes to maintain relative similarities as compared to
new task data. This method yields state-of-the-art performance in the
task-incremental setting, where we outperform both other methods that use no
prior data and approaches that rely on large amounts of stored data. Our
method is also shown to provide strong performance in the class-incremental
setting without using any stored data points.
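One plausible form of the relation-distillation idea described above is to penalize changes in the similarity distribution between class prototypes and current-task embeddings before and after a prototype update. The snippet below implements that reading with a temperature-scaled softmax and a KL divergence; the exact loss, names, and temperature are assumptions rather than the authors' published formulation.

```python
# Hedged sketch of a prototype-sample relation-distillation term: old prototypes
# should keep the same relative similarities to new-task embeddings after updates.
import torch
import torch.nn.functional as F

def relation_distillation(old_prototypes, new_prototypes, embeddings, tau=0.1):
    """KL divergence between the similarity distributions of frozen vs. updated
    prototypes over the current batch of (L2-normalized) embeddings."""
    z = F.normalize(embeddings, dim=1)
    p_old = F.softmax(F.normalize(old_prototypes, dim=1) @ z.T / tau, dim=1)
    log_p_new = F.log_softmax(F.normalize(new_prototypes, dim=1) @ z.T / tau, dim=1)
    return F.kl_div(log_p_new, p_old, reduction="batchmean")

# Example usage with random tensors standing in for a training batch.
proto_old = torch.randn(10, 128)                 # prototypes frozen from earlier tasks
proto_new = proto_old.clone().requires_grad_(True)
batch = torch.randn(32, 128)                     # embeddings of current-task samples
loss = relation_distillation(proto_old, proto_new, batch)
```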