The Challenge of Machine Learning in Space Weather Nowcasting and Forecasting
The numerous recent breakthroughs in machine learning (ML) make it imperative to carefully ponder how the scientific community can benefit from a technology that, although not necessarily new, is today living its golden age. This Grand Challenge review paper focuses on the present and future role of machine learning in space weather. The purpose is twofold. On one hand, we discuss previous works that use ML for space weather forecasting, focusing in particular on the few areas that have seen the most activity: the forecasting of geomagnetic indices, of relativistic electrons at geosynchronous orbit, of solar flare occurrence, of coronal mass ejection propagation time, and of solar wind speed. On the other hand, this paper serves as a gentle introduction to the field of machine learning tailored to the space weather community and as a pointer to a number of open challenges that we believe the community should undertake in the next decade. The recurring themes throughout the review are the need to shift our forecasting paradigm to a probabilistic approach focused on the reliable assessment of uncertainties, and the combination of physics-based and machine learning approaches, known as the gray-box approach.
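The probabilistic, uncertainty-focused forecasting that the review advocates can be illustrated with a standard verification metric. The sketch below is illustrative Python, not code from the paper: it computes the Brier score, the mean squared difference between forecast probabilities and binary outcomes, for a toy set of event forecasts.

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and
    binary outcomes; lower is better (0 = perfect calibration)."""
    probs = np.asarray(probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    return float(np.mean((probs - outcomes) ** 2))

# Toy example: probabilistic event forecasts vs. observed occurrence.
forecasts = [0.9, 0.8, 0.1, 0.2, 0.7]
observed  = [1,   1,   0,   0,   1]
print(round(brier_score(forecasts, observed), 3))  # small value -> skillful forecasts
```

A reliability diagram or a decomposition of this score into reliability and resolution terms would be the next step toward the kind of uncertainty assessment the review calls for.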
Geophysics and Ocean Waves Studies
The book “Geophysics and Ocean Waves Studies” presents the collected chapters in two sections named “Geophysics” and “Ocean Waves Studies”. The first section, “Geophysics”, provides a thorough overview of the use of different geophysical methods, including gravity, self-potential, and EM, in exploration. Moreover, it shows the significance of rock physics properties and enhanced oil recovery phases during oil reservoir production. The second section, “Ocean Waves Studies”, is intended to provide the reader with a strong description of the latest developments in the physical and numerical description of wind-generated and long waves, including some new features discovered in the last few years. The section is organized with the aim of introducing the reader to phenomena from offshore to nearshore, including a description of wave dissipation and large-scale phenomena (i.e., storm surges and landslide-induced tsunamis). This book should be of great interest to students, scientists, geologists, geophysicists, and the investment community.
Point spread function estimation of solar surface images with a cooperative particle swarm optimization on GPUs
Advisor: Prof. Dr. Daniel Weingaertner. Master's dissertation, Universidade Federal do Paraná, Setor de Ciências Exatas, Programa de Pós-Graduação em Informática. Defense: Curitiba, 21/02/2013. Bibliography: fls. 81-86. Abstract: We present a method for estimating the point spread function (PSF) of solar surface images acquired from ground telescopes and degraded by the atmosphere. The estimation is done by retrieving the wavefront phase using a set of short exposures, the speckle reconstruction of the observed object, and a PSF model parametrized by Zernike polynomials.
Estimates of the wavefront phase and the PSF are computed by minimizing an error function with a cooperative particle swarm optimization (CPSO) method, implemented in OpenCL to take advantage of highly parallel graphics processing units (GPUs). A calibration method is presented to adjust the algorithm parameters for low-cost results, providing solid estimates for both low-frequency and high-frequency images. Results show that the method converges quickly and is robust to noise degradation. Experiments run on an NVidia Tesla C2050 computed 100 PSFs with 50 Zernike polynomials in approximately 36 minutes. Increasing the number of Zernike coefficients tenfold, from 50 to 500, increased the execution time by only 17%, showing that the proposed algorithm is only slightly affected by the number of Zernike polynomials used.
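The error-minimization step at the heart of the method can be sketched with a generic particle swarm optimizer. The following Python is a minimal, illustrative PSO, not the thesis's OpenCL CPSO implementation, applied to a stand-in error function; the parameter values and the `pso_minimize` helper are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(error_fn, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer: each particle tracks its
    personal best and the swarm's global best, and its velocity is
    pulled toward both on every iteration."""
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([error_fn(p) for p in pos])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([error_fn(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, float(pbest_val.min())

# Stand-in error function (sum of squares); the thesis instead
# minimizes a wavefront-phase error over Zernike coefficients.
best, err = pso_minimize(lambda x: float(np.sum(x ** 2)), dim=5)
print(f"best error: {err:.2e}")
```

In the cooperative variant used by the thesis, the coefficient vector is split across several swarms that optimize sub-vectors jointly, which is what makes the GPU parallelization effective.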
Intelligent Computing: The Latest Advances, Challenges and Future
Computing is a critical driving force in the development of human
civilization. In recent years, we have witnessed the emergence of intelligent
computing, a new computing paradigm that is reshaping traditional computing and
promoting digital revolution in the era of big data, artificial intelligence
and internet-of-things with new computing theories, architectures, methods,
systems, and applications. Intelligent computing has greatly broadened the
scope of computing, extending it from traditional computing on data to
increasingly diverse computing paradigms such as perceptual intelligence,
cognitive intelligence, autonomous intelligence, and human-computer fusion
intelligence. Intelligence and computing have undergone paths of different
evolution and development for a long time but have become increasingly
intertwined in recent years: intelligent computing is not only
intelligence-oriented but also intelligence-driven. Such cross-fertilization
has prompted the emergence and rapid advancement of intelligent computing.
Intelligent computing is still in its infancy and an abundance of innovations
in the theories, systems, and applications of intelligent computing are
expected to occur soon. We present the first comprehensive survey of literature
on intelligent computing, covering its theory fundamentals, the technological
fusion of intelligence and computing, important applications, challenges, and
future perspectives. We believe that this survey is highly timely and will
provide a comprehensive reference and cast valuable insights into intelligent
computing for academic and industrial researchers and practitioners
Enhanced Detection Efficiencies and Reduced False Alarms in Searching for Gravitational Waves from Core Collapse Supernovae
A supernova is a star that flares up very suddenly and then either slowly returns to its former luminosity or explodes violently, releasing an enormous amount of energy. Stars that are 10 or more times as massive as the Sun usually end their lives in a supernova. When there is no longer enough fuel for the fusion process in the core of the star, the inward gravitational pull of the star's great mass takes over and the star begins to collapse. A series of nuclear reactions takes place as the star shrinks under gravity. In the final phase of this gravitational collapse, the core temperature rises to over 100 billion degrees; the core compresses and then recoils. The energy of the recoil is transferred to the envelope of the star, which then explodes and produces a shock wave. The remaining mass of the original star can form a neutron star, and if the original star is very massive, a black hole may form instead.
In the post-detection era of gravitational wave astronomy, core collapse supernovae are an important source of signals for the Advanced LIGO detectors. Several methods have been developed and implemented to search for gravitational wave signals from core collapse supernovae. One such recent method is based on a multi-stage, high-accuracy spectral estimation to effectively achieve a higher detection signal-to-noise ratio. The study has been further enhanced by the incorporation of a convolutional neural network to significantly reduce false alarm rates. The combined pipeline is termed Multi-Layer Signal Estimation (MuLaSE) and works in an integrative manner with the coherent WaveBurst pipeline; the integrated pipeline is termed "MuLaSECC". This thesis undertakes an extensive analysis with two families of core collapse supernova waveforms, Kuroda 2017 and Ott 2013, corresponding to three-dimensional, general relativistic supernova explosion models. The Kuroda waveforms have been extracted from models with approximate neutrino transport for three nonrotating progenitors (11.2, 15, and 40 M☉) using different nuclear equations of state (EOSs). The Ott 2013 waveform family is a simulation with a three-species neutrino leakage scheme created for the post-core-bounce phase of the collapse of a non-rotating 27 M☉ star. The performance of the MuLaSECC method has been evaluated through receiver operating characteristics and the reconstruction of the detected signals. MuLaSECC is found to have higher efficiency in the low false alarm range and improved reconstruction, especially in the lower frequency domain. This method has also been able to detect signals at low signal-to-noise ratio that were not detected by the coherent WaveBurst pipeline in this study.
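Receiver operating characteristics of the kind used to evaluate the pipeline can be computed from detection statistics as sketched below. The scores and labels are toy values, not results from the thesis; the `roc_curve` helper is a generic sketch.

```python
import numpy as np

def roc_curve(scores, labels):
    """True/false positive rates as the detection threshold sweeps
    from the highest observed score down to the lowest."""
    order = np.argsort(scores)[::-1]
    labels = np.asarray(labels)[order]
    tp = np.cumsum(labels)        # injected signals recovered so far
    fp = np.cumsum(1 - labels)    # noise events mistaken for signals
    tpr = tp / max(labels.sum(), 1)
    fpr = fp / max((1 - labels).sum(), 1)
    return fpr, tpr

# Toy detection statistics: 1 = injected signal, 0 = background noise.
scores = [0.95, 0.9, 0.4, 0.8, 0.2, 0.1]
labels = [1,    1,   0,   1,   0,   0]
fpr, tpr = roc_curve(scores, labels)
print(tpr[-1], fpr[-1])  # both reach 1.0 at the loosest threshold
```

Comparing such curves in the low-false-alarm region is exactly how an enhancement like the CNN stage shows its value: higher detection efficiency at a fixed, small false alarm rate.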
Optimal Information Theoretic Techniques for Electro-Optical Space Domain Awareness
With high and growing public interest from key stakeholders in exploring and using the space domain, the tools and techniques of space domain awareness (SDA) have seen growing interest and development. The field of space domain awareness covers the methods for detecting, tracking, and characterizing the states of space objects. These tools include every step in the process of building up estimates of these states and are essential for mission planning and actualization as well as collision avoidance. This starts with the tools for data acquisition and continues on through orbit determination and state estimation. While the field of SDA has seen incredible growth in recent years, there remains significant room for improvement.
Information-optimal approaches extend beyond just providing results. They take into account the volume of information in the ingested data and provide estimates along with the corresponding uncertainty. This allows information-optimal approaches to build a result that uses all the available information in the data without communicating a false level of certainty in the resulting estimate.
This research discusses new methods and algorithms for information-optimal space domain awareness. The presented methods extend from data collection to state estimation. While some of the methods are generally applicable to many different scenarios, there is a focus on electro-optical data. These methods enable more robust ways of understanding the space domain both in and beyond the near-Earth environment.
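The idea of reporting an estimate together with the uncertainty implied by the information content of the data can be illustrated with inverse-variance (Fisher-information) weighting of independent measurements. The `fuse_measurements` helper below is a generic sketch, not part of the dissertation's algorithms.

```python
import numpy as np

def fuse_measurements(values, sigmas):
    """Inverse-variance weighted fusion of independent Gaussian
    measurements: each datum contributes Fisher information
    1/sigma^2, and the fused uncertainty reflects the total
    information rather than a false level of certainty."""
    values = np.asarray(values, dtype=float)
    info = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    estimate = np.sum(info * values) / np.sum(info)
    sigma_est = 1.0 / np.sqrt(np.sum(info))
    return estimate, sigma_est

# Two measurements of the same quantity, one four times noisier:
# the fused estimate leans toward the more informative datum.
est, sig = fuse_measurements([10.0, 12.0], [1.0, 2.0])
print(round(est, 2), round(sig, 3))
```

Note that the fused uncertainty is smaller than either input sigma, since combining independent information can only tighten the estimate.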
Quantum Computing for Space: Exploring Quantum Circuits on Programmable Nanophotonic Chips
Quantum circuits are the fundamental computing model of quantum computing. A quantum circuit consists of a sequence of quantum gates that act on a set of qubits to perform a specific computation. For the implementation of quantum circuits, programmable nanophotonic chips provide a promising foundation with a large number of qubits. The current study explores the potential of quantum circuits implemented on programmable nanophotonic chips for space technology. Recent findings have demonstrated that quantum circuits have several advantages over classical circuits, such as exponential speedups, multiple parallel computations, and compact size. Apart from this, nanophotonic chips also offer a number of advantages over traditional chips. They provide high-speed data transfer, as light travels faster than electrons. Photons require less energy to transmit data than electrons, so nanophotonic chips consume less power than conventional chips. The bandwidth of nanophotonic chips is greater than that of traditional chips, so they can transfer more data simultaneously. They can be easily scaled to smaller sizes with higher densities and are more robust to extreme temperatures and radiation than classical chips. The focus of the current study is on how quantum circuits could revolutionize space technology by providing faster and more efficient computations for a variety of space-related applications. The in-depth analysis is carried out while taking currently available state-of-the-art technologies into consideration.
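As a concrete illustration of the gate-sequence model described above, the following Python simulates a two-qubit circuit (a Hadamard followed by a CNOT, producing a Bell state) on a classical state vector. This is a pedagogical sketch of the circuit model itself, not code for a nanophotonic chip.

```python
import numpy as np

# Single-qubit gates as 2x2 unitaries.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard

def apply(state, gate, qubit, n_qubits):
    """Apply a 1-qubit gate to `qubit` (0 = most significant) of an
    n-qubit state vector by tensoring with identities elsewhere."""
    ops = [np.eye(2)] * n_qubits
    ops[qubit] = gate
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

# CNOT with qubit 0 as control, qubit 1 as target.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Bell-state circuit: H on qubit 0, then CNOT -> (|00> + |11>)/sqrt(2).
state = np.zeros(4); state[0] = 1.0            # start in |00>
state = apply(state, H, qubit=0, n_qubits=2)
state = CNOT @ state
print(np.round(np.abs(state) ** 2, 3))         # [0.5 0.  0.  0.5]
```

On a photonic platform the same abstract circuit is compiled to a mesh of programmable interferometers, but the gate-sequence description is unchanged.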
On Thermospheric Density and Wind Modeling Driven by Satellite Observations
The thermosphere is home to a plethora of orbiting objects ranging in size from flecks of paint to modular spacecraft with masses on the order of thousands of kilograms.
The region spans hundreds of kilometers in vertical extent, from ∼100 km where
fixed-wing flight by aerodynamic lift is unsupportable, out to ∼500-700 km, depending on solar activity, where the particle density is so low that the atmosphere can no longer be treated as a fluid. The thermosphere is subject to dynamical energy input from radiation and magnetic sources that makes quantifying its dynamics a nontrivial endeavor. This is a particular challenge during geomagnetic storms, when increased magnetic activity, primarily at high latitudes, drives global heating, traveling atmospheric disturbances, and intense winds throughout the thermosphere.
Modeling of the neutral density and horizontal winds is a challenging endeavor for these conditions, and it is vital not only for understanding the physics of neutral atmospheres, but also for the practical purposes of improving orbit prediction, as the thermosphere is home to an increasing number of satellite missions, in addition to being the abode of astronauts.
Various atmospheric models have been constructed and developed over decades in order to model the thermosphere, the most prominent being the empirical models Mass Spectrometer and Incoherent Scatter radar (MSIS-00), Jacchia-Bowman (JB-2008), and Drag Temperature Model (DTM-2013), which are primarily used to model the neutral density, and the physics-based Global Ionosphere-Thermosphere Model (GITM), which is capable of modeling atmospheric electrodynamics and investigating thermospheric winds.
This dissertation focuses on three important means by which the interplay between satellite measurements and atmospheric models can drive scientific development for use in satellite mission operations and model development outright. In order to reduce the empirical model bias during storms, we created the Multifaceted Optimization Algorithm (MOA), a method that modifies the drivers of the models by comparing actual orbits with orbits simulated through the model, so as to reduce the errors. Applying MOA to the MSIS-00 model decreased model error from 25% to 10% in the event that was examined, and it represents an easy-to-implement technique that can use publicly available two-line-element orbital data. A superposed epoch analysis of three empirical density models shows persistent storm-time overestimation by JB-2008 and underestimation by DTM-2013 and MSIS-00 for more intense geomagnetic storms, which may be addressed with a Dst-based calibration; a statistical analysis of GITM horizontal winds indicates the best performance in the polar and auroral zones and difficulty capturing seasonality.
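A superposed epoch analysis of the kind described can be sketched as follows: windows of a time series are aligned on each event's epoch (e.g., storm onset) and averaged across events. The Python below is an illustrative sketch on synthetic data, not the dissertation's analysis code.

```python
import numpy as np

def superposed_epoch(series, epoch_indices, window):
    """Average a time series across events, aligned on each event's
    epoch: stack windows centered on the epochs and take the mean
    over events at each relative time."""
    half = window // 2
    stacks = []
    for t0 in epoch_indices:
        if t0 - half >= 0 and t0 + half < len(series):
            stacks.append(series[t0 - half : t0 + half + 1])
    return np.mean(np.array(stacks), axis=0)

# Toy density signal with an enhancement after each "storm onset".
signal = np.zeros(200)
for onset in (50, 120, 170):
    signal[onset:onset + 10] += 1.0
mean_curve = superposed_epoch(signal, [50, 120, 170], window=21)
print(mean_curve[10] > mean_curve[0])  # enhancement at epoch vs. before
```

Applied to model-to-observation density ratios, the mean curve makes systematic storm-time over- or underestimation by each model visible as a persistent offset from unity around the epoch.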
The work contained in this dissertation aims to provide techniques and analysis
tools to improve density and wind model performance, in order to support satellite mission operators and atmospheric research. Ultimately, it demonstrates that simple tools and methods can be utilized to generate significant results and scientific insight, serving to augment and supplement more computationally intensive and cost-prohibitive strategies for investigating the thermospheric environment.
PhD. Climate and Space Sciences and Engineering. University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/169999/1/branddan_1.pd