
    Inverse scattering of 2d photonic structures by layer-stripping

    Design and reconstruction of 2d and 3d photonic structures are usually carried out by forward simulations combined with optimization or intuition. Reconstruction by means of layer-stripping has been applied in seismic processing as well as in the design and characterization of 1d photonic structures such as fiber Bragg gratings. Layer-stripping is based on causality: the earliest scattered light is used to recover the structure layer by layer. Our setup is a 2d layered nonmagnetic structure probed by plane-polarized harmonic waves entering normal to the layers. It is assumed that the dielectric permittivity in each layer varies only orthogonally to the polarization. From reflectance data covering a suitable frequency interval, time-localized pulse data are synthesized and used to reconstruct the refractive index profile in the leftmost layer by identifying the local, time-domain Fresnel reflection at each point. Once the first layer is known, its impact on the reflectance data is stripped off, and the procedure is repeated for the next layer. Numerical simulations demonstrate that structures consisting of several layers can be reconstructed. The impact of evanescent modes and limited bandwidth is discussed.
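    As a rough illustration of the layer-stripping principle in its simplest form, the sketch below recovers a piecewise-constant refractive index profile, interface by interface, by inverting the normal-incidence Fresnel relation r = (n1 - n2)/(n1 + n2). It is a minimal 1d sketch under assumptions of my own (lossless, dispersion-free layers, reflection coefficients already extracted from time-localized data), not the paper's 2d reconstruction procedure.

        # Minimal 1d layer-stripping sketch (assumed piecewise-constant, lossless layers).
        # Given the local, time-domain reflection coefficient r_k at interface k, the
        # normal-incidence Fresnel relation r_k = (n_k - n_{k+1}) / (n_k + n_{k+1})
        # is inverted to recover the refractive index of the next layer.

        def strip_layers(reflections, n0=1.0):
            """Recover refractive indices layer by layer from local reflection coefficients."""
            indices = [n0]
            for r in reflections:
                n_prev = indices[-1]
                indices.append(n_prev * (1.0 - r) / (1.0 + r))   # invert Fresnel relation
            return indices

        if __name__ == "__main__":
            # Synthetic example: interfaces air -> 1.5 -> 2.0 -> 1.3.
            true_n = [1.0, 1.5, 2.0, 1.3]
            refl = [(true_n[i] - true_n[i + 1]) / (true_n[i] + true_n[i + 1])
                    for i in range(len(true_n) - 1)]
            print(strip_layers(refl))   # reproduces true_n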

    Algorithm Engineering in Robust Optimization

    Robust optimization is a young and emerging field of research that has received a considerable increase of interest over the last decade. In this paper, we argue that the algorithm engineering methodology fits the field of robust optimization very well and yields a rewarding new perspective on both the current state of research and open research directions. To this end we go through the algorithm engineering cycle of design and analysis of concepts, development and implementation of algorithms, and theoretical and experimental evaluation. We show that many ideas of algorithm engineering have already been applied in publications on robust optimization. Most work on robust optimization is devoted to the analysis of concepts and the development of algorithms, some papers deal with the evaluation of a particular concept in case studies, and work on the comparison of concepts is only just starting. What remains a drawback in many papers on robustness is the missing link that would feed the results of the experiments back into the design.

    Management and Control of Domestic Smart Grid Technology

    Emerging new technologies like distributed generation, distributed storage, and demand-side load management will change the way we consume and produce energy. These techniques make it possible to reduce the greenhouse effect and improve grid stability by optimizing energy streams. By smartly applying future energy production, consumption, and storage techniques, a more energy-efficient electricity supply chain can be achieved. In this paper a three-step control methodology is proposed to manage the cooperation between these technologies, focused on domestic energy streams. In this approach, (global) objectives like peak shaving or forming a virtual power plant can be achieved without harming the comfort of residents. As shown in this work, using good predictions, in-advance planning, and real-time control of domestic appliances, a better matching of demand and supply can be achieved.
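    The abstract does not spell out the three steps in detail; as a toy illustration of the planning step's peak-shaving objective only, the sketch below greedily places shiftable appliance runs in the hours that keep the aggregate household profile flattest. The hourly discretization, the greedy rule, and all names are assumptions of mine, not the paper's methodology.

        # Naive peak-shaving sketch: each shiftable appliance run is placed at the
        # start hour that minimizes the resulting peak of the aggregate profile.
        # Assumes hourly slots and freely shiftable, uninterruptible runs.

        def schedule_loads(base_profile, shiftable_loads):
            """base_profile: predicted fixed demand per hour (kW), length 24.
            shiftable_loads: list of (duration_hours, power_kw) appliance runs."""
            profile = list(base_profile)
            plan = []
            for duration, power in shiftable_loads:
                best_start, best_peak = 0, float("inf")
                for start in range(len(profile) - duration + 1):
                    trial = profile[:]
                    for h in range(start, start + duration):
                        trial[h] += power
                    if max(trial) < best_peak:
                        best_start, best_peak = start, max(trial)
                for h in range(best_start, best_start + duration):
                    profile[h] += power
                plan.append(best_start)
            return plan, profile

        if __name__ == "__main__":
            base = [0.4] * 7 + [1.2, 0.8, 0.5, 0.5, 0.6, 0.7, 0.6, 0.6, 0.8,
                                1.5, 2.0, 1.8, 1.2, 0.9, 0.7, 0.5, 0.4]
            loads = [(2, 2.0), (1, 1.5)]      # e.g. washing machine, dishwasher
            starts, final = schedule_loads(base, loads)
            print(starts, max(final))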

    Exact ICL maximization in a non-stationary temporal extension of the stochastic block model for dynamic networks

    The stochastic block model (SBM) is a flexible probabilistic tool that can be used to model interactions between clusters of nodes in a network. However, it does not account for interactions of time-varying intensity between clusters. The extension of the SBM developed in this paper addresses this shortcoming through a temporal partition: assuming interactions between nodes are recorded on fixed-length time intervals, the inference procedure associated with the proposed model clusters the nodes of the network and the time intervals simultaneously. The number of clusters of nodes and of time intervals, as well as the cluster memberships, are obtained by maximizing an exact integrated complete-data likelihood (ICL), relying on a greedy search approach. Experiments on simulated and real data are carried out in order to assess the proposed methodology.
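    The exact ICL criterion itself is not reproduced in the abstract, so the skeleton below only illustrates the greedy search loop: node labels and time-interval labels are swapped one at a time whenever the move improves a score function. The score used here is a simple stand-in (block-wise Bernoulli log-likelihood with a BIC-style penalty), not the paper's exact ICL, and all names are assumptions of mine.

        import numpy as np

        # Greedy clustering of nodes and time intervals of a dynamic network.
        # score() is a stand-in criterion, NOT the exact ICL of the paper.

        def score(A, z, y, K, L):
            """A: binary array (n, n, T); z: node labels; y: interval labels."""
            n, _, T = A.shape
            ll = 0.0
            for k in range(K):
                for l in range(K):
                    for u in range(L):
                        block = A[np.ix_(np.flatnonzero(z == k),
                                         np.flatnonzero(z == l),
                                         np.flatnonzero(y == u))]
                        m, s = block.size, block.sum()
                        if 0 < s < m:                  # skip empty/degenerate blocks
                            p = s / m
                            ll += s * np.log(p) + (m - s) * np.log(1 - p)
            return ll - 0.5 * K * K * L * np.log(n * n * T)   # BIC-style penalty

        def greedy_search(A, K, L, n_sweeps=10, seed=0):
            n, _, T = A.shape
            rng = np.random.default_rng(seed)
            z, y = rng.integers(K, size=n), rng.integers(L, size=T)
            best = score(A, z, y, K, L)
            for _ in range(n_sweeps):
                improved = False
                for labels, size, G in ((z, n, K), (y, T, L)):   # nodes, then intervals
                    for i in range(size):
                        for g in range(G):
                            old = labels[i]
                            if g == old:
                                continue
                            labels[i] = g
                            s = score(A, z, y, K, L)
                            if s > best:
                                best, improved = s, True
                            else:
                                labels[i] = old
                if not improved:
                    break
            return z, y, best

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            A = (rng.random((30, 30, 12)) < 0.1).astype(int)
            z, y, s = greedy_search(A, K=2, L=2)
            print(z, y, round(s, 2))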

    3D Structure of Microwave Sources from Solar Rotation Stereoscopy vs Magnetic Extrapolations

    We use rotation stereoscopy to estimate the height of a steady-state solar feature relative to the photosphere, based on its apparent motion in the image plane recorded over several days of observation. The stereoscopy algorithm is adapted to work with either one- or two-dimensional data (i.e. from images or from observations that record the projected position of the source along an arbitrary axis). The accuracy of the algorithm is tested on simulated data, and then the algorithm is used to estimate the coronal radio source heights associated with the active region NOAA 10956, based on multifrequency imaging data over 7 days from the Siberian Solar Radio Telescope near 5.7 GHz, the Nobeyama Radio Heliograph at 17 GHz, as well as one-dimensional scans at multiple frequencies spanning the 5.98–15.95 GHz range from the RATAN-600 instrument. The gyroresonance emission mechanism, which is sensitive to the coronal magnetic field strength, is applied to convert the estimated radio source heights at various frequencies, h(f), to information about magnetic field vs. height, B(h), and the results are compared to a magnetic field extrapolation derived from photospheric magnetic field observations obtained by Hinode and MDI. We find that the gyroresonant emission comes from heights exceeding the location of the third gyrolayer, irrespective of the magnetic extrapolation method; implications of this finding for coronal magnetography and coronal plasma physics are discussed. Comment: 26 pages, 13 figures; accepted by ApJ.
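    As a small numerical aside on the gyroresonance conversion mentioned above: the field strength of the s-th gyrolayer follows from the electron gyrofrequency of roughly 2.8 MHz per gauss, i.e. B = f / (s · 2.8 MHz). The helper below applies this for the third gyrolayer; the function name and interface are assumptions of mine.

        # Magnetic field strength of the s-th gyroresonance layer emitting at
        # frequency f, using f = s * f_B with f_B ~= 2.8 MHz per gauss.

        def gyrolayer_field(freq_hz, harmonic=3):
            """Field strength (gauss) of the gyroresonance layer seen at freq_hz."""
            F_B_PER_GAUSS = 2.8e6            # electron gyrofrequency per gauss, in Hz
            return freq_hz / (harmonic * F_B_PER_GAUSS)

        if __name__ == "__main__":
            for f_ghz in (5.7, 17.0):        # SSRT and NoRH observing frequencies
                print(f"{f_ghz} GHz -> B = {gyrolayer_field(f_ghz * 1e9):.0f} G (s = 3)")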

    Approximated Computation of Belief Functions for Robust Design Optimization

    This paper presents some ideas to reduce the computational cost of evidence-based robust design optimization. Evidence Theory captures both the aleatory and epistemic uncertainties in the design parameters, providing two quantitative measures, Belief and Plausibility, of the credibility of the computed value of the design budgets. The paper proposes some techniques to compute an approximation of Belief and Plausibility at a fraction of the cost required for an accurate calculation of the two values. Some simple test cases show how the proposed techniques scale with the dimension of the problem. Finally a simple example of spacecraft system design is presented. Comment: AIAA-2012-1932, 14th AIAA Non-Deterministic Approaches Conference, 23-26 April 2012, Sheraton Waikiki, Honolulu, Hawaii.
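    For readers unfamiliar with the two measures, the sketch below computes Belief and Plausibility of a target interval exactly by enumerating interval focal elements: Belief sums the masses of focal elements fully contained in the target, Plausibility the masses of those that merely intersect it. This brute-force enumeration is the kind of expensive exact calculation the paper seeks to approximate; the one-dimensional setup and all names are assumptions of mine.

        # Exact Belief/Plausibility of the event "parameter lies in a target interval",
        # for a single uncertain parameter described by interval focal elements.

        def belief_plausibility(focal_elements, event):
            """focal_elements: list of ((lo, hi), mass); event: (lo, hi) target interval."""
            bel = pl = 0.0
            e_lo, e_hi = event
            for (lo, hi), mass in focal_elements:
                if e_lo <= lo and hi <= e_hi:    # focal element contained in the event
                    bel += mass
                if hi >= e_lo and lo <= e_hi:    # focal element intersects the event
                    pl += mass
            return bel, pl

        if __name__ == "__main__":
            # Expert-assigned intervals and masses for one uncertain design parameter.
            fe = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]
            print(belief_plausibility(fe, event=(0.0, 1.4)))   # -> (0.5, 0.8)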

    Analysis of Break-Points in Financial Time Series

    A time series is a set of random values collected at equal time intervals; this randomness makes such series hard to predict, because the structure of the series may change at any time. As discussed in previous research, the structure of a time series may change at any time due to a change in the mean and/or variance of the series. Consequently, it is wise not to assume that these series are stationary. This paper discusses a method of analyzing time series by considering the entire series non-stationary, assuming there are random changes in the unconditional mean and variance of the series; the emphasis is on financial time series. The main goal is to break the series into small locally stationary time series to which the stationarity assumption applies. The most interesting part of this procedure is locating the break-points, where the unconditional mean and/or variance of the series change. After the break-points are found, we divide the series into smaller series accordingly; the number of break-points determines how many small stationary time series we obtain. The analysis then treats each interval on which the series is stationary as an independent time series with its own parameters. Hence, the overall time series, which is naturally non-stationary, is broken into small stationary time series that are easier to analyze. Afterwards, the Bayesian Information Criterion (BIC) is used to compare the locally stationary model to a model that treats the entire series as stationary. In a simulation study with known sample size, unconditional means, and variances for each small stationary series, the results show that we can locate the exact true break-points when the sample size is greater than 500. After the simulation study, the method is also applied to real data, the S&P 500 series of returns, which is a financial time series. The results obtained by using Maximum Likelihood Estimation (MLE) show that the BIC is smaller for the locally stationary model.
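    To make the BIC comparison concrete, the sketch below fits Gaussian models by MLE on each side of a single candidate break-point and compares the total BIC against a no-break fit. Restricting to one break, the Gaussian assumption, and the per-segment penalty are simplifications of mine, not the paper's procedure.

        import numpy as np

        # Single break-point detection by BIC: Gaussian MLE fit on each candidate
        # segment pair versus a single Gaussian fit to the whole series.

        def gaussian_bic(x, n_params=2):
            """BIC of an i.i.d. Gaussian fit to x (mean and variance estimated by MLE)."""
            n = len(x)
            var = np.var(x)                              # MLE variance (ddof = 0)
            loglik = -0.5 * n * (np.log(2 * np.pi * var) + 1.0)
            return n_params * np.log(n) - 2.0 * loglik

        def best_break(x, min_seg=30):
            """Return (best break index, segmented BIC, single-model BIC)."""
            single = gaussian_bic(x)
            best_k, best_bic = None, np.inf
            for k in range(min_seg, len(x) - min_seg):
                bic = gaussian_bic(x[:k]) + gaussian_bic(x[k:])
                if bic < best_bic:
                    best_k, best_bic = k, bic
            return best_k, best_bic, single

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            # Simulated returns with a variance change at t = 600 (segments > 500 points).
            x = np.concatenate([rng.normal(0.0, 1.0, 600), rng.normal(0.0, 2.5, 600)])
            k, seg_bic, one_bic = best_break(x)
            print(k, seg_bic < one_bic)                  # break near 600; segmented BIC wins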